May 16 03:27:49.062067 kernel: Linux version 6.6.90-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu May 15 22:08:20 -00 2025 May 16 03:27:49.062101 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=5e2f56b68c7f7e65e4df73d074f249f99b5795b677316c47e2ad758e6bd99733 May 16 03:27:49.062111 kernel: BIOS-provided physical RAM map: May 16 03:27:49.062119 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable May 16 03:27:49.064156 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved May 16 03:27:49.064177 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved May 16 03:27:49.064198 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable May 16 03:27:49.064206 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved May 16 03:27:49.064214 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved May 16 03:27:49.064221 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved May 16 03:27:49.064246 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable May 16 03:27:49.064263 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved May 16 03:27:49.064287 kernel: NX (Execute Disable) protection: active May 16 03:27:49.064311 kernel: APIC: Static calls initialized May 16 03:27:49.064348 kernel: SMBIOS 3.0.0 present. May 16 03:27:49.064357 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014 May 16 03:27:49.064365 kernel: Hypervisor detected: KVM May 16 03:27:49.064373 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 May 16 03:27:49.064380 kernel: kvm-clock: using sched offset of 3660682174 cycles May 16 03:27:49.064389 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns May 16 03:27:49.064400 kernel: tsc: Detected 1996.249 MHz processor May 16 03:27:49.064408 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 16 03:27:49.064417 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 16 03:27:49.064425 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000 May 16 03:27:49.064433 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs May 16 03:27:49.064441 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 16 03:27:49.064449 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000 May 16 03:27:49.064457 kernel: ACPI: Early table checksum verification disabled May 16 03:27:49.064467 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS ) May 16 03:27:49.064475 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 03:27:49.064484 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 03:27:49.064492 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 03:27:49.064500 kernel: ACPI: FACS 0x00000000BFFE0000 000040 May 16 03:27:49.064508 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 16 03:27:49.064516 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 
BOCHS BXPC 00000001 BXPC 00000001) May 16 03:27:49.064524 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc] May 16 03:27:49.064532 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48] May 16 03:27:49.064542 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] May 16 03:27:49.064562 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] May 16 03:27:49.064570 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] May 16 03:27:49.064582 kernel: No NUMA configuration found May 16 03:27:49.064590 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] May 16 03:27:49.064599 kernel: NODE_DATA(0) allocated [mem 0x13fffa000-0x13fffffff] May 16 03:27:49.064607 kernel: Zone ranges: May 16 03:27:49.064618 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 16 03:27:49.064626 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 16 03:27:49.064635 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] May 16 03:27:49.064643 kernel: Movable zone start for each node May 16 03:27:49.064651 kernel: Early memory node ranges May 16 03:27:49.064659 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 16 03:27:49.064668 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] May 16 03:27:49.064676 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] May 16 03:27:49.064687 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] May 16 03:27:49.064695 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 16 03:27:49.064703 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 16 03:27:49.064712 kernel: On node 0, zone Normal: 35 pages in unavailable ranges May 16 03:27:49.064720 kernel: ACPI: PM-Timer IO Port: 0x608 May 16 03:27:49.064728 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 16 03:27:49.064737 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 16 03:27:49.064745 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 16 03:27:49.064754 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 16 03:27:49.064764 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 16 03:27:49.064773 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 16 03:27:49.064781 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 16 03:27:49.064790 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 16 03:27:49.064798 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs May 16 03:27:49.064806 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 16 03:27:49.064815 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices May 16 03:27:49.064823 kernel: Booting paravirtualized kernel on KVM May 16 03:27:49.064831 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 16 03:27:49.064842 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 16 03:27:49.064851 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576 May 16 03:27:49.064859 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152 May 16 03:27:49.064867 kernel: pcpu-alloc: [0] 0 1 May 16 03:27:49.064875 kernel: kvm-guest: PV spinlocks disabled, no host support May 16 03:27:49.064885 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr 
verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=5e2f56b68c7f7e65e4df73d074f249f99b5795b677316c47e2ad758e6bd99733 May 16 03:27:49.064894 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 16 03:27:49.064903 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 16 03:27:49.064914 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 16 03:27:49.064922 kernel: Fallback order for Node 0: 0 May 16 03:27:49.064931 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901 May 16 03:27:49.064939 kernel: Policy zone: Normal May 16 03:27:49.064947 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 16 03:27:49.064956 kernel: software IO TLB: area num 2. May 16 03:27:49.064965 kernel: Memory: 3962120K/4193772K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43600K init, 1472K bss, 231392K reserved, 0K cma-reserved) May 16 03:27:49.064973 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 16 03:27:49.064982 kernel: ftrace: allocating 37997 entries in 149 pages May 16 03:27:49.064992 kernel: ftrace: allocated 149 pages with 4 groups May 16 03:27:49.065000 kernel: Dynamic Preempt: voluntary May 16 03:27:49.065008 kernel: rcu: Preemptible hierarchical RCU implementation. May 16 03:27:49.065018 kernel: rcu: RCU event tracing is enabled. May 16 03:27:49.065026 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 16 03:27:49.065035 kernel: Trampoline variant of Tasks RCU enabled. May 16 03:27:49.065043 kernel: Rude variant of Tasks RCU enabled. May 16 03:27:49.065052 kernel: Tracing variant of Tasks RCU enabled. May 16 03:27:49.065060 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 16 03:27:49.065071 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 16 03:27:49.065079 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 May 16 03:27:49.065088 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 16 03:27:49.065096 kernel: Console: colour VGA+ 80x25 May 16 03:27:49.065104 kernel: printk: console [tty0] enabled May 16 03:27:49.065112 kernel: printk: console [ttyS0] enabled May 16 03:27:49.065121 kernel: ACPI: Core revision 20230628 May 16 03:27:49.066168 kernel: APIC: Switch to symmetric I/O mode setup May 16 03:27:49.066180 kernel: x2apic enabled May 16 03:27:49.066192 kernel: APIC: Switched APIC routing to: physical x2apic May 16 03:27:49.066201 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 16 03:27:49.066209 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized May 16 03:27:49.066218 kernel: Calibrating delay loop (skipped) preset value.. 
3992.49 BogoMIPS (lpj=1996249) May 16 03:27:49.066226 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 May 16 03:27:49.066235 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 May 16 03:27:49.066243 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 16 03:27:49.066252 kernel: Spectre V2 : Mitigation: Retpolines May 16 03:27:49.066260 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT May 16 03:27:49.066271 kernel: Speculative Store Bypass: Vulnerable May 16 03:27:49.066279 kernel: x86/fpu: x87 FPU will use FXSAVE May 16 03:27:49.066287 kernel: Freeing SMP alternatives memory: 32K May 16 03:27:49.066296 kernel: pid_max: default: 32768 minimum: 301 May 16 03:27:49.066311 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity May 16 03:27:49.066323 kernel: landlock: Up and running. May 16 03:27:49.066331 kernel: SELinux: Initializing. May 16 03:27:49.066340 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 16 03:27:49.066349 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 16 03:27:49.066358 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) May 16 03:27:49.066367 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 16 03:27:49.066376 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 16 03:27:49.066386 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 16 03:27:49.066395 kernel: Performance Events: AMD PMU driver. May 16 03:27:49.066404 kernel: ... version: 0 May 16 03:27:49.066413 kernel: ... bit width: 48 May 16 03:27:49.066421 kernel: ... generic registers: 4 May 16 03:27:49.066432 kernel: ... value mask: 0000ffffffffffff May 16 03:27:49.066441 kernel: ... max period: 00007fffffffffff May 16 03:27:49.066450 kernel: ... fixed-purpose events: 0 May 16 03:27:49.066458 kernel: ... event mask: 000000000000000f May 16 03:27:49.066467 kernel: signal: max sigframe size: 1440 May 16 03:27:49.066476 kernel: rcu: Hierarchical SRCU implementation. May 16 03:27:49.066485 kernel: rcu: Max phase no-delay instances is 400. May 16 03:27:49.066494 kernel: smp: Bringing up secondary CPUs ... May 16 03:27:49.066502 kernel: smpboot: x86: Booting SMP configuration: May 16 03:27:49.066514 kernel: .... 
node #0, CPUs: #1 May 16 03:27:49.066522 kernel: smp: Brought up 1 node, 2 CPUs May 16 03:27:49.066531 kernel: smpboot: Max logical packages: 2 May 16 03:27:49.066540 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) May 16 03:27:49.066549 kernel: devtmpfs: initialized May 16 03:27:49.066557 kernel: x86/mm: Memory block size: 128MB May 16 03:27:49.066566 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 16 03:27:49.066576 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 16 03:27:49.066585 kernel: pinctrl core: initialized pinctrl subsystem May 16 03:27:49.066596 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 16 03:27:49.066604 kernel: audit: initializing netlink subsys (disabled) May 16 03:27:49.066613 kernel: audit: type=2000 audit(1747366068.672:1): state=initialized audit_enabled=0 res=1 May 16 03:27:49.066622 kernel: thermal_sys: Registered thermal governor 'step_wise' May 16 03:27:49.066631 kernel: thermal_sys: Registered thermal governor 'user_space' May 16 03:27:49.066639 kernel: cpuidle: using governor menu May 16 03:27:49.066648 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 16 03:27:49.066657 kernel: dca service started, version 1.12.1 May 16 03:27:49.066666 kernel: PCI: Using configuration type 1 for base access May 16 03:27:49.066677 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. May 16 03:27:49.066686 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 16 03:27:49.066695 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 16 03:27:49.066704 kernel: ACPI: Added _OSI(Module Device) May 16 03:27:49.066713 kernel: ACPI: Added _OSI(Processor Device) May 16 03:27:49.066721 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 16 03:27:49.066730 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 16 03:27:49.066739 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 16 03:27:49.066748 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC May 16 03:27:49.066760 kernel: ACPI: Interpreter enabled May 16 03:27:49.066769 kernel: ACPI: PM: (supports S0 S3 S5) May 16 03:27:49.066777 kernel: ACPI: Using IOAPIC for interrupt routing May 16 03:27:49.066786 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 16 03:27:49.066795 kernel: PCI: Using E820 reservations for host bridge windows May 16 03:27:49.066804 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F May 16 03:27:49.066813 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 16 03:27:49.066978 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] May 16 03:27:49.067082 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] May 16 03:27:49.069211 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge May 16 03:27:49.069227 kernel: acpiphp: Slot [3] registered May 16 03:27:49.069236 kernel: acpiphp: Slot [4] registered May 16 03:27:49.069245 kernel: acpiphp: Slot [5] registered May 16 03:27:49.069254 kernel: acpiphp: Slot [6] registered May 16 03:27:49.069263 kernel: acpiphp: Slot [7] registered May 16 03:27:49.069271 kernel: acpiphp: Slot [8] registered May 16 03:27:49.069285 kernel: acpiphp: Slot [9] registered May 16 03:27:49.069294 kernel: acpiphp: Slot [10] registered May 16 03:27:49.069303 
kernel: acpiphp: Slot [11] registered May 16 03:27:49.069311 kernel: acpiphp: Slot [12] registered May 16 03:27:49.069320 kernel: acpiphp: Slot [13] registered May 16 03:27:49.069329 kernel: acpiphp: Slot [14] registered May 16 03:27:49.069337 kernel: acpiphp: Slot [15] registered May 16 03:27:49.069346 kernel: acpiphp: Slot [16] registered May 16 03:27:49.069355 kernel: acpiphp: Slot [17] registered May 16 03:27:49.069363 kernel: acpiphp: Slot [18] registered May 16 03:27:49.069374 kernel: acpiphp: Slot [19] registered May 16 03:27:49.069382 kernel: acpiphp: Slot [20] registered May 16 03:27:49.069391 kernel: acpiphp: Slot [21] registered May 16 03:27:49.069399 kernel: acpiphp: Slot [22] registered May 16 03:27:49.069408 kernel: acpiphp: Slot [23] registered May 16 03:27:49.069416 kernel: acpiphp: Slot [24] registered May 16 03:27:49.069425 kernel: acpiphp: Slot [25] registered May 16 03:27:49.069434 kernel: acpiphp: Slot [26] registered May 16 03:27:49.069442 kernel: acpiphp: Slot [27] registered May 16 03:27:49.069454 kernel: acpiphp: Slot [28] registered May 16 03:27:49.069462 kernel: acpiphp: Slot [29] registered May 16 03:27:49.069471 kernel: acpiphp: Slot [30] registered May 16 03:27:49.069480 kernel: acpiphp: Slot [31] registered May 16 03:27:49.069488 kernel: PCI host bridge to bus 0000:00 May 16 03:27:49.069588 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 16 03:27:49.069681 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] May 16 03:27:49.069768 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 16 03:27:49.069857 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 16 03:27:49.069941 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] May 16 03:27:49.070023 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 16 03:27:49.070235 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 May 16 03:27:49.070350 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 May 16 03:27:49.070454 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 May 16 03:27:49.070556 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f] May 16 03:27:49.070649 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] May 16 03:27:49.070743 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] May 16 03:27:49.070837 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] May 16 03:27:49.070929 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] May 16 03:27:49.071031 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 May 16 03:27:49.071155 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI May 16 03:27:49.071273 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB May 16 03:27:49.071374 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 May 16 03:27:49.071466 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref] May 16 03:27:49.071560 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref] May 16 03:27:49.071654 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff] May 16 03:27:49.071751 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref] May 16 03:27:49.071847 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 16 03:27:49.071956 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 May 16 03:27:49.072083 kernel: pci 
0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf] May 16 03:27:49.073228 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff] May 16 03:27:49.073325 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref] May 16 03:27:49.073418 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref] May 16 03:27:49.073523 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 May 16 03:27:49.073619 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f] May 16 03:27:49.073718 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff] May 16 03:27:49.073811 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref] May 16 03:27:49.073910 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 May 16 03:27:49.074007 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff] May 16 03:27:49.074100 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref] May 16 03:27:49.075276 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 May 16 03:27:49.075373 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f] May 16 03:27:49.075471 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff] May 16 03:27:49.075563 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref] May 16 03:27:49.075576 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 16 03:27:49.075586 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 16 03:27:49.075595 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 16 03:27:49.075604 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 16 03:27:49.075613 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 May 16 03:27:49.075622 kernel: iommu: Default domain type: Translated May 16 03:27:49.075634 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 16 03:27:49.075643 kernel: PCI: Using ACPI for IRQ routing May 16 03:27:49.075652 kernel: PCI: pci_cache_line_size set to 64 bytes May 16 03:27:49.075660 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 16 03:27:49.075669 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] May 16 03:27:49.075761 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device May 16 03:27:49.075853 kernel: pci 0000:00:02.0: vgaarb: bridge control possible May 16 03:27:49.075945 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 16 03:27:49.075959 kernel: vgaarb: loaded May 16 03:27:49.075972 kernel: clocksource: Switched to clocksource kvm-clock May 16 03:27:49.075981 kernel: VFS: Disk quotas dquot_6.6.0 May 16 03:27:49.075989 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 16 03:27:49.075998 kernel: pnp: PnP ACPI init May 16 03:27:49.076093 kernel: pnp 00:03: [dma 2] May 16 03:27:49.076108 kernel: pnp: PnP ACPI: found 5 devices May 16 03:27:49.076117 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 16 03:27:49.077160 kernel: NET: Registered PF_INET protocol family May 16 03:27:49.077180 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 16 03:27:49.077189 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 16 03:27:49.077198 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 16 03:27:49.077207 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 16 03:27:49.077217 kernel: TCP bind hash table entries: 
32768 (order: 8, 1048576 bytes, linear) May 16 03:27:49.077226 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 16 03:27:49.077234 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 16 03:27:49.077243 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 16 03:27:49.077252 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 16 03:27:49.077264 kernel: NET: Registered PF_XDP protocol family May 16 03:27:49.077356 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 16 03:27:49.077438 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 16 03:27:49.077519 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 16 03:27:49.077599 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window] May 16 03:27:49.077680 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] May 16 03:27:49.077774 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release May 16 03:27:49.077867 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 16 03:27:49.077886 kernel: PCI: CLS 0 bytes, default 64 May 16 03:27:49.077895 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 16 03:27:49.077904 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) May 16 03:27:49.077913 kernel: Initialise system trusted keyrings May 16 03:27:49.077922 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 16 03:27:49.077931 kernel: Key type asymmetric registered May 16 03:27:49.077940 kernel: Asymmetric key parser 'x509' registered May 16 03:27:49.077949 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) May 16 03:27:49.077958 kernel: io scheduler mq-deadline registered May 16 03:27:49.077968 kernel: io scheduler kyber registered May 16 03:27:49.077977 kernel: io scheduler bfq registered May 16 03:27:49.077986 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 16 03:27:49.077995 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 May 16 03:27:49.078005 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 May 16 03:27:49.078014 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 May 16 03:27:49.078023 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 May 16 03:27:49.078032 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 16 03:27:49.078041 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 16 03:27:49.078051 kernel: random: crng init done May 16 03:27:49.078060 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 16 03:27:49.078069 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 16 03:27:49.078078 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 16 03:27:49.079210 kernel: rtc_cmos 00:04: RTC can wake from S4 May 16 03:27:49.079229 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 16 03:27:49.079318 kernel: rtc_cmos 00:04: registered as rtc0 May 16 03:27:49.079408 kernel: rtc_cmos 00:04: setting system clock to 2025-05-16T03:27:48 UTC (1747366068) May 16 03:27:49.079504 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram May 16 03:27:49.079519 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 16 03:27:49.079529 kernel: NET: Registered PF_INET6 protocol family May 16 03:27:49.079538 kernel: Segment Routing with IPv6 May 16 03:27:49.079548 kernel: In-situ OAM (IOAM) with IPv6 May 16 03:27:49.079557 kernel: NET: Registered PF_PACKET 
protocol family May 16 03:27:49.079567 kernel: Key type dns_resolver registered May 16 03:27:49.079576 kernel: IPI shorthand broadcast: enabled May 16 03:27:49.079586 kernel: sched_clock: Marking stable (1037007227, 172076988)->(1240106729, -31022514) May 16 03:27:49.079599 kernel: registered taskstats version 1 May 16 03:27:49.079608 kernel: Loading compiled-in X.509 certificates May 16 03:27:49.079618 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.90-flatcar: 36d9e3bf63b9b28466bcfa7a508d814673a33a26' May 16 03:27:49.079627 kernel: Key type .fscrypt registered May 16 03:27:49.079636 kernel: Key type fscrypt-provisioning registered May 16 03:27:49.079646 kernel: ima: No TPM chip found, activating TPM-bypass! May 16 03:27:49.079655 kernel: ima: Allocated hash algorithm: sha1 May 16 03:27:49.079665 kernel: ima: No architecture policies found May 16 03:27:49.079676 kernel: clk: Disabling unused clocks May 16 03:27:49.079685 kernel: Freeing unused kernel image (initmem) memory: 43600K May 16 03:27:49.079695 kernel: Write protecting the kernel read-only data: 40960k May 16 03:27:49.079704 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K May 16 03:27:49.079714 kernel: Run /init as init process May 16 03:27:49.079723 kernel: with arguments: May 16 03:27:49.079732 kernel: /init May 16 03:27:49.079741 kernel: with environment: May 16 03:27:49.079750 kernel: HOME=/ May 16 03:27:49.079760 kernel: TERM=linux May 16 03:27:49.079771 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 16 03:27:49.079782 systemd[1]: Successfully made /usr/ read-only. May 16 03:27:49.079796 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 16 03:27:49.079807 systemd[1]: Detected virtualization kvm. May 16 03:27:49.079817 systemd[1]: Detected architecture x86-64. May 16 03:27:49.079827 systemd[1]: Running in initrd. May 16 03:27:49.079841 systemd[1]: No hostname configured, using default hostname. May 16 03:27:49.079851 systemd[1]: Hostname set to . May 16 03:27:49.079861 systemd[1]: Initializing machine ID from VM UUID. May 16 03:27:49.079871 systemd[1]: Queued start job for default target initrd.target. May 16 03:27:49.079881 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 16 03:27:49.079892 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 16 03:27:49.079903 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 16 03:27:49.079923 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 16 03:27:49.079936 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 16 03:27:49.079948 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 16 03:27:49.079959 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 16 03:27:49.079970 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... 
May 16 03:27:49.079981 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 16 03:27:49.079994 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 16 03:27:49.080004 systemd[1]: Reached target paths.target - Path Units. May 16 03:27:49.080015 systemd[1]: Reached target slices.target - Slice Units. May 16 03:27:49.080028 systemd[1]: Reached target swap.target - Swaps. May 16 03:27:49.080037 systemd[1]: Reached target timers.target - Timer Units. May 16 03:27:49.080047 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 16 03:27:49.080057 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 16 03:27:49.080067 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 16 03:27:49.080076 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 16 03:27:49.080088 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 16 03:27:49.080098 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 16 03:27:49.080108 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 16 03:27:49.080118 systemd[1]: Reached target sockets.target - Socket Units. May 16 03:27:49.081163 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 16 03:27:49.081178 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 16 03:27:49.081188 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 16 03:27:49.081199 systemd[1]: Starting systemd-fsck-usr.service... May 16 03:27:49.081214 systemd[1]: Starting systemd-journald.service - Journal Service... May 16 03:27:49.081224 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 16 03:27:49.081235 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 16 03:27:49.081245 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 16 03:27:49.081256 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 16 03:27:49.081267 systemd[1]: Finished systemd-fsck-usr.service. May 16 03:27:49.081280 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 16 03:27:49.081313 systemd-journald[184]: Collecting audit messages is disabled. May 16 03:27:49.081345 systemd-journald[184]: Journal started May 16 03:27:49.081370 systemd-journald[184]: Runtime Journal (/run/log/journal/b91f34db0e7648f28ad8fbad8a36c03f) is 8M, max 78.2M, 70.2M free. May 16 03:27:49.066209 systemd-modules-load[185]: Inserted module 'overlay' May 16 03:27:49.105355 systemd[1]: Started systemd-journald.service - Journal Service. May 16 03:27:49.105373 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 16 03:27:49.105385 kernel: Bridge firewalling registered May 16 03:27:49.095419 systemd-modules-load[185]: Inserted module 'br_netfilter' May 16 03:27:49.108569 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 16 03:27:49.109287 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 03:27:49.114298 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 16 03:27:49.115578 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
May 16 03:27:49.120289 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 16 03:27:49.121476 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 16 03:27:49.127248 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 16 03:27:49.135372 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 16 03:27:49.139475 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 16 03:27:49.143545 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 16 03:27:49.144926 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 16 03:27:49.155477 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 16 03:27:49.157247 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 16 03:27:49.179720 dracut-cmdline[221]: dracut-dracut-053 May 16 03:27:49.181728 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=5e2f56b68c7f7e65e4df73d074f249f99b5795b677316c47e2ad758e6bd99733 May 16 03:27:49.195808 systemd-resolved[214]: Positive Trust Anchors: May 16 03:27:49.196229 systemd-resolved[214]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 16 03:27:49.196273 systemd-resolved[214]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 16 03:27:49.202601 systemd-resolved[214]: Defaulting to hostname 'linux'. May 16 03:27:49.204532 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 16 03:27:49.205143 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 16 03:27:49.255230 kernel: SCSI subsystem initialized May 16 03:27:49.266207 kernel: Loading iSCSI transport class v2.0-870. May 16 03:27:49.279181 kernel: iscsi: registered transport (tcp) May 16 03:27:49.303718 kernel: iscsi: registered transport (qla4xxx) May 16 03:27:49.303811 kernel: QLogic iSCSI HBA Driver May 16 03:27:49.363815 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 16 03:27:49.366261 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 16 03:27:49.427637 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
May 16 03:27:49.427741 kernel: device-mapper: uevent: version 1.0.3 May 16 03:27:49.430990 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 16 03:27:49.493286 kernel: raid6: sse2x4 gen() 5189 MB/s May 16 03:27:49.512325 kernel: raid6: sse2x2 gen() 5995 MB/s May 16 03:27:49.530718 kernel: raid6: sse2x1 gen() 9520 MB/s May 16 03:27:49.530877 kernel: raid6: using algorithm sse2x1 gen() 9520 MB/s May 16 03:27:49.549620 kernel: raid6: .... xor() 7370 MB/s, rmw enabled May 16 03:27:49.549752 kernel: raid6: using ssse3x2 recovery algorithm May 16 03:27:49.572534 kernel: xor: measuring software checksum speed May 16 03:27:49.572699 kernel: prefetch64-sse : 18515 MB/sec May 16 03:27:49.572729 kernel: generic_sse : 15899 MB/sec May 16 03:27:49.573794 kernel: xor: using function: prefetch64-sse (18515 MB/sec) May 16 03:27:49.750203 kernel: Btrfs loaded, zoned=no, fsverity=no May 16 03:27:49.767744 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 16 03:27:49.774218 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 16 03:27:49.801529 systemd-udevd[403]: Using default interface naming scheme 'v255'. May 16 03:27:49.806422 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 16 03:27:49.814232 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 16 03:27:49.858191 dracut-pre-trigger[413]: rd.md=0: removing MD RAID activation May 16 03:27:49.923626 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 16 03:27:49.928104 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 16 03:27:50.024694 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 16 03:27:50.036405 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 16 03:27:50.084406 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 16 03:27:50.089510 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 16 03:27:50.091855 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 16 03:27:50.094112 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 16 03:27:50.096237 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 16 03:27:50.121649 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 16 03:27:50.138152 kernel: libata version 3.00 loaded. May 16 03:27:50.141523 kernel: ata_piix 0000:00:01.1: version 2.13 May 16 03:27:50.144156 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues May 16 03:27:50.147195 kernel: scsi host0: ata_piix May 16 03:27:50.153151 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB) May 16 03:27:50.154681 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 16 03:27:50.154834 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 16 03:27:50.155543 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 16 03:27:50.158082 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 16 03:27:50.158234 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 16 03:27:50.158829 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
May 16 03:27:50.162090 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 16 03:27:50.162959 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 16 03:27:50.169700 kernel: scsi host1: ata_piix May 16 03:27:50.169847 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 May 16 03:27:50.169867 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 May 16 03:27:50.176380 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 16 03:27:50.176443 kernel: GPT:17805311 != 20971519 May 16 03:27:50.176464 kernel: GPT:Alternate GPT header not at the end of the disk. May 16 03:27:50.176481 kernel: GPT:17805311 != 20971519 May 16 03:27:50.176492 kernel: GPT: Use GNU Parted to correct GPT errors. May 16 03:27:50.183164 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 16 03:27:50.231577 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 03:27:50.234504 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 16 03:27:50.256618 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 16 03:27:50.376190 kernel: BTRFS: device fsid a728581e-9e7f-4655-895a-4f66e17e3645 devid 1 transid 40 /dev/vda3 scanned by (udev-worker) (455) May 16 03:27:50.388196 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (451) May 16 03:27:50.426756 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 16 03:27:50.437680 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 16 03:27:50.449626 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 16 03:27:50.458647 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 16 03:27:50.459245 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 16 03:27:50.462250 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 16 03:27:50.484746 disk-uuid[513]: Primary Header is updated. May 16 03:27:50.484746 disk-uuid[513]: Secondary Entries is updated. May 16 03:27:50.484746 disk-uuid[513]: Secondary Header is updated. May 16 03:27:50.496213 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 16 03:27:51.514228 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 16 03:27:51.517305 disk-uuid[514]: The operation has completed successfully. May 16 03:27:51.592460 systemd[1]: disk-uuid.service: Deactivated successfully. May 16 03:27:51.592610 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 16 03:27:51.642768 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 16 03:27:51.671508 sh[526]: Success May 16 03:27:51.687227 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3" May 16 03:27:51.787017 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 16 03:27:51.794322 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 16 03:27:51.806402 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 16 03:27:51.832234 kernel: BTRFS info (device dm-0): first mount of filesystem a728581e-9e7f-4655-895a-4f66e17e3645 May 16 03:27:51.832284 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 16 03:27:51.832297 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 16 03:27:51.834527 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 16 03:27:51.836108 kernel: BTRFS info (device dm-0): using free space tree May 16 03:27:51.854250 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 16 03:27:51.856830 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 16 03:27:51.861079 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 16 03:27:51.867368 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 16 03:27:51.921116 kernel: BTRFS info (device vda6): first mount of filesystem 206158fa-d3b7-4891-accd-2db768e6ca22 May 16 03:27:51.921240 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 16 03:27:51.921272 kernel: BTRFS info (device vda6): using free space tree May 16 03:27:51.932200 kernel: BTRFS info (device vda6): auto enabling async discard May 16 03:27:51.943174 kernel: BTRFS info (device vda6): last unmount of filesystem 206158fa-d3b7-4891-accd-2db768e6ca22 May 16 03:27:51.952604 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 16 03:27:51.956276 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 16 03:27:51.998223 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 16 03:27:52.000527 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 16 03:27:52.035928 systemd-networkd[705]: lo: Link UP May 16 03:27:52.035937 systemd-networkd[705]: lo: Gained carrier May 16 03:27:52.037922 systemd-networkd[705]: Enumeration completed May 16 03:27:52.039113 systemd[1]: Started systemd-networkd.service - Network Configuration. May 16 03:27:52.040204 systemd[1]: Reached target network.target - Network. May 16 03:27:52.040296 systemd-networkd[705]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 16 03:27:52.040301 systemd-networkd[705]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 16 03:27:52.042141 systemd-networkd[705]: eth0: Link UP May 16 03:27:52.042145 systemd-networkd[705]: eth0: Gained carrier May 16 03:27:52.042155 systemd-networkd[705]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 16 03:27:52.056262 systemd-networkd[705]: eth0: DHCPv4 address 172.24.4.18/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 16 03:27:52.104678 ignition[645]: Ignition 2.20.0 May 16 03:27:52.104691 ignition[645]: Stage: fetch-offline May 16 03:27:52.104730 ignition[645]: no configs at "/usr/lib/ignition/base.d" May 16 03:27:52.106179 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 16 03:27:52.104741 ignition[645]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 16 03:27:52.104840 ignition[645]: parsed url from cmdline: "" May 16 03:27:52.108261 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
May 16 03:27:52.104844 ignition[645]: no config URL provided May 16 03:27:52.104850 ignition[645]: reading system config file "/usr/lib/ignition/user.ign" May 16 03:27:52.104859 ignition[645]: no config at "/usr/lib/ignition/user.ign" May 16 03:27:52.104864 ignition[645]: failed to fetch config: resource requires networking May 16 03:27:52.105061 ignition[645]: Ignition finished successfully May 16 03:27:52.128093 ignition[716]: Ignition 2.20.0 May 16 03:27:52.128105 ignition[716]: Stage: fetch May 16 03:27:52.128292 ignition[716]: no configs at "/usr/lib/ignition/base.d" May 16 03:27:52.128303 ignition[716]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 16 03:27:52.128383 ignition[716]: parsed url from cmdline: "" May 16 03:27:52.128386 ignition[716]: no config URL provided May 16 03:27:52.128391 ignition[716]: reading system config file "/usr/lib/ignition/user.ign" May 16 03:27:52.128399 ignition[716]: no config at "/usr/lib/ignition/user.ign" May 16 03:27:52.128468 ignition[716]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... May 16 03:27:52.128490 ignition[716]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... May 16 03:27:52.128511 ignition[716]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 May 16 03:27:52.332385 ignition[716]: GET result: OK May 16 03:27:52.332607 ignition[716]: parsing config with SHA512: e6b3df8b5c2a7cd037750c6da1f45106df2ac227e2d8218e0404bbc3fb97782b80eaf96e4a2f7068a25712d310122e0dbe606ac8e922680b596024ad31a58cbd May 16 03:27:52.344059 unknown[716]: fetched base config from "system" May 16 03:27:52.344076 unknown[716]: fetched base config from "system" May 16 03:27:52.344803 ignition[716]: fetch: fetch complete May 16 03:27:52.344087 unknown[716]: fetched user config from "openstack" May 16 03:27:52.344813 ignition[716]: fetch: fetch passed May 16 03:27:52.347255 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 16 03:27:52.344882 ignition[716]: Ignition finished successfully May 16 03:27:52.353334 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 16 03:27:52.391336 ignition[722]: Ignition 2.20.0 May 16 03:27:52.391365 ignition[722]: Stage: kargs May 16 03:27:52.391675 ignition[722]: no configs at "/usr/lib/ignition/base.d" May 16 03:27:52.395364 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 16 03:27:52.391695 ignition[722]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 16 03:27:52.393401 ignition[722]: kargs: kargs passed May 16 03:27:52.393477 ignition[722]: Ignition finished successfully May 16 03:27:52.400379 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 16 03:27:52.433661 ignition[730]: Ignition 2.20.0 May 16 03:27:52.434946 ignition[730]: Stage: disks May 16 03:27:52.435304 ignition[730]: no configs at "/usr/lib/ignition/base.d" May 16 03:27:52.435324 ignition[730]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 16 03:27:52.440219 ignition[730]: disks: disks passed May 16 03:27:52.441209 ignition[730]: Ignition finished successfully May 16 03:27:52.442909 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 16 03:27:52.444942 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 16 03:27:52.446445 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 16 03:27:52.447807 systemd[1]: Reached target local-fs.target - Local File Systems. 
May 16 03:27:52.450253 systemd[1]: Reached target sysinit.target - System Initialization. May 16 03:27:52.452631 systemd[1]: Reached target basic.target - Basic System. May 16 03:27:52.456937 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 16 03:27:52.507482 systemd-fsck[739]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks May 16 03:27:52.518060 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 16 03:27:52.523871 systemd[1]: Mounting sysroot.mount - /sysroot... May 16 03:27:52.690183 kernel: EXT4-fs (vda9): mounted filesystem f27adc75-a467-4bfb-9c02-79a2879452a3 r/w with ordered data mode. Quota mode: none. May 16 03:27:52.690308 systemd[1]: Mounted sysroot.mount - /sysroot. May 16 03:27:52.691181 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 16 03:27:52.694095 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 16 03:27:52.696220 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 16 03:27:52.698446 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 16 03:27:52.701480 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... May 16 03:27:52.703549 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 16 03:27:52.703583 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 16 03:27:52.712098 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 16 03:27:52.715095 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 16 03:27:52.724168 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (747) May 16 03:27:52.742304 kernel: BTRFS info (device vda6): first mount of filesystem 206158fa-d3b7-4891-accd-2db768e6ca22 May 16 03:27:52.742385 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 16 03:27:52.744810 kernel: BTRFS info (device vda6): using free space tree May 16 03:27:52.750153 kernel: BTRFS info (device vda6): auto enabling async discard May 16 03:27:52.752617 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 16 03:27:52.863102 initrd-setup-root[775]: cut: /sysroot/etc/passwd: No such file or directory May 16 03:27:52.869681 initrd-setup-root[782]: cut: /sysroot/etc/group: No such file or directory May 16 03:27:52.875898 initrd-setup-root[789]: cut: /sysroot/etc/shadow: No such file or directory May 16 03:27:52.882419 initrd-setup-root[796]: cut: /sysroot/etc/gshadow: No such file or directory May 16 03:27:53.002386 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 16 03:27:53.004651 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 16 03:27:53.008225 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 16 03:27:53.020628 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
May 16 03:27:53.026549 kernel: BTRFS info (device vda6): last unmount of filesystem 206158fa-d3b7-4891-accd-2db768e6ca22 May 16 03:27:53.056323 ignition[864]: INFO : Ignition 2.20.0 May 16 03:27:53.057153 ignition[864]: INFO : Stage: mount May 16 03:27:53.058082 ignition[864]: INFO : no configs at "/usr/lib/ignition/base.d" May 16 03:27:53.058834 ignition[864]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 16 03:27:53.060606 ignition[864]: INFO : mount: mount passed May 16 03:27:53.061235 ignition[864]: INFO : Ignition finished successfully May 16 03:27:53.065030 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 16 03:27:53.065748 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 16 03:27:53.916335 systemd-networkd[705]: eth0: Gained IPv6LL May 16 03:27:59.936305 coreos-metadata[749]: May 16 03:27:59.936 WARN failed to locate config-drive, using the metadata service API instead May 16 03:27:59.977351 coreos-metadata[749]: May 16 03:27:59.977 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 16 03:27:59.992506 coreos-metadata[749]: May 16 03:27:59.992 INFO Fetch successful May 16 03:27:59.994017 coreos-metadata[749]: May 16 03:27:59.993 INFO wrote hostname ci-4284-0-0-n-34cf5e3c62.novalocal to /sysroot/etc/hostname May 16 03:27:59.996598 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. May 16 03:27:59.996839 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. May 16 03:28:00.004390 systemd[1]: Starting ignition-files.service - Ignition (files)... May 16 03:28:00.032926 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 16 03:28:00.066276 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (882) May 16 03:28:00.068210 kernel: BTRFS info (device vda6): first mount of filesystem 206158fa-d3b7-4891-accd-2db768e6ca22 May 16 03:28:00.074392 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 16 03:28:00.079431 kernel: BTRFS info (device vda6): using free space tree May 16 03:28:00.092265 kernel: BTRFS info (device vda6): auto enabling async discard May 16 03:28:00.097673 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 16 03:28:00.140383 ignition[900]: INFO : Ignition 2.20.0 May 16 03:28:00.140383 ignition[900]: INFO : Stage: files May 16 03:28:00.141766 ignition[900]: INFO : no configs at "/usr/lib/ignition/base.d" May 16 03:28:00.141766 ignition[900]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 16 03:28:00.143078 ignition[900]: DEBUG : files: compiled without relabeling support, skipping May 16 03:28:00.144861 ignition[900]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 16 03:28:00.144861 ignition[900]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 16 03:28:00.151268 ignition[900]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 16 03:28:00.152671 ignition[900]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 16 03:28:00.153564 ignition[900]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 16 03:28:00.153371 unknown[900]: wrote ssh authorized keys file for user: core May 16 03:28:00.156858 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" May 16 03:28:00.158012 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 May 16 03:28:00.240022 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 16 03:28:00.573526 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" May 16 03:28:00.573526 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" May 16 03:28:00.575974 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 May 16 03:28:01.627884 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 16 03:28:03.378650 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" May 16 03:28:03.378650 ignition[900]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 16 03:28:03.384268 ignition[900]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 16 03:28:03.384268 ignition[900]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 16 03:28:03.384268 ignition[900]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 16 03:28:03.384268 ignition[900]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 16 03:28:03.384268 ignition[900]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 16 03:28:03.384268 ignition[900]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 16 03:28:03.384268 ignition[900]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 16 03:28:03.384268 ignition[900]: INFO : files: files passed May 16 03:28:03.384268 ignition[900]: INFO : Ignition finished successfully May 16 03:28:03.383775 systemd[1]: Finished ignition-files.service - Ignition (files). May 16 03:28:03.393419 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 16 03:28:03.398480 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 16 03:28:03.419029 initrd-setup-root-after-ignition[929]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 16 03:28:03.419029 initrd-setup-root-after-ignition[929]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 16 03:28:03.422037 initrd-setup-root-after-ignition[933]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 16 03:28:03.424612 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 16 03:28:03.425468 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 16 03:28:03.429236 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 16 03:28:03.431535 systemd[1]: ignition-quench.service: Deactivated successfully. May 16 03:28:03.431625 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 16 03:28:03.479884 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 16 03:28:03.480063 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 16 03:28:03.481743 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
May 16 03:28:03.483050 systemd[1]: Reached target initrd.target - Initrd Default Target. May 16 03:28:03.488144 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 16 03:28:03.490294 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 16 03:28:03.510789 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 16 03:28:03.513577 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 16 03:28:03.529760 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 16 03:28:03.531069 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 16 03:28:03.532441 systemd[1]: Stopped target timers.target - Timer Units. May 16 03:28:03.533542 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 16 03:28:03.533673 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 16 03:28:03.535645 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 16 03:28:03.536805 systemd[1]: Stopped target basic.target - Basic System. May 16 03:28:03.537870 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 16 03:28:03.539121 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 16 03:28:03.540368 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 16 03:28:03.541045 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 16 03:28:03.541648 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 16 03:28:03.543374 systemd[1]: Stopped target sysinit.target - System Initialization. May 16 03:28:03.544223 systemd[1]: Stopped target local-fs.target - Local File Systems. May 16 03:28:03.545353 systemd[1]: Stopped target swap.target - Swaps. May 16 03:28:03.546390 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 16 03:28:03.546522 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 16 03:28:03.547630 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 16 03:28:03.548356 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 16 03:28:03.549376 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 16 03:28:03.549712 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 16 03:28:03.550614 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 16 03:28:03.550731 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 16 03:28:03.552199 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 16 03:28:03.552330 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 16 03:28:03.554934 systemd[1]: ignition-files.service: Deactivated successfully. May 16 03:28:03.555046 systemd[1]: Stopped ignition-files.service - Ignition (files). May 16 03:28:03.558324 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 16 03:28:03.558934 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 16 03:28:03.559097 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 16 03:28:03.563979 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 16 03:28:03.565836 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
May 16 03:28:03.565984 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 16 03:28:03.568427 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 16 03:28:03.568566 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 16 03:28:03.575371 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 16 03:28:03.575468 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 16 03:28:03.584382 ignition[954]: INFO : Ignition 2.20.0 May 16 03:28:03.584382 ignition[954]: INFO : Stage: umount May 16 03:28:03.586170 ignition[954]: INFO : no configs at "/usr/lib/ignition/base.d" May 16 03:28:03.586170 ignition[954]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 16 03:28:03.586170 ignition[954]: INFO : umount: umount passed May 16 03:28:03.586170 ignition[954]: INFO : Ignition finished successfully May 16 03:28:03.586468 systemd[1]: ignition-mount.service: Deactivated successfully. May 16 03:28:03.586593 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 16 03:28:03.587887 systemd[1]: ignition-disks.service: Deactivated successfully. May 16 03:28:03.587960 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 16 03:28:03.588744 systemd[1]: ignition-kargs.service: Deactivated successfully. May 16 03:28:03.588787 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 16 03:28:03.589651 systemd[1]: ignition-fetch.service: Deactivated successfully. May 16 03:28:03.589694 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 16 03:28:03.590587 systemd[1]: Stopped target network.target - Network. May 16 03:28:03.591508 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 16 03:28:03.591555 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 16 03:28:03.592491 systemd[1]: Stopped target paths.target - Path Units. May 16 03:28:03.593414 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 16 03:28:03.593659 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 16 03:28:03.594470 systemd[1]: Stopped target slices.target - Slice Units. May 16 03:28:03.595580 systemd[1]: Stopped target sockets.target - Socket Units. May 16 03:28:03.596644 systemd[1]: iscsid.socket: Deactivated successfully. May 16 03:28:03.596682 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 16 03:28:03.597675 systemd[1]: iscsiuio.socket: Deactivated successfully. May 16 03:28:03.597710 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 16 03:28:03.598687 systemd[1]: ignition-setup.service: Deactivated successfully. May 16 03:28:03.598731 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 16 03:28:03.599686 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 16 03:28:03.599727 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 16 03:28:03.600932 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 16 03:28:03.602106 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 16 03:28:03.608485 systemd[1]: systemd-resolved.service: Deactivated successfully. May 16 03:28:03.608591 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 16 03:28:03.612185 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. 
May 16 03:28:03.612377 systemd[1]: systemd-networkd.service: Deactivated successfully. May 16 03:28:03.612478 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 16 03:28:03.614663 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 16 03:28:03.615563 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 16 03:28:03.615618 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 16 03:28:03.618224 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 16 03:28:03.618964 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 16 03:28:03.619011 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 16 03:28:03.620203 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 16 03:28:03.620246 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 16 03:28:03.622213 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 16 03:28:03.622257 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 16 03:28:03.627859 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 16 03:28:03.627901 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 16 03:28:03.628604 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 16 03:28:03.630441 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 16 03:28:03.630500 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 16 03:28:03.639400 systemd[1]: systemd-udevd.service: Deactivated successfully. May 16 03:28:03.639557 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 16 03:28:03.641285 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 16 03:28:03.641339 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 16 03:28:03.642824 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 16 03:28:03.642856 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 16 03:28:03.644902 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 16 03:28:03.644950 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 16 03:28:03.648222 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 16 03:28:03.648266 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 16 03:28:03.649563 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 16 03:28:03.649605 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 16 03:28:03.653292 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 16 03:28:03.654087 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 16 03:28:03.654166 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 16 03:28:03.656977 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 16 03:28:03.657021 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 16 03:28:03.659624 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. 
May 16 03:28:03.659682 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 16 03:28:03.660014 systemd[1]: network-cleanup.service: Deactivated successfully.
May 16 03:28:03.660841 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 16 03:28:03.662901 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 16 03:28:03.662988 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 16 03:28:03.671491 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 16 03:28:03.676765 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 16 03:28:03.676873 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 16 03:28:03.678015 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 16 03:28:03.678853 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 16 03:28:03.678898 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 16 03:28:03.681326 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 16 03:28:03.698217 systemd[1]: Switching root.
May 16 03:28:03.736592 systemd-journald[184]: Journal stopped
May 16 03:28:05.279316 systemd-journald[184]: Received SIGTERM from PID 1 (systemd).
May 16 03:28:05.279379 kernel: SELinux: policy capability network_peer_controls=1
May 16 03:28:05.279398 kernel: SELinux: policy capability open_perms=1
May 16 03:28:05.279410 kernel: SELinux: policy capability extended_socket_class=1
May 16 03:28:05.279421 kernel: SELinux: policy capability always_check_network=0
May 16 03:28:05.279433 kernel: SELinux: policy capability cgroup_seclabel=1
May 16 03:28:05.279444 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 16 03:28:05.279455 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 16 03:28:05.279474 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 16 03:28:05.279486 kernel: audit: type=1403 audit(1747366084.163:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 16 03:28:05.279501 systemd[1]: Successfully loaded SELinux policy in 78ms.
May 16 03:28:05.279519 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 27.323ms.
May 16 03:28:05.279532 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 16 03:28:05.279545 systemd[1]: Detected virtualization kvm.
May 16 03:28:05.279557 systemd[1]: Detected architecture x86-64.
May 16 03:28:05.279570 systemd[1]: Detected first boot.
May 16 03:28:05.279582 systemd[1]: Hostname set to .
May 16 03:28:05.279597 systemd[1]: Initializing machine ID from VM UUID.
May 16 03:28:05.279609 zram_generator::config[999]: No configuration found.
May 16 03:28:05.279622 kernel: Guest personality initialized and is inactive
May 16 03:28:05.279634 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 16 03:28:05.279645 kernel: Initialized host personality
May 16 03:28:05.279656 kernel: NET: Registered PF_VSOCK protocol family
May 16 03:28:05.279668 systemd[1]: Populated /etc with preset unit settings.
May 16 03:28:05.279680 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 16 03:28:05.279695 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 16 03:28:05.279707 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 16 03:28:05.279719 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 16 03:28:05.279731 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 16 03:28:05.279743 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 16 03:28:05.279755 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 16 03:28:05.279767 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 16 03:28:05.279779 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 16 03:28:05.279791 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 16 03:28:05.279806 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 16 03:28:05.279823 systemd[1]: Created slice user.slice - User and Session Slice.
May 16 03:28:05.279835 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 16 03:28:05.279847 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 16 03:28:05.279859 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 16 03:28:05.279872 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 16 03:28:05.279885 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 16 03:28:05.279900 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 16 03:28:05.279912 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 16 03:28:05.279924 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 16 03:28:05.279936 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 16 03:28:05.279948 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 16 03:28:05.279960 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 16 03:28:05.279972 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 16 03:28:05.279985 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 16 03:28:05.280002 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 16 03:28:05.280014 systemd[1]: Reached target slices.target - Slice Units.
May 16 03:28:05.280026 systemd[1]: Reached target swap.target - Swaps.
May 16 03:28:05.280037 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 16 03:28:05.280049 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 16 03:28:05.280062 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 16 03:28:05.280074 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 16 03:28:05.280087 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 16 03:28:05.280099 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 16 03:28:05.280113 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 16 03:28:05.280126 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 16 03:28:05.282192 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 16 03:28:05.282207 systemd[1]: Mounting media.mount - External Media Directory... May 16 03:28:05.282220 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 03:28:05.282232 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 16 03:28:05.282245 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 16 03:28:05.282257 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 16 03:28:05.282269 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 16 03:28:05.282286 systemd[1]: Reached target machines.target - Containers. May 16 03:28:05.282299 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 16 03:28:05.282311 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 16 03:28:05.282323 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 16 03:28:05.282336 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 16 03:28:05.282348 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 16 03:28:05.282360 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 16 03:28:05.282376 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 16 03:28:05.282390 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 16 03:28:05.282402 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 16 03:28:05.282415 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 16 03:28:05.282427 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 16 03:28:05.282439 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 16 03:28:05.282451 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 16 03:28:05.282463 systemd[1]: Stopped systemd-fsck-usr.service. May 16 03:28:05.282475 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 16 03:28:05.282490 systemd[1]: Starting systemd-journald.service - Journal Service... May 16 03:28:05.282502 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 16 03:28:05.282514 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 16 03:28:05.282527 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 16 03:28:05.282539 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 16 03:28:05.282550 kernel: fuse: init (API version 7.39) May 16 03:28:05.282562 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 16 03:28:05.282574 systemd[1]: verity-setup.service: Deactivated successfully. 
May 16 03:28:05.282587 systemd[1]: Stopped verity-setup.service. May 16 03:28:05.282601 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 03:28:05.282613 kernel: loop: module loaded May 16 03:28:05.282624 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 16 03:28:05.282637 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 16 03:28:05.282651 systemd[1]: Mounted media.mount - External Media Directory. May 16 03:28:05.282663 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 16 03:28:05.282675 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 16 03:28:05.282688 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 16 03:28:05.282700 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 16 03:28:05.282713 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 16 03:28:05.282727 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 16 03:28:05.282739 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 16 03:28:05.282751 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 16 03:28:05.282763 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 16 03:28:05.282775 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 16 03:28:05.282787 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 16 03:28:05.282799 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 16 03:28:05.282812 systemd[1]: modprobe@loop.service: Deactivated successfully. May 16 03:28:05.282824 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 16 03:28:05.282838 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 16 03:28:05.282850 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 16 03:28:05.282863 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 16 03:28:05.282896 systemd-journald[1086]: Collecting audit messages is disabled. May 16 03:28:05.282925 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 16 03:28:05.282938 systemd-journald[1086]: Journal started May 16 03:28:05.282966 systemd-journald[1086]: Runtime Journal (/run/log/journal/b91f34db0e7648f28ad8fbad8a36c03f) is 8M, max 78.2M, 70.2M free. May 16 03:28:04.861362 systemd[1]: Queued start job for default target multi-user.target. May 16 03:28:04.874616 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. May 16 03:28:04.875153 systemd[1]: systemd-journald.service: Deactivated successfully. May 16 03:28:05.289179 systemd[1]: Started systemd-journald.service - Journal Service. May 16 03:28:05.294956 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 16 03:28:05.302659 systemd[1]: Reached target network-pre.target - Preparation for Network. May 16 03:28:05.318880 kernel: ACPI: bus type drm_connector registered May 16 03:28:05.314109 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 16 03:28:05.325647 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
May 16 03:28:05.326958 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 16 03:28:05.327016 systemd[1]: Reached target local-fs.target - Local File Systems. May 16 03:28:05.329706 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 16 03:28:05.340367 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 16 03:28:05.351345 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 16 03:28:05.352482 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 16 03:28:05.357338 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 16 03:28:05.365306 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 16 03:28:05.366013 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 16 03:28:05.369343 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 16 03:28:05.369976 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 16 03:28:05.373355 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 16 03:28:05.379289 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 16 03:28:05.384375 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... May 16 03:28:05.387522 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 16 03:28:05.392446 systemd[1]: modprobe@drm.service: Deactivated successfully. May 16 03:28:05.392704 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 16 03:28:05.396476 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 16 03:28:05.398413 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 16 03:28:05.400200 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 16 03:28:05.415952 systemd-journald[1086]: Time spent on flushing to /var/log/journal/b91f34db0e7648f28ad8fbad8a36c03f is 42.306ms for 958 entries. May 16 03:28:05.415952 systemd-journald[1086]: System Journal (/var/log/journal/b91f34db0e7648f28ad8fbad8a36c03f) is 8M, max 584.8M, 576.8M free. May 16 03:28:05.494901 systemd-journald[1086]: Received client request to flush runtime journal. May 16 03:28:05.494945 kernel: loop0: detected capacity change from 0 to 224512 May 16 03:28:05.494961 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 16 03:28:05.413589 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 16 03:28:05.415086 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 16 03:28:05.419614 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 16 03:28:05.426107 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 16 03:28:05.445860 udevadm[1138]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. May 16 03:28:05.451736 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
May 16 03:28:05.497574 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 16 03:28:05.526316 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 16 03:28:05.537375 kernel: loop1: detected capacity change from 0 to 8 May 16 03:28:05.535298 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 16 03:28:05.540278 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 16 03:28:05.561163 kernel: loop2: detected capacity change from 0 to 109808 May 16 03:28:05.593548 systemd-tmpfiles[1157]: ACLs are not supported, ignoring. May 16 03:28:05.593591 systemd-tmpfiles[1157]: ACLs are not supported, ignoring. May 16 03:28:05.619883 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 16 03:28:05.636175 kernel: loop3: detected capacity change from 0 to 151640 May 16 03:28:05.684164 kernel: loop4: detected capacity change from 0 to 224512 May 16 03:28:05.719376 kernel: loop5: detected capacity change from 0 to 8 May 16 03:28:05.722198 kernel: loop6: detected capacity change from 0 to 109808 May 16 03:28:05.792172 kernel: loop7: detected capacity change from 0 to 151640 May 16 03:28:05.837969 (sd-merge)[1163]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. May 16 03:28:05.838486 (sd-merge)[1163]: Merged extensions into '/usr'. May 16 03:28:05.845425 systemd[1]: Reload requested from client PID 1137 ('systemd-sysext') (unit systemd-sysext.service)... May 16 03:28:05.845443 systemd[1]: Reloading... May 16 03:28:05.949001 zram_generator::config[1188]: No configuration found. May 16 03:28:06.178760 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 16 03:28:06.283676 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 16 03:28:06.284003 systemd[1]: Reloading finished in 437 ms. May 16 03:28:06.303896 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 16 03:28:06.312225 systemd[1]: Starting ensure-sysext.service... May 16 03:28:06.315243 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 16 03:28:06.364091 systemd-tmpfiles[1247]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 16 03:28:06.365007 systemd-tmpfiles[1247]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 16 03:28:06.366897 systemd-tmpfiles[1247]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 16 03:28:06.367666 systemd-tmpfiles[1247]: ACLs are not supported, ignoring. May 16 03:28:06.367837 systemd-tmpfiles[1247]: ACLs are not supported, ignoring. May 16 03:28:06.378005 systemd-tmpfiles[1247]: Detected autofs mount point /boot during canonicalization of boot. May 16 03:28:06.378034 systemd-tmpfiles[1247]: Skipping /boot May 16 03:28:06.401557 systemd[1]: Reload requested from client PID 1246 ('systemctl') (unit ensure-sysext.service)... May 16 03:28:06.401595 systemd[1]: Reloading... May 16 03:28:06.420174 systemd-tmpfiles[1247]: Detected autofs mount point /boot during canonicalization of boot. 
May 16 03:28:06.420185 systemd-tmpfiles[1247]: Skipping /boot May 16 03:28:06.475217 ldconfig[1132]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 16 03:28:06.491147 zram_generator::config[1274]: No configuration found. May 16 03:28:06.666601 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 16 03:28:06.750172 systemd[1]: Reloading finished in 347 ms. May 16 03:28:06.762994 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 16 03:28:06.763980 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 16 03:28:06.770279 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 16 03:28:06.782433 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 16 03:28:06.787676 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 16 03:28:06.789685 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 16 03:28:06.797918 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 16 03:28:06.801331 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 16 03:28:06.806366 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 16 03:28:06.814964 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 03:28:06.815455 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 16 03:28:06.818507 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 16 03:28:06.830168 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 16 03:28:06.832656 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 16 03:28:06.833418 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 16 03:28:06.833775 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 16 03:28:06.833891 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 03:28:06.849450 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 16 03:28:06.853742 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 03:28:06.853933 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 16 03:28:06.854099 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 16 03:28:06.854447 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
May 16 03:28:06.854560 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 03:28:06.864560 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 16 03:28:06.864733 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 16 03:28:06.866597 systemd[1]: modprobe@loop.service: Deactivated successfully. May 16 03:28:06.867170 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 16 03:28:06.873684 systemd[1]: Finished ensure-sysext.service. May 16 03:28:06.878730 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 16 03:28:06.883608 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 16 03:28:06.883885 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 16 03:28:06.887659 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 03:28:06.887844 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 16 03:28:06.890486 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 16 03:28:06.891220 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 16 03:28:06.891256 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 16 03:28:06.891317 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 16 03:28:06.891375 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 16 03:28:06.898664 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 16 03:28:06.899316 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 03:28:06.900075 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 16 03:28:06.907354 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 16 03:28:06.928399 systemd[1]: modprobe@drm.service: Deactivated successfully. May 16 03:28:06.928929 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 16 03:28:06.930404 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 16 03:28:06.933447 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 16 03:28:06.937778 systemd-udevd[1340]: Using default interface naming scheme 'v255'. May 16 03:28:06.942338 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 16 03:28:06.947968 augenrules[1377]: No rules May 16 03:28:06.950525 systemd[1]: audit-rules.service: Deactivated successfully. May 16 03:28:06.950994 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
May 16 03:28:06.953026 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 16 03:28:06.988341 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 16 03:28:06.994288 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 16 03:28:07.059788 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 16 03:28:07.060454 systemd[1]: Reached target time-set.target - System Time Set. May 16 03:28:07.075487 systemd-resolved[1339]: Positive Trust Anchors: May 16 03:28:07.076938 systemd-resolved[1339]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 16 03:28:07.077043 systemd-resolved[1339]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 16 03:28:07.084023 systemd-resolved[1339]: Using system hostname 'ci-4284-0-0-n-34cf5e3c62.novalocal'. May 16 03:28:07.085885 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 16 03:28:07.086701 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 16 03:28:07.113863 systemd-networkd[1395]: lo: Link UP May 16 03:28:07.113874 systemd-networkd[1395]: lo: Gained carrier May 16 03:28:07.115795 systemd-networkd[1395]: Enumeration completed May 16 03:28:07.116210 systemd[1]: Started systemd-networkd.service - Network Configuration. May 16 03:28:07.117290 systemd[1]: Reached target network.target - Network. May 16 03:28:07.120376 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 16 03:28:07.127346 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 16 03:28:07.145297 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 16 03:28:07.151721 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 16 03:28:07.167209 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (1394) May 16 03:28:07.195842 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 16 03:28:07.195852 systemd-networkd[1395]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 16 03:28:07.200054 systemd-networkd[1395]: eth0: Link UP May 16 03:28:07.200063 systemd-networkd[1395]: eth0: Gained carrier May 16 03:28:07.200081 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 16 03:28:07.214220 systemd-networkd[1395]: eth0: DHCPv4 address 172.24.4.18/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 16 03:28:07.215097 systemd-timesyncd[1361]: Network configuration changed, trying to establish connection. 
May 16 03:28:07.216302 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 May 16 03:28:07.229155 kernel: ACPI: button: Power Button [PWRF] May 16 03:28:07.235149 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 May 16 03:28:07.242897 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 16 03:28:07.246870 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 16 03:28:07.264151 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 16 03:28:07.274590 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 16 03:28:07.298167 kernel: mousedev: PS/2 mouse device common for all mice May 16 03:28:07.298953 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 16 03:28:07.329424 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 May 16 03:28:07.329503 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console May 16 03:28:07.333716 kernel: Console: switching to colour dummy device 80x25 May 16 03:28:07.333768 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 16 03:28:07.333786 kernel: [drm] features: -context_init May 16 03:28:07.336738 kernel: [drm] number of scanouts: 1 May 16 03:28:07.336775 kernel: [drm] number of cap sets: 0 May 16 03:28:07.340248 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0 May 16 03:28:07.343172 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device May 16 03:28:07.350051 kernel: Console: switching to colour frame buffer device 160x50 May 16 03:28:07.357168 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device May 16 03:28:07.359941 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 16 03:28:07.360619 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 16 03:28:07.364999 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 16 03:28:07.365415 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 16 03:28:07.369153 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 16 03:28:07.372342 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 16 03:28:07.391943 lvm[1435]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 16 03:28:07.423782 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 16 03:28:07.424054 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 16 03:28:07.425498 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... May 16 03:28:07.436701 lvm[1440]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 16 03:28:07.465303 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 16 03:28:07.475460 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 03:28:07.476087 systemd[1]: Reached target sysinit.target - System Initialization. May 16 03:28:07.476349 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
May 16 03:28:07.476454 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 16 03:28:07.476693 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 16 03:28:07.476831 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 16 03:28:07.476905 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 16 03:28:07.476972 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 16 03:28:07.476995 systemd[1]: Reached target paths.target - Path Units. May 16 03:28:07.477053 systemd[1]: Reached target timers.target - Timer Units. May 16 03:28:07.479842 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 16 03:28:07.481374 systemd[1]: Starting docker.socket - Docker Socket for the API... May 16 03:28:07.486527 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 16 03:28:07.488444 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 16 03:28:07.488621 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 16 03:28:07.492023 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 16 03:28:07.494732 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 16 03:28:07.499173 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 16 03:28:07.502404 systemd[1]: Reached target sockets.target - Socket Units. May 16 03:28:07.505813 systemd[1]: Reached target basic.target - Basic System. May 16 03:28:07.508288 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 16 03:28:07.508475 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 16 03:28:07.511589 systemd[1]: Starting containerd.service - containerd container runtime... May 16 03:28:07.526398 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 16 03:28:07.538402 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 16 03:28:07.546341 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 16 03:28:07.561901 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 16 03:28:07.564315 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 16 03:28:07.569900 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 16 03:28:07.576267 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 16 03:28:07.578686 jq[1452]: false May 16 03:28:07.583267 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 16 03:28:07.587820 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 16 03:28:07.597318 systemd[1]: Starting systemd-logind.service - User Login Management... May 16 03:28:07.599814 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 16 03:28:07.603297 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. 
See cgroup-compat debug messages for details. May 16 03:28:07.605315 systemd[1]: Starting update-engine.service - Update Engine... May 16 03:28:07.611548 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 16 03:28:07.614501 dbus-daemon[1449]: [system] SELinux support is enabled May 16 03:28:07.616257 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 16 03:28:07.619957 extend-filesystems[1453]: Found loop4 May 16 03:28:07.619957 extend-filesystems[1453]: Found loop5 May 16 03:28:07.619957 extend-filesystems[1453]: Found loop6 May 16 03:28:07.619957 extend-filesystems[1453]: Found loop7 May 16 03:28:07.619957 extend-filesystems[1453]: Found vda May 16 03:28:07.619957 extend-filesystems[1453]: Found vda1 May 16 03:28:07.619957 extend-filesystems[1453]: Found vda2 May 16 03:28:07.619957 extend-filesystems[1453]: Found vda3 May 16 03:28:07.619957 extend-filesystems[1453]: Found usr May 16 03:28:07.619957 extend-filesystems[1453]: Found vda4 May 16 03:28:07.619957 extend-filesystems[1453]: Found vda6 May 16 03:28:07.619957 extend-filesystems[1453]: Found vda7 May 16 03:28:07.619957 extend-filesystems[1453]: Found vda9 May 16 03:28:07.619957 extend-filesystems[1453]: Checking size of /dev/vda9 May 16 03:28:07.634550 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 16 03:28:07.634741 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 16 03:28:07.639274 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 16 03:28:07.639457 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 16 03:28:07.684343 jq[1462]: true May 16 03:28:07.651518 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 16 03:28:07.651548 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 16 03:28:07.656454 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 16 03:28:07.656477 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 16 03:28:07.696326 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (1410) May 16 03:28:07.696419 extend-filesystems[1453]: Resized partition /dev/vda9 May 16 03:28:07.709177 extend-filesystems[1486]: resize2fs 1.47.2 (1-Jan-2025) May 16 03:28:07.716113 tar[1468]: linux-amd64/LICENSE May 16 03:28:07.716113 tar[1468]: linux-amd64/helm May 16 03:28:07.719472 (ntainerd)[1476]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 16 03:28:07.727258 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks May 16 03:28:07.749038 update_engine[1461]: I20250516 03:28:07.748950 1461 main.cc:92] Flatcar Update Engine starting May 16 03:28:07.762439 update_engine[1461]: I20250516 03:28:07.761298 1461 update_check_scheduler.cc:74] Next update check in 3m35s May 16 03:28:07.761740 systemd[1]: motdgen.service: Deactivated successfully. May 16 03:28:07.762099 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
May 16 03:28:07.764886 systemd[1]: Started update-engine.service - Update Engine. May 16 03:28:07.770640 jq[1481]: true May 16 03:28:07.773414 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 16 03:28:07.781953 kernel: EXT4-fs (vda9): resized filesystem to 2014203 May 16 03:28:07.838492 extend-filesystems[1486]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 16 03:28:07.838492 extend-filesystems[1486]: old_desc_blocks = 1, new_desc_blocks = 1 May 16 03:28:07.838492 extend-filesystems[1486]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long. May 16 03:28:07.851663 extend-filesystems[1453]: Resized filesystem in /dev/vda9 May 16 03:28:07.838664 systemd[1]: extend-filesystems.service: Deactivated successfully. May 16 03:28:07.838876 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 16 03:28:07.851687 systemd-logind[1458]: New seat seat0. May 16 03:28:07.874986 systemd-logind[1458]: Watching system buttons on /dev/input/event1 (Power Button) May 16 03:28:07.875007 systemd-logind[1458]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 16 03:28:07.875211 systemd[1]: Started systemd-logind.service - User Login Management. May 16 03:28:07.880274 bash[1505]: Updated "/home/core/.ssh/authorized_keys" May 16 03:28:07.880708 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 16 03:28:07.890512 systemd[1]: Starting sshkeys.service... May 16 03:28:07.927034 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 16 03:28:07.932863 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 16 03:28:08.040780 locksmithd[1489]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 16 03:28:08.217047 containerd[1476]: time="2025-05-16T03:28:08Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 16 03:28:08.219504 containerd[1476]: time="2025-05-16T03:28:08.219461715Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 16 03:28:08.248530 containerd[1476]: time="2025-05-16T03:28:08.248466434Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.083µs" May 16 03:28:08.248662 containerd[1476]: time="2025-05-16T03:28:08.248644137Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 16 03:28:08.248737 containerd[1476]: time="2025-05-16T03:28:08.248721231Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 16 03:28:08.249384 containerd[1476]: time="2025-05-16T03:28:08.249364428Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 16 03:28:08.249466 containerd[1476]: time="2025-05-16T03:28:08.249450319Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 16 03:28:08.249539 containerd[1476]: time="2025-05-16T03:28:08.249524568Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 16 03:28:08.249651 containerd[1476]: time="2025-05-16T03:28:08.249632350Z" level=info msg="skip loading plugin" 
error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 16 03:28:08.249717 containerd[1476]: time="2025-05-16T03:28:08.249702482Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 16 03:28:08.250004 containerd[1476]: time="2025-05-16T03:28:08.249982998Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 16 03:28:08.250064 containerd[1476]: time="2025-05-16T03:28:08.250050424Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 16 03:28:08.250150 containerd[1476]: time="2025-05-16T03:28:08.250105878Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 16 03:28:08.251161 containerd[1476]: time="2025-05-16T03:28:08.250211947Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 16 03:28:08.251161 containerd[1476]: time="2025-05-16T03:28:08.250380373Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 16 03:28:08.251161 containerd[1476]: time="2025-05-16T03:28:08.250641563Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 16 03:28:08.251161 containerd[1476]: time="2025-05-16T03:28:08.250690174Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 16 03:28:08.251161 containerd[1476]: time="2025-05-16T03:28:08.250707637Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 16 03:28:08.251161 containerd[1476]: time="2025-05-16T03:28:08.250738936Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 16 03:28:08.251161 containerd[1476]: time="2025-05-16T03:28:08.250995988Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 16 03:28:08.251161 containerd[1476]: time="2025-05-16T03:28:08.251058625Z" level=info msg="metadata content store policy set" policy=shared May 16 03:28:08.260280 containerd[1476]: time="2025-05-16T03:28:08.260259351Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 16 03:28:08.260380 containerd[1476]: time="2025-05-16T03:28:08.260363897Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 16 03:28:08.260443 containerd[1476]: time="2025-05-16T03:28:08.260429099Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 16 03:28:08.260518 containerd[1476]: time="2025-05-16T03:28:08.260489363Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 16 03:28:08.260581 containerd[1476]: time="2025-05-16T03:28:08.260567158Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 16 03:28:08.260638 containerd[1476]: 
time="2025-05-16T03:28:08.260624937Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 16 03:28:08.260697 containerd[1476]: time="2025-05-16T03:28:08.260683527Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 16 03:28:08.260755 containerd[1476]: time="2025-05-16T03:28:08.260741826Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 16 03:28:08.260821 containerd[1476]: time="2025-05-16T03:28:08.260806207Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 16 03:28:08.260886 containerd[1476]: time="2025-05-16T03:28:08.260871639Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 16 03:28:08.260942 containerd[1476]: time="2025-05-16T03:28:08.260928416Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 16 03:28:08.261007 containerd[1476]: time="2025-05-16T03:28:08.260993067Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 16 03:28:08.261171 containerd[1476]: time="2025-05-16T03:28:08.261152947Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 16 03:28:08.261291 containerd[1476]: time="2025-05-16T03:28:08.261275687Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 16 03:28:08.261408 containerd[1476]: time="2025-05-16T03:28:08.261392416Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 16 03:28:08.261474 containerd[1476]: time="2025-05-16T03:28:08.261460794Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 16 03:28:08.261547 containerd[1476]: time="2025-05-16T03:28:08.261526127Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 16 03:28:08.261641 containerd[1476]: time="2025-05-16T03:28:08.261625874Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 16 03:28:08.261711 containerd[1476]: time="2025-05-16T03:28:08.261696406Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 16 03:28:08.261934 containerd[1476]: time="2025-05-16T03:28:08.261810130Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 16 03:28:08.261934 containerd[1476]: time="2025-05-16T03:28:08.261832882Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 16 03:28:08.261934 containerd[1476]: time="2025-05-16T03:28:08.261852239Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 16 03:28:08.261934 containerd[1476]: time="2025-05-16T03:28:08.261865744Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 16 03:28:08.262081 containerd[1476]: time="2025-05-16T03:28:08.262064146Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 16 03:28:08.263151 containerd[1476]: time="2025-05-16T03:28:08.262218856Z" level=info msg="Start snapshots syncer" May 16 03:28:08.263151 containerd[1476]: 
time="2025-05-16T03:28:08.262254303Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 16 03:28:08.263151 containerd[1476]: time="2025-05-16T03:28:08.262497859Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262552452Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262615731Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262700019Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262724114Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262736347Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262748079Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262762155Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262795668Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262812940Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local 
type=io.containerd.transfer.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262835332Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262850300Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262861572Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262895706Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 16 03:28:08.263621 containerd[1476]: time="2025-05-16T03:28:08.262910293Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.262920342Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.262932374Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.262942433Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.262957131Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.262969454Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.262985805Z" level=info msg="runtime interface created" May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.262992497Z" level=info msg="created NRI interface" May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.263001454Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.263013777Z" level=info msg="Connect containerd service" May 16 03:28:08.263899 containerd[1476]: time="2025-05-16T03:28:08.263039335Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 16 03:28:08.265091 containerd[1476]: time="2025-05-16T03:28:08.264731879Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 16 03:28:08.413175 sshd_keygen[1479]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 16 03:28:08.434217 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 16 03:28:08.440053 systemd[1]: Starting issuegen.service - Generate /run/issue... May 16 03:28:08.466446 systemd[1]: issuegen.service: Deactivated successfully. May 16 03:28:08.467226 systemd[1]: Finished issuegen.service - Generate /run/issue. May 16 03:28:08.473299 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
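The containerd error above ("no network config found in /etc/cni/net.d: cni plugin not initialized") is expected this early in boot: no pod network add-on has been installed yet, so the CNI configuration directory is empty and the CRI plugin defers pod network setup. Once a CNI plugin later writes a config there it is picked up automatically; a quick check on the node, assuming containerd's default CNI paths as shown in its config above, would be:

    # list installed CNI configs and binaries (both empty on a fresh node)
    ls /etc/cni/net.d
    ls /opt/cni/bin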
May 16 03:28:08.493183 containerd[1476]: time="2025-05-16T03:28:08.493151120Z" level=info msg="Start subscribing containerd event" May 16 03:28:08.493331 containerd[1476]: time="2025-05-16T03:28:08.493302093Z" level=info msg="Start recovering state" May 16 03:28:08.493675 containerd[1476]: time="2025-05-16T03:28:08.493659122Z" level=info msg="Start event monitor" May 16 03:28:08.493740 containerd[1476]: time="2025-05-16T03:28:08.493728282Z" level=info msg="Start cni network conf syncer for default" May 16 03:28:08.493797 containerd[1476]: time="2025-05-16T03:28:08.493785529Z" level=info msg="Start streaming server" May 16 03:28:08.493851 containerd[1476]: time="2025-05-16T03:28:08.493840232Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 16 03:28:08.493901 containerd[1476]: time="2025-05-16T03:28:08.493889845Z" level=info msg="runtime interface starting up..." May 16 03:28:08.493954 containerd[1476]: time="2025-05-16T03:28:08.493942424Z" level=info msg="starting plugins..." May 16 03:28:08.494047 containerd[1476]: time="2025-05-16T03:28:08.493575425Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 16 03:28:08.496181 containerd[1476]: time="2025-05-16T03:28:08.494008808Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 16 03:28:08.496181 containerd[1476]: time="2025-05-16T03:28:08.494098727Z" level=info msg=serving... address=/run/containerd/containerd.sock May 16 03:28:08.496181 containerd[1476]: time="2025-05-16T03:28:08.494247205Z" level=info msg="containerd successfully booted in 0.277824s" May 16 03:28:08.494573 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 16 03:28:08.500676 systemd[1]: Started containerd.service - containerd container runtime. May 16 03:28:08.505622 systemd[1]: Started getty@tty1.service - Getty on tty1. May 16 03:28:08.513544 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 16 03:28:08.515939 systemd[1]: Reached target getty.target - Login Prompts. May 16 03:28:08.572484 tar[1468]: linux-amd64/README.md May 16 03:28:08.588669 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 16 03:28:08.955408 systemd-networkd[1395]: eth0: Gained IPv6LL May 16 03:28:08.957268 systemd-timesyncd[1361]: Network configuration changed, trying to establish connection. May 16 03:28:08.961380 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 16 03:28:08.967692 systemd[1]: Reached target network-online.target - Network is Online. May 16 03:28:08.977940 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:28:08.999256 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 16 03:28:09.067302 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 16 03:28:09.537974 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 16 03:28:09.548125 systemd[1]: Started sshd@0-172.24.4.18:22-172.24.4.1:47576.service - OpenSSH per-connection server daemon (172.24.4.1:47576). May 16 03:28:10.521227 sshd[1570]: Accepted publickey for core from 172.24.4.1 port 47576 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:10.523099 sshd-session[1570]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:10.539934 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
May 16 03:28:10.552003 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 16 03:28:10.573738 systemd-logind[1458]: New session 1 of user core. May 16 03:28:10.590361 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 16 03:28:10.596852 systemd[1]: Starting user@500.service - User Manager for UID 500... May 16 03:28:10.611092 (systemd)[1575]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 16 03:28:10.613810 systemd-logind[1458]: New session c1 of user core. May 16 03:28:10.784529 systemd[1575]: Queued start job for default target default.target. May 16 03:28:10.791006 systemd[1575]: Created slice app.slice - User Application Slice. May 16 03:28:10.791029 systemd[1575]: Reached target paths.target - Paths. May 16 03:28:10.791068 systemd[1575]: Reached target timers.target - Timers. May 16 03:28:10.794218 systemd[1575]: Starting dbus.socket - D-Bus User Message Bus Socket... May 16 03:28:10.804355 systemd[1575]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 16 03:28:10.804477 systemd[1575]: Reached target sockets.target - Sockets. May 16 03:28:10.804532 systemd[1575]: Reached target basic.target - Basic System. May 16 03:28:10.804571 systemd[1575]: Reached target default.target - Main User Target. May 16 03:28:10.804598 systemd[1575]: Startup finished in 184ms. May 16 03:28:10.805003 systemd[1]: Started user@500.service - User Manager for UID 500. May 16 03:28:10.815352 systemd[1]: Started session-1.scope - Session 1 of User core. May 16 03:28:11.228900 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 16 03:28:11.240578 (kubelet)[1591]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 03:28:11.305495 systemd[1]: Started sshd@1-172.24.4.18:22-172.24.4.1:47578.service - OpenSSH per-connection server daemon (172.24.4.1:47578). May 16 03:28:12.360317 kubelet[1591]: E0516 03:28:12.360225 1591 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 03:28:12.363902 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 03:28:12.364063 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 03:28:12.364629 systemd[1]: kubelet.service: Consumed 2.194s CPU time, 266.4M memory peak. May 16 03:28:12.847639 sshd[1593]: Accepted publickey for core from 172.24.4.1 port 47578 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:12.849926 sshd-session[1593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:12.860986 systemd-logind[1458]: New session 2 of user core. May 16 03:28:12.873667 systemd[1]: Started session-2.scope - Session 2 of User core. May 16 03:28:13.507367 sshd[1601]: Connection closed by 172.24.4.1 port 47578 May 16 03:28:13.508469 sshd-session[1593]: pam_unix(sshd:session): session closed for user core May 16 03:28:13.528266 systemd[1]: sshd@1-172.24.4.18:22-172.24.4.1:47578.service: Deactivated successfully. May 16 03:28:13.531378 systemd[1]: session-2.scope: Deactivated successfully. May 16 03:28:13.535207 systemd-logind[1458]: Session 2 logged out. 
Waiting for processes to exit. May 16 03:28:13.538322 systemd[1]: Started sshd@2-172.24.4.18:22-172.24.4.1:47958.service - OpenSSH per-connection server daemon (172.24.4.1:47958). May 16 03:28:13.546885 systemd-logind[1458]: Removed session 2. May 16 03:28:13.617271 login[1551]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 16 03:28:13.622520 login[1550]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 16 03:28:13.624052 systemd-logind[1458]: New session 3 of user core. May 16 03:28:13.628362 systemd[1]: Started session-3.scope - Session 3 of User core. May 16 03:28:13.636288 systemd-logind[1458]: New session 4 of user core. May 16 03:28:13.642330 systemd[1]: Started session-4.scope - Session 4 of User core. May 16 03:28:14.612088 coreos-metadata[1448]: May 16 03:28:14.611 WARN failed to locate config-drive, using the metadata service API instead May 16 03:28:14.665920 coreos-metadata[1448]: May 16 03:28:14.665 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 May 16 03:28:14.954891 coreos-metadata[1448]: May 16 03:28:14.954 INFO Fetch successful May 16 03:28:14.954891 coreos-metadata[1448]: May 16 03:28:14.954 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 16 03:28:14.967890 coreos-metadata[1448]: May 16 03:28:14.967 INFO Fetch successful May 16 03:28:14.968017 coreos-metadata[1448]: May 16 03:28:14.967 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 May 16 03:28:14.983342 coreos-metadata[1448]: May 16 03:28:14.983 INFO Fetch successful May 16 03:28:14.983342 coreos-metadata[1448]: May 16 03:28:14.983 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 May 16 03:28:14.997451 coreos-metadata[1448]: May 16 03:28:14.997 INFO Fetch successful May 16 03:28:14.997451 coreos-metadata[1448]: May 16 03:28:14.997 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 May 16 03:28:15.011690 coreos-metadata[1448]: May 16 03:28:15.011 INFO Fetch successful May 16 03:28:15.011690 coreos-metadata[1448]: May 16 03:28:15.011 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 May 16 03:28:15.025509 coreos-metadata[1448]: May 16 03:28:15.025 INFO Fetch successful May 16 03:28:15.042549 coreos-metadata[1510]: May 16 03:28:15.042 WARN failed to locate config-drive, using the metadata service API instead May 16 03:28:15.087826 coreos-metadata[1510]: May 16 03:28:15.087 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 May 16 03:28:15.098515 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 16 03:28:15.100434 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 16 03:28:15.102036 sshd[1606]: Accepted publickey for core from 172.24.4.1 port 47958 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:15.103378 coreos-metadata[1510]: May 16 03:28:15.103 INFO Fetch successful May 16 03:28:15.103989 coreos-metadata[1510]: May 16 03:28:15.103 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 May 16 03:28:15.104690 sshd-session[1606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:15.113720 systemd-logind[1458]: New session 5 of user core. 
May 16 03:28:15.118071 coreos-metadata[1510]: May 16 03:28:15.117 INFO Fetch successful May 16 03:28:15.121600 systemd[1]: Started session-5.scope - Session 5 of User core. May 16 03:28:15.129966 unknown[1510]: wrote ssh authorized keys file for user: core May 16 03:28:15.175734 update-ssh-keys[1646]: Updated "/home/core/.ssh/authorized_keys" May 16 03:28:15.178085 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 16 03:28:15.180649 systemd[1]: Finished sshkeys.service. May 16 03:28:15.185994 systemd[1]: Reached target multi-user.target - Multi-User System. May 16 03:28:15.186317 systemd[1]: Startup finished in 1.256s (kernel) + 15.341s (initrd) + 11.101s (userspace) = 27.699s. May 16 03:28:15.781184 sshd[1644]: Connection closed by 172.24.4.1 port 47958 May 16 03:28:15.780573 sshd-session[1606]: pam_unix(sshd:session): session closed for user core May 16 03:28:15.786684 systemd[1]: sshd@2-172.24.4.18:22-172.24.4.1:47958.service: Deactivated successfully. May 16 03:28:15.790576 systemd[1]: session-5.scope: Deactivated successfully. May 16 03:28:15.793983 systemd-logind[1458]: Session 5 logged out. Waiting for processes to exit. May 16 03:28:15.796678 systemd-logind[1458]: Removed session 5. May 16 03:28:22.620737 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 16 03:28:22.632119 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:28:23.119713 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 16 03:28:23.134588 (kubelet)[1660]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 03:28:23.222222 kubelet[1660]: E0516 03:28:23.222109 1660 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 03:28:23.233728 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 03:28:23.234430 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 03:28:23.235600 systemd[1]: kubelet.service: Consumed 413ms CPU time, 108.6M memory peak. May 16 03:28:25.813535 systemd[1]: Started sshd@3-172.24.4.18:22-172.24.4.1:53314.service - OpenSSH per-connection server daemon (172.24.4.1:53314). May 16 03:28:26.959689 sshd[1669]: Accepted publickey for core from 172.24.4.1 port 53314 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:26.968420 sshd-session[1669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:26.999643 systemd-logind[1458]: New session 6 of user core. May 16 03:28:27.018641 systemd[1]: Started session-6.scope - Session 6 of User core. May 16 03:28:27.599478 sshd[1671]: Connection closed by 172.24.4.1 port 53314 May 16 03:28:27.600738 sshd-session[1669]: pam_unix(sshd:session): session closed for user core May 16 03:28:27.613576 systemd[1]: sshd@3-172.24.4.18:22-172.24.4.1:53314.service: Deactivated successfully. May 16 03:28:27.615979 systemd[1]: session-6.scope: Deactivated successfully. May 16 03:28:27.619526 systemd-logind[1458]: Session 6 logged out. Waiting for processes to exit. 
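The kubelet failure above ("open /var/lib/kubelet/config.yaml: no such file or directory") is the usual state of a node whose kubelet.service is enabled before it has been joined to a cluster: the config file is only written by kubeadm init or kubeadm join, so kubelet exits and systemd keeps rescheduling it (the restart counter climbs in the later entries). Assuming this node is meant to be joined with kubeadm, the step that creates the file would look like the following sketch, with placeholder values:

    # joining the node writes /var/lib/kubelet/config.yaml and restarts kubelet cleanly
    kubeadm join <api-server-endpoint>:6443 \
        --token <token> \
        --discovery-token-ca-cert-hash sha256:<hash>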
May 16 03:28:27.622922 systemd[1]: Started sshd@4-172.24.4.18:22-172.24.4.1:53328.service - OpenSSH per-connection server daemon (172.24.4.1:53328). May 16 03:28:27.625326 systemd-logind[1458]: Removed session 6. May 16 03:28:28.860817 sshd[1676]: Accepted publickey for core from 172.24.4.1 port 53328 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:28.864200 sshd-session[1676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:28.878752 systemd-logind[1458]: New session 7 of user core. May 16 03:28:28.886575 systemd[1]: Started session-7.scope - Session 7 of User core. May 16 03:28:29.737208 sshd[1679]: Connection closed by 172.24.4.1 port 53328 May 16 03:28:29.740589 sshd-session[1676]: pam_unix(sshd:session): session closed for user core May 16 03:28:29.756081 systemd[1]: sshd@4-172.24.4.18:22-172.24.4.1:53328.service: Deactivated successfully. May 16 03:28:29.760271 systemd[1]: session-7.scope: Deactivated successfully. May 16 03:28:29.762630 systemd-logind[1458]: Session 7 logged out. Waiting for processes to exit. May 16 03:28:29.769261 systemd[1]: Started sshd@5-172.24.4.18:22-172.24.4.1:53336.service - OpenSSH per-connection server daemon (172.24.4.1:53336). May 16 03:28:29.771533 systemd-logind[1458]: Removed session 7. May 16 03:28:31.506042 sshd[1684]: Accepted publickey for core from 172.24.4.1 port 53336 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:31.510128 sshd-session[1684]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:31.523836 systemd-logind[1458]: New session 8 of user core. May 16 03:28:31.534442 systemd[1]: Started session-8.scope - Session 8 of User core. May 16 03:28:32.212187 sshd[1687]: Connection closed by 172.24.4.1 port 53336 May 16 03:28:32.212634 sshd-session[1684]: pam_unix(sshd:session): session closed for user core May 16 03:28:32.232941 systemd[1]: sshd@5-172.24.4.18:22-172.24.4.1:53336.service: Deactivated successfully. May 16 03:28:32.237085 systemd[1]: session-8.scope: Deactivated successfully. May 16 03:28:32.239686 systemd-logind[1458]: Session 8 logged out. Waiting for processes to exit. May 16 03:28:32.245310 systemd[1]: Started sshd@6-172.24.4.18:22-172.24.4.1:53350.service - OpenSSH per-connection server daemon (172.24.4.1:53350). May 16 03:28:32.248534 systemd-logind[1458]: Removed session 8. May 16 03:28:33.433977 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 16 03:28:33.438544 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:28:33.647833 sshd[1692]: Accepted publickey for core from 172.24.4.1 port 53350 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:33.653712 sshd-session[1692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:33.668573 systemd-logind[1458]: New session 9 of user core. May 16 03:28:33.676316 systemd[1]: Started session-9.scope - Session 9 of User core. May 16 03:28:34.005241 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 16 03:28:34.017571 (kubelet)[1704]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 03:28:34.083432 sudo[1709]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 16 03:28:34.084320 sudo[1709]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 03:28:34.100198 sudo[1709]: pam_unix(sudo:session): session closed for user root May 16 03:28:34.116195 kubelet[1704]: E0516 03:28:34.115971 1704 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 03:28:34.118714 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 03:28:34.118862 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 03:28:34.119609 systemd[1]: kubelet.service: Consumed 597ms CPU time, 112.1M memory peak. May 16 03:28:34.317751 sshd[1698]: Connection closed by 172.24.4.1 port 53350 May 16 03:28:34.317277 sshd-session[1692]: pam_unix(sshd:session): session closed for user core May 16 03:28:34.339044 systemd[1]: sshd@6-172.24.4.18:22-172.24.4.1:53350.service: Deactivated successfully. May 16 03:28:34.342817 systemd[1]: session-9.scope: Deactivated successfully. May 16 03:28:34.347557 systemd-logind[1458]: Session 9 logged out. Waiting for processes to exit. May 16 03:28:34.350501 systemd[1]: Started sshd@7-172.24.4.18:22-172.24.4.1:47682.service - OpenSSH per-connection server daemon (172.24.4.1:47682). May 16 03:28:34.354123 systemd-logind[1458]: Removed session 9. May 16 03:28:35.670406 sshd[1716]: Accepted publickey for core from 172.24.4.1 port 47682 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:35.673517 sshd-session[1716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:35.686954 systemd-logind[1458]: New session 10 of user core. May 16 03:28:35.698429 systemd[1]: Started session-10.scope - Session 10 of User core. May 16 03:28:36.089680 sudo[1721]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 16 03:28:36.091265 sudo[1721]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 03:28:36.099658 sudo[1721]: pam_unix(sudo:session): session closed for user root May 16 03:28:36.113072 sudo[1720]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 16 03:28:36.113953 sudo[1720]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 03:28:36.140271 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 16 03:28:36.237217 augenrules[1743]: No rules May 16 03:28:36.239394 systemd[1]: audit-rules.service: Deactivated successfully. May 16 03:28:36.239864 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 16 03:28:36.243124 sudo[1720]: pam_unix(sudo:session): session closed for user root May 16 03:28:36.389019 sshd[1719]: Connection closed by 172.24.4.1 port 47682 May 16 03:28:36.388707 sshd-session[1716]: pam_unix(sshd:session): session closed for user core May 16 03:28:36.407291 systemd[1]: sshd@7-172.24.4.18:22-172.24.4.1:47682.service: Deactivated successfully. 
May 16 03:28:36.411114 systemd[1]: session-10.scope: Deactivated successfully. May 16 03:28:36.412966 systemd-logind[1458]: Session 10 logged out. Waiting for processes to exit. May 16 03:28:36.417884 systemd[1]: Started sshd@8-172.24.4.18:22-172.24.4.1:47688.service - OpenSSH per-connection server daemon (172.24.4.1:47688). May 16 03:28:36.421282 systemd-logind[1458]: Removed session 10. May 16 03:28:38.014505 sshd[1751]: Accepted publickey for core from 172.24.4.1 port 47688 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:28:38.018342 sshd-session[1751]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:28:38.030769 systemd-logind[1458]: New session 11 of user core. May 16 03:28:38.045506 systemd[1]: Started session-11.scope - Session 11 of User core. May 16 03:28:38.577859 sudo[1755]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 16 03:28:38.578629 sudo[1755]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 03:28:39.553945 systemd[1]: Starting docker.service - Docker Application Container Engine... May 16 03:28:39.569549 (dockerd)[1774]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 16 03:28:40.126254 dockerd[1774]: time="2025-05-16T03:28:40.126018564Z" level=info msg="Starting up" May 16 03:28:40.128085 dockerd[1774]: time="2025-05-16T03:28:40.127706941Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 16 03:28:40.231913 dockerd[1774]: time="2025-05-16T03:28:40.231828401Z" level=info msg="Loading containers: start." May 16 03:28:40.497202 kernel: Initializing XFRM netlink socket May 16 03:28:40.613118 systemd-networkd[1395]: docker0: Link UP May 16 03:28:40.692229 dockerd[1774]: time="2025-05-16T03:28:40.691968183Z" level=info msg="Loading containers: done." May 16 03:28:40.722971 dockerd[1774]: time="2025-05-16T03:28:40.722218998Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 16 03:28:40.722971 dockerd[1774]: time="2025-05-16T03:28:40.722403584Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 16 03:28:40.722971 dockerd[1774]: time="2025-05-16T03:28:40.722620852Z" level=info msg="Daemon has completed initialization" May 16 03:28:40.798959 dockerd[1774]: time="2025-05-16T03:28:40.797945774Z" level=info msg="API listen on /run/docker.sock" May 16 03:28:40.799674 systemd[1]: Started docker.service - Docker Application Container Engine. May 16 03:28:42.450313 containerd[1476]: time="2025-05-16T03:28:42.449896587Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\"" May 16 03:28:43.283752 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3714755472.mount: Deactivated successfully. May 16 03:28:44.185658 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 16 03:28:44.190666 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:28:44.415287 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 16 03:28:44.436537 (kubelet)[2030]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 03:28:44.500183 kubelet[2030]: E0516 03:28:44.499498 2030 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 03:28:44.501266 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 03:28:44.501448 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 03:28:44.501839 systemd[1]: kubelet.service: Consumed 239ms CPU time, 112M memory peak. May 16 03:28:45.750805 containerd[1476]: time="2025-05-16T03:28:45.750671870Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:45.751934 containerd[1476]: time="2025-05-16T03:28:45.751811107Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.5: active requests=0, bytes read=28797819" May 16 03:28:45.755169 containerd[1476]: time="2025-05-16T03:28:45.753626061Z" level=info msg="ImageCreate event name:\"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:45.757120 containerd[1476]: time="2025-05-16T03:28:45.757073527Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:45.758405 containerd[1476]: time="2025-05-16T03:28:45.758374166Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.5\" with image id \"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\", size \"28794611\" in 3.308170864s" May 16 03:28:45.758598 containerd[1476]: time="2025-05-16T03:28:45.758576085Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\" returns image reference \"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\"" May 16 03:28:45.760125 containerd[1476]: time="2025-05-16T03:28:45.760028428Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\"" May 16 03:28:48.269204 containerd[1476]: time="2025-05-16T03:28:48.267572126Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:48.269204 containerd[1476]: time="2025-05-16T03:28:48.268947205Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.5: active requests=0, bytes read=24782531" May 16 03:28:48.271280 containerd[1476]: time="2025-05-16T03:28:48.271188198Z" level=info msg="ImageCreate event name:\"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:48.278157 containerd[1476]: time="2025-05-16T03:28:48.278053584Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:48.281668 containerd[1476]: time="2025-05-16T03:28:48.281588143Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.5\" with image id \"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\", size \"26384363\" in 2.521217342s" May 16 03:28:48.282275 containerd[1476]: time="2025-05-16T03:28:48.281667472Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\" returns image reference \"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\"" May 16 03:28:48.283433 containerd[1476]: time="2025-05-16T03:28:48.283071084Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\"" May 16 03:28:49.432371 systemd-timesyncd[1361]: Timed out waiting for reply from 23.186.168.130:123 (2.flatcar.pool.ntp.org). May 16 03:28:50.666482 systemd-timesyncd[1361]: Contacted time server 66.42.86.174:123 (2.flatcar.pool.ntp.org). May 16 03:28:50.666680 systemd-timesyncd[1361]: Initial clock synchronization to Fri 2025-05-16 03:28:50.662936 UTC. May 16 03:28:50.666832 systemd-resolved[1339]: Clock change detected. Flushing caches. May 16 03:28:51.222557 containerd[1476]: time="2025-05-16T03:28:51.222418932Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:51.223715 containerd[1476]: time="2025-05-16T03:28:51.223594487Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.5: active requests=0, bytes read=19176071" May 16 03:28:51.225209 containerd[1476]: time="2025-05-16T03:28:51.225176944Z" level=info msg="ImageCreate event name:\"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:51.229366 containerd[1476]: time="2025-05-16T03:28:51.229290650Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:51.231005 containerd[1476]: time="2025-05-16T03:28:51.230337643Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.5\" with image id \"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\", size \"20777921\" in 1.81478601s" May 16 03:28:51.231005 containerd[1476]: time="2025-05-16T03:28:51.230373190Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\" returns image reference \"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\"" May 16 03:28:51.232298 containerd[1476]: time="2025-05-16T03:28:51.232268745Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\"" May 16 03:28:52.648633 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1647761454.mount: Deactivated successfully. 
May 16 03:28:53.215014 containerd[1476]: time="2025-05-16T03:28:53.214335280Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:53.215467 containerd[1476]: time="2025-05-16T03:28:53.215429142Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.5: active requests=0, bytes read=30892880" May 16 03:28:53.216623 containerd[1476]: time="2025-05-16T03:28:53.216598034Z" level=info msg="ImageCreate event name:\"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:53.218779 containerd[1476]: time="2025-05-16T03:28:53.218755561Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:53.219419 containerd[1476]: time="2025-05-16T03:28:53.219372568Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.5\" with image id \"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\", repo tag \"registry.k8s.io/kube-proxy:v1.32.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\", size \"30891891\" in 1.987065361s" May 16 03:28:53.219482 containerd[1476]: time="2025-05-16T03:28:53.219425878Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\" returns image reference \"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\"" May 16 03:28:53.220372 containerd[1476]: time="2025-05-16T03:28:53.220352415Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 16 03:28:53.832106 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1476501918.mount: Deactivated successfully. May 16 03:28:54.633376 update_engine[1461]: I20250516 03:28:54.632927 1461 update_attempter.cc:509] Updating boot flags... 
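update_engine's "Updating boot flags" above refers to Flatcar's A/B update scheme: after a successful boot it marks the active USR partition as good by adjusting GPT priority/successful attributes. Assuming the cgpt tool shipped with Flatcar, those attributes can be inspected with:

    # show GPT partition attributes, including priority/tries/successful flags
    cgpt show /dev/vda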
May 16 03:28:54.914077 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (2080) May 16 03:28:54.995198 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (2080) May 16 03:28:55.558058 containerd[1476]: time="2025-05-16T03:28:55.557786222Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:55.559671 containerd[1476]: time="2025-05-16T03:28:55.559337261Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" May 16 03:28:55.561079 containerd[1476]: time="2025-05-16T03:28:55.561046847Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:55.565349 containerd[1476]: time="2025-05-16T03:28:55.565285437Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:28:55.567254 containerd[1476]: time="2025-05-16T03:28:55.567205658Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.346614916s" May 16 03:28:55.567254 containerd[1476]: time="2025-05-16T03:28:55.567241856Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 16 03:28:55.567830 containerd[1476]: time="2025-05-16T03:28:55.567792819Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 16 03:28:55.815668 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 16 03:28:55.822296 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:28:56.121789 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 16 03:28:56.136421 (kubelet)[2130]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 03:28:56.265539 kubelet[2130]: E0516 03:28:56.265388 2130 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 03:28:56.270949 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 03:28:56.271352 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 03:28:56.272297 systemd[1]: kubelet.service: Consumed 371ms CPU time, 110.2M memory peak. May 16 03:28:56.865224 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount366609660.mount: Deactivated successfully. 
May 16 03:28:56.877874 containerd[1476]: time="2025-05-16T03:28:56.877740356Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 16 03:28:56.879772 containerd[1476]: time="2025-05-16T03:28:56.879664184Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 16 03:28:56.881274 containerd[1476]: time="2025-05-16T03:28:56.881126276Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 16 03:28:56.886110 containerd[1476]: time="2025-05-16T03:28:56.885917362Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 16 03:28:56.889042 containerd[1476]: time="2025-05-16T03:28:56.887813658Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.31997872s" May 16 03:28:56.889042 containerd[1476]: time="2025-05-16T03:28:56.887891133Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 16 03:28:56.889871 containerd[1476]: time="2025-05-16T03:28:56.889814531Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" May 16 03:28:57.594306 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2783428386.mount: Deactivated successfully. 
May 16 03:29:00.404610 containerd[1476]: time="2025-05-16T03:29:00.404517716Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:00.405962 containerd[1476]: time="2025-05-16T03:29:00.405896411Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551368" May 16 03:29:00.407389 containerd[1476]: time="2025-05-16T03:29:00.407335671Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:00.411122 containerd[1476]: time="2025-05-16T03:29:00.411049877Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:00.412546 containerd[1476]: time="2025-05-16T03:29:00.412214371Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.522336641s" May 16 03:29:00.412546 containerd[1476]: time="2025-05-16T03:29:00.412261910Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" May 16 03:29:04.462654 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 16 03:29:04.463002 systemd[1]: kubelet.service: Consumed 371ms CPU time, 110.2M memory peak. May 16 03:29:04.469375 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:29:04.520013 systemd[1]: Reload requested from client PID 2224 ('systemctl') (unit session-11.scope)... May 16 03:29:04.520064 systemd[1]: Reloading... May 16 03:29:04.675039 zram_generator::config[2273]: No configuration found. May 16 03:29:04.827590 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 16 03:29:04.960514 systemd[1]: Reloading finished in 439 ms. May 16 03:29:05.033646 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:29:05.037795 systemd[1]: kubelet.service: Deactivated successfully. May 16 03:29:05.038039 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 16 03:29:05.038086 systemd[1]: kubelet.service: Consumed 361ms CPU time, 98.3M memory peak. May 16 03:29:05.039921 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:29:05.191814 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 16 03:29:05.202258 (kubelet)[2339]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 16 03:29:05.267798 kubelet[2339]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 16 03:29:05.267798 kubelet[2339]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. May 16 03:29:05.267798 kubelet[2339]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 16 03:29:05.268801 kubelet[2339]: I0516 03:29:05.267901 2339 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 16 03:29:05.771242 kubelet[2339]: I0516 03:29:05.770857 2339 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 16 03:29:05.771242 kubelet[2339]: I0516 03:29:05.770889 2339 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 16 03:29:05.771242 kubelet[2339]: I0516 03:29:05.771225 2339 server.go:954] "Client rotation is on, will bootstrap in background" May 16 03:29:06.943960 kubelet[2339]: E0516 03:29:06.943835 2339 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.24.4.18:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.24.4.18:6443: connect: connection refused" logger="UnhandledError" May 16 03:29:06.966849 kubelet[2339]: I0516 03:29:06.966475 2339 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 16 03:29:07.014887 kubelet[2339]: I0516 03:29:07.014798 2339 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 16 03:29:07.027179 kubelet[2339]: I0516 03:29:07.027078 2339 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 16 03:29:07.031445 kubelet[2339]: I0516 03:29:07.031309 2339 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 16 03:29:07.032132 kubelet[2339]: I0516 03:29:07.031404 2339 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-34cf5e3c62.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 16 03:29:07.033194 kubelet[2339]: I0516 03:29:07.032174 2339 topology_manager.go:138] "Creating topology manager with none policy" May 16 03:29:07.033194 kubelet[2339]: I0516 03:29:07.032234 2339 container_manager_linux.go:304] "Creating device plugin manager" May 16 03:29:07.033194 kubelet[2339]: I0516 03:29:07.032651 2339 state_mem.go:36] "Initialized new in-memory state store" May 16 03:29:07.044426 kubelet[2339]: I0516 03:29:07.044070 2339 kubelet.go:446] "Attempting to sync node with API server" May 16 03:29:07.050865 kubelet[2339]: I0516 03:29:07.050112 2339 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 16 03:29:07.050865 kubelet[2339]: I0516 03:29:07.050275 2339 kubelet.go:352] "Adding apiserver pod source" May 16 03:29:07.050865 kubelet[2339]: I0516 03:29:07.050354 2339 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 16 03:29:07.056353 kubelet[2339]: W0516 03:29:07.055049 2339 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.18:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-34cf5e3c62.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.18:6443: connect: connection refused May 16 03:29:07.056353 kubelet[2339]: E0516 03:29:07.055241 2339 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.24.4.18:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-34cf5e3c62.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.18:6443: connect: connection refused" logger="UnhandledError" May 
16 03:29:07.057217 kubelet[2339]: W0516 03:29:07.057164 2339 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.18:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.18:6443: connect: connection refused May 16 03:29:07.057521 kubelet[2339]: E0516 03:29:07.057469 2339 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.18:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.18:6443: connect: connection refused" logger="UnhandledError" May 16 03:29:07.058782 kubelet[2339]: I0516 03:29:07.058729 2339 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 16 03:29:07.060489 kubelet[2339]: I0516 03:29:07.060444 2339 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 16 03:29:07.062790 kubelet[2339]: W0516 03:29:07.062746 2339 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 16 03:29:07.073902 kubelet[2339]: I0516 03:29:07.073847 2339 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 16 03:29:07.074309 kubelet[2339]: I0516 03:29:07.074279 2339 server.go:1287] "Started kubelet" May 16 03:29:07.080909 kubelet[2339]: I0516 03:29:07.079736 2339 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 16 03:29:07.083492 kubelet[2339]: I0516 03:29:07.082604 2339 server.go:479] "Adding debug handlers to kubelet server" May 16 03:29:07.092431 kubelet[2339]: I0516 03:29:07.092266 2339 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 16 03:29:07.093241 kubelet[2339]: I0516 03:29:07.093202 2339 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 16 03:29:07.094411 kubelet[2339]: I0516 03:29:07.094373 2339 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 16 03:29:07.098488 kubelet[2339]: E0516 03:29:07.093833 2339 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.18:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.18:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-n-34cf5e3c62.novalocal.183fe440b8e6c8e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-34cf5e3c62.novalocal,UID:ci-4284-0-0-n-34cf5e3c62.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-34cf5e3c62.novalocal,},FirstTimestamp:2025-05-16 03:29:07.07421412 +0000 UTC m=+1.858403617,LastTimestamp:2025-05-16 03:29:07.07421412 +0000 UTC m=+1.858403617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-34cf5e3c62.novalocal,}" May 16 03:29:07.102082 kubelet[2339]: I0516 03:29:07.101966 2339 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 16 03:29:07.108367 kubelet[2339]: I0516 03:29:07.108302 2339 volume_manager.go:297] "Starting Kubelet Volume Manager" 
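The reflector, certificate-manager, and event-post failures in the entries above all share one symptom: every TCP dial to 172.24.4.18:6443 is refused, because the kube-apiserver static pod that should answer on that port has not been started yet. As an illustrative aside (not part of the captured log), a minimal Python sketch of the same reachability check, a plain TCP connect with no TLS or client auth, using the endpoint quoted in those errors:

```python
import socket

def api_server_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        # kubelet reports this condition as "dial tcp <host>:<port>: connect: connection refused"
        print(f"connect to {host}:{port} failed: {exc}")
        return False

if __name__ == "__main__":
    # Endpoint taken from the reflector errors above.
    print(api_server_reachable("172.24.4.18", 6443))
```

Until the apiserver container started later in this log begins listening on 6443, a probe like this fails immediately, just as the kubelet's list/watch and CSR calls do.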
May 16 03:29:07.108905 kubelet[2339]: E0516 03:29:07.108853 2339 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" May 16 03:29:07.110476 kubelet[2339]: I0516 03:29:07.110426 2339 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 16 03:29:07.110621 kubelet[2339]: I0516 03:29:07.110552 2339 reconciler.go:26] "Reconciler: start to sync state" May 16 03:29:07.111405 kubelet[2339]: W0516 03:29:07.111309 2339 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.18:6443: connect: connection refused May 16 03:29:07.111536 kubelet[2339]: E0516 03:29:07.111413 2339 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.18:6443: connect: connection refused" logger="UnhandledError" May 16 03:29:07.111642 kubelet[2339]: E0516 03:29:07.111556 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-34cf5e3c62.novalocal?timeout=10s\": dial tcp 172.24.4.18:6443: connect: connection refused" interval="200ms" May 16 03:29:07.113791 kubelet[2339]: I0516 03:29:07.113366 2339 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 16 03:29:07.116065 kubelet[2339]: I0516 03:29:07.115540 2339 factory.go:221] Registration of the containerd container factory successfully May 16 03:29:07.116065 kubelet[2339]: I0516 03:29:07.115576 2339 factory.go:221] Registration of the systemd container factory successfully May 16 03:29:07.136140 kubelet[2339]: I0516 03:29:07.136072 2339 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 16 03:29:07.138885 kubelet[2339]: E0516 03:29:07.138854 2339 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 16 03:29:07.142140 kubelet[2339]: I0516 03:29:07.142112 2339 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 16 03:29:07.142621 kubelet[2339]: I0516 03:29:07.142302 2339 status_manager.go:227] "Starting to sync pod status with apiserver" May 16 03:29:07.142621 kubelet[2339]: I0516 03:29:07.142353 2339 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 16 03:29:07.142621 kubelet[2339]: I0516 03:29:07.142368 2339 kubelet.go:2382] "Starting kubelet main sync loop" May 16 03:29:07.142621 kubelet[2339]: E0516 03:29:07.142432 2339 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 16 03:29:07.145938 kubelet[2339]: W0516 03:29:07.145895 2339 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.18:6443: connect: connection refused May 16 03:29:07.146372 kubelet[2339]: E0516 03:29:07.145938 2339 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.18:6443: connect: connection refused" logger="UnhandledError" May 16 03:29:07.146955 kubelet[2339]: I0516 03:29:07.146705 2339 cpu_manager.go:221] "Starting CPU manager" policy="none" May 16 03:29:07.146955 kubelet[2339]: I0516 03:29:07.146721 2339 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 16 03:29:07.146955 kubelet[2339]: I0516 03:29:07.146749 2339 state_mem.go:36] "Initialized new in-memory state store" May 16 03:29:07.159006 kubelet[2339]: I0516 03:29:07.158962 2339 policy_none.go:49] "None policy: Start" May 16 03:29:07.159066 kubelet[2339]: I0516 03:29:07.159025 2339 memory_manager.go:186] "Starting memorymanager" policy="None" May 16 03:29:07.159066 kubelet[2339]: I0516 03:29:07.159048 2339 state_mem.go:35] "Initializing new in-memory state store" May 16 03:29:07.166187 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 16 03:29:07.180937 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 16 03:29:07.184417 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 16 03:29:07.193960 kubelet[2339]: I0516 03:29:07.193925 2339 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 16 03:29:07.195288 kubelet[2339]: I0516 03:29:07.194157 2339 eviction_manager.go:189] "Eviction manager: starting control loop" May 16 03:29:07.195288 kubelet[2339]: I0516 03:29:07.194182 2339 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 16 03:29:07.195288 kubelet[2339]: I0516 03:29:07.195142 2339 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 16 03:29:07.198277 kubelet[2339]: E0516 03:29:07.198230 2339 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 16 03:29:07.198477 kubelet[2339]: E0516 03:29:07.198455 2339 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" May 16 03:29:07.268060 systemd[1]: Created slice kubepods-burstable-pod20aa906a118e6b0ed5f5319dedbd16e0.slice - libcontainer container kubepods-burstable-pod20aa906a118e6b0ed5f5319dedbd16e0.slice. 
May 16 03:29:07.285623 kubelet[2339]: E0516 03:29:07.285491 2339 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.292844 systemd[1]: Created slice kubepods-burstable-pode632c0389afc7e67a331931f852518d3.slice - libcontainer container kubepods-burstable-pode632c0389afc7e67a331931f852518d3.slice. May 16 03:29:07.298407 kubelet[2339]: I0516 03:29:07.298332 2339 kubelet_node_status.go:75] "Attempting to register node" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.299375 kubelet[2339]: E0516 03:29:07.299315 2339 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.24.4.18:6443/api/v1/nodes\": dial tcp 172.24.4.18:6443: connect: connection refused" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.303270 kubelet[2339]: E0516 03:29:07.303195 2339 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.311326 systemd[1]: Created slice kubepods-burstable-pod1beeb9e74854c2eec2d4bac81830b4ff.slice - libcontainer container kubepods-burstable-pod1beeb9e74854c2eec2d4bac81830b4ff.slice. May 16 03:29:07.312965 kubelet[2339]: I0516 03:29:07.311631 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/20aa906a118e6b0ed5f5319dedbd16e0-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"20aa906a118e6b0ed5f5319dedbd16e0\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.312965 kubelet[2339]: I0516 03:29:07.311707 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/20aa906a118e6b0ed5f5319dedbd16e0-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"20aa906a118e6b0ed5f5319dedbd16e0\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.312965 kubelet[2339]: I0516 03:29:07.311775 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.312965 kubelet[2339]: I0516 03:29:07.311832 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.312965 kubelet[2339]: I0516 03:29:07.311888 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1beeb9e74854c2eec2d4bac81830b4ff-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"1beeb9e74854c2eec2d4bac81830b4ff\") " 
pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.313795 kubelet[2339]: I0516 03:29:07.311943 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/20aa906a118e6b0ed5f5319dedbd16e0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"20aa906a118e6b0ed5f5319dedbd16e0\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.313795 kubelet[2339]: I0516 03:29:07.312039 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.313795 kubelet[2339]: I0516 03:29:07.312095 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.313795 kubelet[2339]: I0516 03:29:07.312157 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.315513 kubelet[2339]: E0516 03:29:07.315450 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-34cf5e3c62.novalocal?timeout=10s\": dial tcp 172.24.4.18:6443: connect: connection refused" interval="400ms" May 16 03:29:07.318507 kubelet[2339]: E0516 03:29:07.318455 2339 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.502183 kubelet[2339]: I0516 03:29:07.502031 2339 kubelet_node_status.go:75] "Attempting to register node" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.514605 kubelet[2339]: E0516 03:29:07.502425 2339 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.24.4.18:6443/api/v1/nodes\": dial tcp 172.24.4.18:6443: connect: connection refused" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.589041 containerd[1476]: time="2025-05-16T03:29:07.588449598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal,Uid:20aa906a118e6b0ed5f5319dedbd16e0,Namespace:kube-system,Attempt:0,}" May 16 03:29:07.605579 containerd[1476]: time="2025-05-16T03:29:07.605463137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal,Uid:e632c0389afc7e67a331931f852518d3,Namespace:kube-system,Attempt:0,}" May 16 03:29:07.626063 containerd[1476]: 
time="2025-05-16T03:29:07.625698148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal,Uid:1beeb9e74854c2eec2d4bac81830b4ff,Namespace:kube-system,Attempt:0,}" May 16 03:29:07.716946 kubelet[2339]: E0516 03:29:07.716882 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-34cf5e3c62.novalocal?timeout=10s\": dial tcp 172.24.4.18:6443: connect: connection refused" interval="800ms" May 16 03:29:07.730017 containerd[1476]: time="2025-05-16T03:29:07.729543480Z" level=info msg="connecting to shim 63f733cbadf97bdb24f727cc66f98d8df03b43fc083df73575dabb724babc0b6" address="unix:///run/containerd/s/83bf1aaa452f4b3d7dd09b2f32849c191bf6bd82ce6d81a5f96663131b8c8d99" namespace=k8s.io protocol=ttrpc version=3 May 16 03:29:07.731079 containerd[1476]: time="2025-05-16T03:29:07.731032873Z" level=info msg="connecting to shim a47a87742e348af182193a202bbc43ab5bb7c21e022981a556586551fe0a357b" address="unix:///run/containerd/s/47c88052571a2fbb76514c14b652cbef63c2076038d21121110574265c786d36" namespace=k8s.io protocol=ttrpc version=3 May 16 03:29:07.781758 containerd[1476]: time="2025-05-16T03:29:07.781627163Z" level=info msg="connecting to shim 98c4672d3bb7acca896d4d379a5d6b6b1f0f6b51f81e2888fa9b6bc1ebfaa969" address="unix:///run/containerd/s/1ca0f755ee85031e396901907b1824046751f3a5810713f42751993127a16ada" namespace=k8s.io protocol=ttrpc version=3 May 16 03:29:07.785120 systemd[1]: Started cri-containerd-a47a87742e348af182193a202bbc43ab5bb7c21e022981a556586551fe0a357b.scope - libcontainer container a47a87742e348af182193a202bbc43ab5bb7c21e022981a556586551fe0a357b. May 16 03:29:07.796050 systemd[1]: Started cri-containerd-63f733cbadf97bdb24f727cc66f98d8df03b43fc083df73575dabb724babc0b6.scope - libcontainer container 63f733cbadf97bdb24f727cc66f98d8df03b43fc083df73575dabb724babc0b6. May 16 03:29:07.833375 systemd[1]: Started cri-containerd-98c4672d3bb7acca896d4d379a5d6b6b1f0f6b51f81e2888fa9b6bc1ebfaa969.scope - libcontainer container 98c4672d3bb7acca896d4d379a5d6b6b1f0f6b51f81e2888fa9b6bc1ebfaa969. 
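A side note on the lease errors above: the retry interval doubles on each failure, 200ms, then 400ms, then 800ms, which is a standard capped exponential backoff. A rough Python sketch of that pattern follows; the base, factor, and cap are illustrative assumptions, since the log only shows the first three intervals.

```python
def backoff_intervals(base: float = 0.2, factor: float = 2.0, cap: float = 7.0, attempts: int = 6):
    """Yield retry delays that double each attempt: 0.2s, 0.4s, 0.8s, ... capped at `cap` seconds."""
    delay = base
    for _ in range(attempts):
        yield min(delay, cap)
        delay *= factor

for attempt, delay in enumerate(backoff_intervals(), start=1):
    # A real retry loop would sleep for `delay` seconds before the next lease request.
    print(f"attempt {attempt}: retry lease creation in {delay:.1f}s")
```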
May 16 03:29:07.896902 containerd[1476]: time="2025-05-16T03:29:07.896716673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal,Uid:20aa906a118e6b0ed5f5319dedbd16e0,Namespace:kube-system,Attempt:0,} returns sandbox id \"a47a87742e348af182193a202bbc43ab5bb7c21e022981a556586551fe0a357b\"" May 16 03:29:07.904465 containerd[1476]: time="2025-05-16T03:29:07.904420091Z" level=info msg="CreateContainer within sandbox \"a47a87742e348af182193a202bbc43ab5bb7c21e022981a556586551fe0a357b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 16 03:29:07.906656 kubelet[2339]: I0516 03:29:07.906563 2339 kubelet_node_status.go:75] "Attempting to register node" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.907648 kubelet[2339]: E0516 03:29:07.907616 2339 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.24.4.18:6443/api/v1/nodes\": dial tcp 172.24.4.18:6443: connect: connection refused" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:07.922569 containerd[1476]: time="2025-05-16T03:29:07.922512884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal,Uid:e632c0389afc7e67a331931f852518d3,Namespace:kube-system,Attempt:0,} returns sandbox id \"63f733cbadf97bdb24f727cc66f98d8df03b43fc083df73575dabb724babc0b6\"" May 16 03:29:07.925966 containerd[1476]: time="2025-05-16T03:29:07.925707475Z" level=info msg="CreateContainer within sandbox \"63f733cbadf97bdb24f727cc66f98d8df03b43fc083df73575dabb724babc0b6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 16 03:29:07.926751 containerd[1476]: time="2025-05-16T03:29:07.926701960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal,Uid:1beeb9e74854c2eec2d4bac81830b4ff,Namespace:kube-system,Attempt:0,} returns sandbox id \"98c4672d3bb7acca896d4d379a5d6b6b1f0f6b51f81e2888fa9b6bc1ebfaa969\"" May 16 03:29:07.929709 containerd[1476]: time="2025-05-16T03:29:07.929672251Z" level=info msg="CreateContainer within sandbox \"98c4672d3bb7acca896d4d379a5d6b6b1f0f6b51f81e2888fa9b6bc1ebfaa969\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 16 03:29:07.933319 containerd[1476]: time="2025-05-16T03:29:07.933166274Z" level=info msg="Container 99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f: CDI devices from CRI Config.CDIDevices: []" May 16 03:29:07.952448 containerd[1476]: time="2025-05-16T03:29:07.952397693Z" level=info msg="CreateContainer within sandbox \"a47a87742e348af182193a202bbc43ab5bb7c21e022981a556586551fe0a357b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f\"" May 16 03:29:07.953069 containerd[1476]: time="2025-05-16T03:29:07.953044436Z" level=info msg="StartContainer for \"99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f\"" May 16 03:29:07.956077 containerd[1476]: time="2025-05-16T03:29:07.955974721Z" level=info msg="Container aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f: CDI devices from CRI Config.CDIDevices: []" May 16 03:29:07.956077 containerd[1476]: time="2025-05-16T03:29:07.956028993Z" level=info msg="connecting to shim 99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f" address="unix:///run/containerd/s/47c88052571a2fbb76514c14b652cbef63c2076038d21121110574265c786d36" protocol=ttrpc version=3 
May 16 03:29:07.969120 containerd[1476]: time="2025-05-16T03:29:07.969063229Z" level=info msg="Container 6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe: CDI devices from CRI Config.CDIDevices: []" May 16 03:29:07.980188 systemd[1]: Started cri-containerd-99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f.scope - libcontainer container 99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f. May 16 03:29:07.985921 kubelet[2339]: W0516 03:29:07.985834 2339 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.18:6443: connect: connection refused May 16 03:29:07.986349 containerd[1476]: time="2025-05-16T03:29:07.985934801Z" level=info msg="CreateContainer within sandbox \"63f733cbadf97bdb24f727cc66f98d8df03b43fc083df73575dabb724babc0b6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f\"" May 16 03:29:07.987029 kubelet[2339]: E0516 03:29:07.985899 2339 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.18:6443: connect: connection refused" logger="UnhandledError" May 16 03:29:07.987238 containerd[1476]: time="2025-05-16T03:29:07.986810704Z" level=info msg="StartContainer for \"aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f\"" May 16 03:29:07.990117 containerd[1476]: time="2025-05-16T03:29:07.990061331Z" level=info msg="connecting to shim aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f" address="unix:///run/containerd/s/83bf1aaa452f4b3d7dd09b2f32849c191bf6bd82ce6d81a5f96663131b8c8d99" protocol=ttrpc version=3 May 16 03:29:08.003494 containerd[1476]: time="2025-05-16T03:29:08.003344734Z" level=info msg="CreateContainer within sandbox \"98c4672d3bb7acca896d4d379a5d6b6b1f0f6b51f81e2888fa9b6bc1ebfaa969\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe\"" May 16 03:29:08.005165 containerd[1476]: time="2025-05-16T03:29:08.005083044Z" level=info msg="StartContainer for \"6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe\"" May 16 03:29:08.007080 containerd[1476]: time="2025-05-16T03:29:08.006793131Z" level=info msg="connecting to shim 6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe" address="unix:///run/containerd/s/1ca0f755ee85031e396901907b1824046751f3a5810713f42751993127a16ada" protocol=ttrpc version=3 May 16 03:29:08.019503 systemd[1]: Started cri-containerd-aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f.scope - libcontainer container aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f. May 16 03:29:08.036155 systemd[1]: Started cri-containerd-6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe.scope - libcontainer container 6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe. 
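Each "connecting to shim" entry above ties a container ID to the unix socket of its containerd shim. When digging through a capture like this one, a short Python sketch with a regular expression can pull those pairs out; the two sample lines are copied from the entries above, and the pattern is tuned to this log format only.

```python
import re

SHIM_RE = re.compile(r'connecting to shim (?P<id>[0-9a-f]+)" address="(?P<address>unix://[^"]+)"')

sample = '''
time="2025-05-16T03:29:07.956028993Z" level=info msg="connecting to shim 99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f" address="unix:///run/containerd/s/47c88052571a2fbb76514c14b652cbef63c2076038d21121110574265c786d36" protocol=ttrpc version=3
time="2025-05-16T03:29:08.006793131Z" level=info msg="connecting to shim 6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe" address="unix:///run/containerd/s/1ca0f755ee85031e396901907b1824046751f3a5810713f42751993127a16ada" protocol=ttrpc version=3
'''

for match in SHIM_RE.finditer(sample):
    # Print a short container ID prefix and the shim socket it is served from.
    print(match.group("id")[:12], "->", match.group("address"))
```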
May 16 03:29:08.093476 containerd[1476]: time="2025-05-16T03:29:08.093414480Z" level=info msg="StartContainer for \"99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f\" returns successfully" May 16 03:29:08.117415 kubelet[2339]: W0516 03:29:08.117306 2339 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.18:6443: connect: connection refused May 16 03:29:08.117415 kubelet[2339]: E0516 03:29:08.117413 2339 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.18:6443: connect: connection refused" logger="UnhandledError" May 16 03:29:08.143971 containerd[1476]: time="2025-05-16T03:29:08.143432518Z" level=info msg="StartContainer for \"aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f\" returns successfully" May 16 03:29:08.180010 containerd[1476]: time="2025-05-16T03:29:08.179064336Z" level=info msg="StartContainer for \"6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe\" returns successfully" May 16 03:29:08.184357 kubelet[2339]: E0516 03:29:08.184324 2339 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:08.192375 kubelet[2339]: E0516 03:29:08.192334 2339 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:08.709480 kubelet[2339]: I0516 03:29:08.709447 2339 kubelet_node_status.go:75] "Attempting to register node" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:09.198256 kubelet[2339]: E0516 03:29:09.198222 2339 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:09.198622 kubelet[2339]: E0516 03:29:09.198603 2339 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.060686 kubelet[2339]: I0516 03:29:10.060476 2339 apiserver.go:52] "Watching apiserver" May 16 03:29:10.084669 kubelet[2339]: E0516 03:29:10.084620 2339 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.111404 kubelet[2339]: I0516 03:29:10.111321 2339 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 16 03:29:10.187298 kubelet[2339]: I0516 03:29:10.187032 2339 kubelet_node_status.go:78] "Successfully registered node" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.199384 kubelet[2339]: I0516 03:29:10.199335 2339 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.210045 kubelet[2339]: I0516 03:29:10.210007 2339 kubelet.go:3194] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.218469 kubelet[2339]: E0516 03:29:10.218404 2339 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.219517 kubelet[2339]: E0516 03:29:10.219244 2339 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.219517 kubelet[2339]: I0516 03:29:10.219267 2339 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.221821 kubelet[2339]: E0516 03:29:10.221630 2339 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.221821 kubelet[2339]: I0516 03:29:10.221674 2339 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:10.223677 kubelet[2339]: E0516 03:29:10.223638 2339 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:11.203048 kubelet[2339]: I0516 03:29:11.202901 2339 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:11.215237 kubelet[2339]: W0516 03:29:11.214160 2339 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 16 03:29:12.962921 systemd[1]: Reload requested from client PID 2606 ('systemctl') (unit session-11.scope)... May 16 03:29:12.964285 systemd[1]: Reloading... May 16 03:29:13.130106 zram_generator::config[2648]: No configuration found. May 16 03:29:13.317397 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 16 03:29:13.492968 systemd[1]: Reloading finished in 527 ms. May 16 03:29:13.520869 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:29:13.537631 systemd[1]: kubelet.service: Deactivated successfully. May 16 03:29:13.538221 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 16 03:29:13.538437 systemd[1]: kubelet.service: Consumed 1.385s CPU time, 133.2M memory peak. May 16 03:29:13.542963 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 03:29:13.852640 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 16 03:29:13.859337 (kubelet)[2715]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 16 03:29:13.923233 kubelet[2715]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 16 03:29:13.923233 kubelet[2715]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 16 03:29:13.923233 kubelet[2715]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 16 03:29:13.923233 kubelet[2715]: I0516 03:29:13.923210 2715 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 16 03:29:13.930526 kubelet[2715]: I0516 03:29:13.930474 2715 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 16 03:29:13.930526 kubelet[2715]: I0516 03:29:13.930501 2715 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 16 03:29:13.930798 kubelet[2715]: I0516 03:29:13.930761 2715 server.go:954] "Client rotation is on, will bootstrap in background" May 16 03:29:13.932368 kubelet[2715]: I0516 03:29:13.932317 2715 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 16 03:29:13.935263 kubelet[2715]: I0516 03:29:13.935194 2715 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 16 03:29:13.945949 kubelet[2715]: I0516 03:29:13.945903 2715 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 16 03:29:13.949683 kubelet[2715]: I0516 03:29:13.949652 2715 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 16 03:29:13.949930 kubelet[2715]: I0516 03:29:13.949897 2715 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 16 03:29:13.950199 kubelet[2715]: I0516 03:29:13.949926 2715 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-34cf5e3c62.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 16 03:29:13.950199 kubelet[2715]: I0516 03:29:13.950206 2715 topology_manager.go:138] "Creating topology manager with none policy" May 16 03:29:13.950902 kubelet[2715]: I0516 03:29:13.950219 2715 container_manager_linux.go:304] "Creating device plugin manager" May 16 03:29:13.950902 kubelet[2715]: I0516 03:29:13.950310 2715 state_mem.go:36] "Initialized new in-memory state store" May 16 03:29:13.950902 kubelet[2715]: I0516 03:29:13.950465 2715 kubelet.go:446] "Attempting to sync node with API server" May 16 03:29:13.950902 kubelet[2715]: I0516 03:29:13.950490 2715 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 16 03:29:13.950902 kubelet[2715]: I0516 03:29:13.950520 2715 kubelet.go:352] "Adding apiserver pod source" May 16 03:29:13.950902 kubelet[2715]: I0516 03:29:13.950543 2715 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 16 03:29:13.956039 kubelet[2715]: I0516 03:29:13.954085 2715 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 16 03:29:13.956039 kubelet[2715]: I0516 03:29:13.955211 2715 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 16 03:29:13.956777 kubelet[2715]: I0516 03:29:13.956738 2715 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 16 03:29:13.957074 kubelet[2715]: I0516 03:29:13.957041 2715 server.go:1287] "Started kubelet" May 16 03:29:13.972025 kubelet[2715]: I0516 03:29:13.970469 2715 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 16 03:29:13.978591 kubelet[2715]: I0516 03:29:13.977057 2715 
volume_manager.go:297] "Starting Kubelet Volume Manager" May 16 03:29:13.978591 kubelet[2715]: I0516 03:29:13.977366 2715 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 16 03:29:13.981205 kubelet[2715]: E0516 03:29:13.981170 2715 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" not found" May 16 03:29:13.981827 kubelet[2715]: I0516 03:29:13.981463 2715 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 16 03:29:13.981827 kubelet[2715]: I0516 03:29:13.981633 2715 reconciler.go:26] "Reconciler: start to sync state" May 16 03:29:13.981933 kubelet[2715]: E0516 03:29:13.981909 2715 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 16 03:29:13.984332 kubelet[2715]: I0516 03:29:13.984282 2715 server.go:479] "Adding debug handlers to kubelet server" May 16 03:29:13.986226 kubelet[2715]: I0516 03:29:13.986044 2715 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 16 03:29:13.986698 kubelet[2715]: I0516 03:29:13.986667 2715 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 16 03:29:13.989906 kubelet[2715]: I0516 03:29:13.988911 2715 factory.go:221] Registration of the systemd container factory successfully May 16 03:29:13.989906 kubelet[2715]: I0516 03:29:13.989075 2715 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 16 03:29:14.002893 kubelet[2715]: I0516 03:29:14.002850 2715 factory.go:221] Registration of the containerd container factory successfully May 16 03:29:14.010964 kubelet[2715]: I0516 03:29:14.006233 2715 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 16 03:29:14.018144 kubelet[2715]: I0516 03:29:14.017790 2715 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 16 03:29:14.019002 kubelet[2715]: I0516 03:29:14.018923 2715 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 16 03:29:14.019068 kubelet[2715]: I0516 03:29:14.019055 2715 status_manager.go:227] "Starting to sync pod status with apiserver" May 16 03:29:14.019108 kubelet[2715]: I0516 03:29:14.019089 2715 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
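The HardEvictionThresholds embedded in the container manager config logged a little above match the kubelet's documented hard-eviction defaults. Reshaped into the more familiar signal/threshold map (the evictionHard form of the kubelet configuration), they amount to the following; this is only a restatement of values already present in the log, not a tuning recommendation.

```python
# Threshold structs copied, in order, from the nodeConfig entry above.
hard_eviction_thresholds = [
    {"Signal": "imagefs.available", "Percentage": 0.15},
    {"Signal": "imagefs.inodesFree", "Percentage": 0.05},
    {"Signal": "memory.available", "Quantity": "100Mi"},
    {"Signal": "nodefs.available", "Percentage": 0.1},
    {"Signal": "nodefs.inodesFree", "Percentage": 0.05},
]

def as_eviction_hard(thresholds):
    """Render the logged threshold structs as an evictionHard-style map."""
    rendered = {}
    for t in thresholds:
        rendered[t["Signal"]] = t.get("Quantity") or f"{t['Percentage'] * 100:g}%"
    return rendered

print(as_eviction_hard(hard_eviction_thresholds))
# {'imagefs.available': '15%', 'imagefs.inodesFree': '5%', 'memory.available': '100Mi',
#  'nodefs.available': '10%', 'nodefs.inodesFree': '5%'}
```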
May 16 03:29:14.019108 kubelet[2715]: I0516 03:29:14.019098 2715 kubelet.go:2382] "Starting kubelet main sync loop" May 16 03:29:14.019202 kubelet[2715]: E0516 03:29:14.019166 2715 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 16 03:29:14.074325 kubelet[2715]: I0516 03:29:14.074287 2715 cpu_manager.go:221] "Starting CPU manager" policy="none" May 16 03:29:14.074325 kubelet[2715]: I0516 03:29:14.074307 2715 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 16 03:29:14.074516 kubelet[2715]: I0516 03:29:14.074343 2715 state_mem.go:36] "Initialized new in-memory state store" May 16 03:29:14.074557 kubelet[2715]: I0516 03:29:14.074521 2715 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 16 03:29:14.074557 kubelet[2715]: I0516 03:29:14.074541 2715 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 16 03:29:14.074625 kubelet[2715]: I0516 03:29:14.074571 2715 policy_none.go:49] "None policy: Start" May 16 03:29:14.074625 kubelet[2715]: I0516 03:29:14.074593 2715 memory_manager.go:186] "Starting memorymanager" policy="None" May 16 03:29:14.074625 kubelet[2715]: I0516 03:29:14.074613 2715 state_mem.go:35] "Initializing new in-memory state store" May 16 03:29:14.074764 kubelet[2715]: I0516 03:29:14.074735 2715 state_mem.go:75] "Updated machine memory state" May 16 03:29:14.079238 kubelet[2715]: I0516 03:29:14.079214 2715 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 16 03:29:14.079396 kubelet[2715]: I0516 03:29:14.079380 2715 eviction_manager.go:189] "Eviction manager: starting control loop" May 16 03:29:14.079441 kubelet[2715]: I0516 03:29:14.079401 2715 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 16 03:29:14.079932 kubelet[2715]: I0516 03:29:14.079908 2715 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 16 03:29:14.081035 kubelet[2715]: E0516 03:29:14.081013 2715 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 16 03:29:14.120474 kubelet[2715]: I0516 03:29:14.120361 2715 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.124561 kubelet[2715]: I0516 03:29:14.124307 2715 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.126742 kubelet[2715]: I0516 03:29:14.126718 2715 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.148465 kubelet[2715]: W0516 03:29:14.148428 2715 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 16 03:29:14.155263 kubelet[2715]: W0516 03:29:14.155231 2715 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 16 03:29:14.156875 kubelet[2715]: W0516 03:29:14.156804 2715 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 16 03:29:14.157141 kubelet[2715]: E0516 03:29:14.156886 2715 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal\" already exists" pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.184515 kubelet[2715]: I0516 03:29:14.183900 2715 kubelet_node_status.go:75] "Attempting to register node" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.200773 kubelet[2715]: I0516 03:29:14.200705 2715 kubelet_node_status.go:124] "Node was previously registered" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.201265 kubelet[2715]: I0516 03:29:14.201129 2715 kubelet_node_status.go:78] "Successfully registered node" node="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.283751 kubelet[2715]: I0516 03:29:14.283310 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1beeb9e74854c2eec2d4bac81830b4ff-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"1beeb9e74854c2eec2d4bac81830b4ff\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.283751 kubelet[2715]: I0516 03:29:14.283398 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/20aa906a118e6b0ed5f5319dedbd16e0-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"20aa906a118e6b0ed5f5319dedbd16e0\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.283751 kubelet[2715]: I0516 03:29:14.283466 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/20aa906a118e6b0ed5f5319dedbd16e0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"20aa906a118e6b0ed5f5319dedbd16e0\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.283751 kubelet[2715]: I0516 03:29:14.283507 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.284071 kubelet[2715]: I0516 03:29:14.283532 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.284071 kubelet[2715]: I0516 03:29:14.283637 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/20aa906a118e6b0ed5f5319dedbd16e0-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"20aa906a118e6b0ed5f5319dedbd16e0\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.284071 kubelet[2715]: I0516 03:29:14.283721 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.284071 kubelet[2715]: I0516 03:29:14.283797 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.284231 kubelet[2715]: I0516 03:29:14.283871 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e632c0389afc7e67a331931f852518d3-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal\" (UID: \"e632c0389afc7e67a331931f852518d3\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:14.951971 kubelet[2715]: I0516 03:29:14.951152 2715 apiserver.go:52] "Watching apiserver" May 16 03:29:14.982091 kubelet[2715]: I0516 03:29:14.982008 2715 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 16 03:29:15.062501 kubelet[2715]: I0516 03:29:15.061614 2715 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:15.083467 kubelet[2715]: W0516 03:29:15.083416 2715 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 16 03:29:15.085036 kubelet[2715]: E0516 03:29:15.083768 2715 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:29:15.116082 kubelet[2715]: I0516 03:29:15.115357 2715 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-n-34cf5e3c62.novalocal" podStartSLOduration=1.115254379 podStartE2EDuration="1.115254379s" podCreationTimestamp="2025-05-16 03:29:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 03:29:15.114777855 +0000 UTC m=+1.248051692" watchObservedRunningTime="2025-05-16 03:29:15.115254379 +0000 UTC m=+1.248528166" May 16 03:29:15.152172 kubelet[2715]: I0516 03:29:15.151161 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-n-34cf5e3c62.novalocal" podStartSLOduration=4.151125555 podStartE2EDuration="4.151125555s" podCreationTimestamp="2025-05-16 03:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 03:29:15.134019743 +0000 UTC m=+1.267293580" watchObservedRunningTime="2025-05-16 03:29:15.151125555 +0000 UTC m=+1.284399362" May 16 03:29:15.180029 kubelet[2715]: I0516 03:29:15.177889 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-34cf5e3c62.novalocal" podStartSLOduration=1.177855828 podStartE2EDuration="1.177855828s" podCreationTimestamp="2025-05-16 03:29:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 03:29:15.151739597 +0000 UTC m=+1.285013464" watchObservedRunningTime="2025-05-16 03:29:15.177855828 +0000 UTC m=+1.311129665" May 16 03:29:18.355647 kubelet[2715]: I0516 03:29:18.355546 2715 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 16 03:29:18.356760 containerd[1476]: time="2025-05-16T03:29:18.356517306Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 16 03:29:18.370667 kubelet[2715]: I0516 03:29:18.357128 2715 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 16 03:29:19.282281 systemd[1]: Created slice kubepods-besteffort-pod14b9100f_669a_456e_98d0_5a1199a8e6e7.slice - libcontainer container kubepods-besteffort-pod14b9100f_669a_456e_98d0_5a1199a8e6e7.slice. 
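One detail worth noting in the slice names: kubelet's systemd cgroup driver escapes the dashes in a pod UID to underscores, so the kubepods-besteffort-pod14b9100f_669a_456e_98d0_5a1199a8e6e7.slice created above corresponds to the pod UID 14b9100f-669a-456e-98d0-5a1199a8e6e7 seen in the volume entries that follow (the static pods' UIDs are dash-free config hashes, which is why their slice names carry no underscores). A small sketch of that mapping, assuming the besteffort QoS class shown here:

```python
def pod_slice_name(pod_uid: str, qos_class: str = "besteffort") -> str:
    """Build the systemd slice name used for a pod under the systemd cgroup driver.

    Dashes in the UID are escaped to underscores so the UID survives inside a
    single dash-delimited systemd unit name.
    """
    return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"

# UID taken from the kube-proxy volume entries in this log.
print(pod_slice_name("14b9100f-669a-456e-98d0-5a1199a8e6e7"))
# kubepods-besteffort-pod14b9100f_669a_456e_98d0_5a1199a8e6e7.slice
```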
May 16 03:29:19.323277 kubelet[2715]: I0516 03:29:19.323207 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/14b9100f-669a-456e-98d0-5a1199a8e6e7-xtables-lock\") pod \"kube-proxy-7zr5c\" (UID: \"14b9100f-669a-456e-98d0-5a1199a8e6e7\") " pod="kube-system/kube-proxy-7zr5c" May 16 03:29:19.323670 kubelet[2715]: I0516 03:29:19.323503 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14b9100f-669a-456e-98d0-5a1199a8e6e7-lib-modules\") pod \"kube-proxy-7zr5c\" (UID: \"14b9100f-669a-456e-98d0-5a1199a8e6e7\") " pod="kube-system/kube-proxy-7zr5c" May 16 03:29:19.323670 kubelet[2715]: I0516 03:29:19.323574 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/14b9100f-669a-456e-98d0-5a1199a8e6e7-kube-proxy\") pod \"kube-proxy-7zr5c\" (UID: \"14b9100f-669a-456e-98d0-5a1199a8e6e7\") " pod="kube-system/kube-proxy-7zr5c" May 16 03:29:19.323670 kubelet[2715]: I0516 03:29:19.323604 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46w5\" (UniqueName: \"kubernetes.io/projected/14b9100f-669a-456e-98d0-5a1199a8e6e7-kube-api-access-j46w5\") pod \"kube-proxy-7zr5c\" (UID: \"14b9100f-669a-456e-98d0-5a1199a8e6e7\") " pod="kube-system/kube-proxy-7zr5c" May 16 03:29:19.354045 systemd[1]: Created slice kubepods-besteffort-podc57ffc17_0dfd_415a_9df1_f8e5d9e75b2f.slice - libcontainer container kubepods-besteffort-podc57ffc17_0dfd_415a_9df1_f8e5d9e75b2f.slice. May 16 03:29:19.424712 kubelet[2715]: I0516 03:29:19.423973 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9t25\" (UniqueName: \"kubernetes.io/projected/c57ffc17-0dfd-415a-9df1-f8e5d9e75b2f-kube-api-access-v9t25\") pod \"tigera-operator-844669ff44-mvmq4\" (UID: \"c57ffc17-0dfd-415a-9df1-f8e5d9e75b2f\") " pod="tigera-operator/tigera-operator-844669ff44-mvmq4" May 16 03:29:19.424712 kubelet[2715]: I0516 03:29:19.424038 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c57ffc17-0dfd-415a-9df1-f8e5d9e75b2f-var-lib-calico\") pod \"tigera-operator-844669ff44-mvmq4\" (UID: \"c57ffc17-0dfd-415a-9df1-f8e5d9e75b2f\") " pod="tigera-operator/tigera-operator-844669ff44-mvmq4" May 16 03:29:19.598191 containerd[1476]: time="2025-05-16T03:29:19.597382018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7zr5c,Uid:14b9100f-669a-456e-98d0-5a1199a8e6e7,Namespace:kube-system,Attempt:0,}" May 16 03:29:19.657876 containerd[1476]: time="2025-05-16T03:29:19.657766449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-mvmq4,Uid:c57ffc17-0dfd-415a-9df1-f8e5d9e75b2f,Namespace:tigera-operator,Attempt:0,}" May 16 03:29:20.498763 containerd[1476]: time="2025-05-16T03:29:20.498702013Z" level=info msg="connecting to shim a7f8859a55b063a3acb7093d16a6a59d6b5c03b9605f67d514c845737d73b55e" address="unix:///run/containerd/s/b2c8daffce9c55598ec51bf5046f2394a832bc5109568876341cb1a2f0e5ca9a" namespace=k8s.io protocol=ttrpc version=3 May 16 03:29:20.504563 containerd[1476]: time="2025-05-16T03:29:20.504235401Z" level=info msg="connecting to shim 
f349462f6a0b9ef104fc1c8970945e4b47ee633b02eab79ef6023dd5333eed65" address="unix:///run/containerd/s/66e2823efcf4b73251e0eb73ba89490afaf3095ec0743132ee747f047256863d" namespace=k8s.io protocol=ttrpc version=3 May 16 03:29:20.558138 systemd[1]: Started cri-containerd-a7f8859a55b063a3acb7093d16a6a59d6b5c03b9605f67d514c845737d73b55e.scope - libcontainer container a7f8859a55b063a3acb7093d16a6a59d6b5c03b9605f67d514c845737d73b55e. May 16 03:29:20.560789 systemd[1]: Started cri-containerd-f349462f6a0b9ef104fc1c8970945e4b47ee633b02eab79ef6023dd5333eed65.scope - libcontainer container f349462f6a0b9ef104fc1c8970945e4b47ee633b02eab79ef6023dd5333eed65. May 16 03:29:20.605942 containerd[1476]: time="2025-05-16T03:29:20.605749611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7zr5c,Uid:14b9100f-669a-456e-98d0-5a1199a8e6e7,Namespace:kube-system,Attempt:0,} returns sandbox id \"a7f8859a55b063a3acb7093d16a6a59d6b5c03b9605f67d514c845737d73b55e\"" May 16 03:29:20.614023 containerd[1476]: time="2025-05-16T03:29:20.611693759Z" level=info msg="CreateContainer within sandbox \"a7f8859a55b063a3acb7093d16a6a59d6b5c03b9605f67d514c845737d73b55e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 16 03:29:20.640261 containerd[1476]: time="2025-05-16T03:29:20.640209230Z" level=info msg="Container 5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113: CDI devices from CRI Config.CDIDevices: []" May 16 03:29:20.641782 containerd[1476]: time="2025-05-16T03:29:20.641623232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-mvmq4,Uid:c57ffc17-0dfd-415a-9df1-f8e5d9e75b2f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f349462f6a0b9ef104fc1c8970945e4b47ee633b02eab79ef6023dd5333eed65\"" May 16 03:29:20.645385 containerd[1476]: time="2025-05-16T03:29:20.644881473Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 16 03:29:20.661342 containerd[1476]: time="2025-05-16T03:29:20.661287643Z" level=info msg="CreateContainer within sandbox \"a7f8859a55b063a3acb7093d16a6a59d6b5c03b9605f67d514c845737d73b55e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113\"" May 16 03:29:20.662788 containerd[1476]: time="2025-05-16T03:29:20.662197469Z" level=info msg="StartContainer for \"5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113\"" May 16 03:29:20.665834 containerd[1476]: time="2025-05-16T03:29:20.665764379Z" level=info msg="connecting to shim 5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113" address="unix:///run/containerd/s/b2c8daffce9c55598ec51bf5046f2394a832bc5109568876341cb1a2f0e5ca9a" protocol=ttrpc version=3 May 16 03:29:20.689125 systemd[1]: Started cri-containerd-5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113.scope - libcontainer container 5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113. 
May 16 03:29:20.740288 containerd[1476]: time="2025-05-16T03:29:20.740236301Z" level=info msg="StartContainer for \"5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113\" returns successfully" May 16 03:29:21.102577 kubelet[2715]: I0516 03:29:21.101132 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7zr5c" podStartSLOduration=2.101080985 podStartE2EDuration="2.101080985s" podCreationTimestamp="2025-05-16 03:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 03:29:21.100792343 +0000 UTC m=+7.234066101" watchObservedRunningTime="2025-05-16 03:29:21.101080985 +0000 UTC m=+7.234354782" May 16 03:29:22.880626 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount888310426.mount: Deactivated successfully. May 16 03:29:23.816521 containerd[1476]: time="2025-05-16T03:29:23.816423731Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:23.818557 containerd[1476]: time="2025-05-16T03:29:23.818493162Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451" May 16 03:29:23.820501 containerd[1476]: time="2025-05-16T03:29:23.820461513Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:23.823212 containerd[1476]: time="2025-05-16T03:29:23.823151186Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:23.823885 containerd[1476]: time="2025-05-16T03:29:23.823848755Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 3.178915455s" May 16 03:29:23.823942 containerd[1476]: time="2025-05-16T03:29:23.823885044Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\"" May 16 03:29:23.828853 containerd[1476]: time="2025-05-16T03:29:23.827671098Z" level=info msg="CreateContainer within sandbox \"f349462f6a0b9ef104fc1c8970945e4b47ee633b02eab79ef6023dd5333eed65\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 16 03:29:23.840052 containerd[1476]: time="2025-05-16T03:29:23.839660870Z" level=info msg="Container ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b: CDI devices from CRI Config.CDIDevices: []" May 16 03:29:23.845102 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2719865742.mount: Deactivated successfully. 
May 16 03:29:23.855791 containerd[1476]: time="2025-05-16T03:29:23.855755939Z" level=info msg="CreateContainer within sandbox \"f349462f6a0b9ef104fc1c8970945e4b47ee633b02eab79ef6023dd5333eed65\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b\"" May 16 03:29:23.862632 containerd[1476]: time="2025-05-16T03:29:23.862497721Z" level=info msg="StartContainer for \"ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b\"" May 16 03:29:23.867938 containerd[1476]: time="2025-05-16T03:29:23.866793489Z" level=info msg="connecting to shim ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b" address="unix:///run/containerd/s/66e2823efcf4b73251e0eb73ba89490afaf3095ec0743132ee747f047256863d" protocol=ttrpc version=3 May 16 03:29:23.894184 systemd[1]: Started cri-containerd-ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b.scope - libcontainer container ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b. May 16 03:29:23.933097 containerd[1476]: time="2025-05-16T03:29:23.933020447Z" level=info msg="StartContainer for \"ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b\" returns successfully" May 16 03:29:31.285739 sudo[1755]: pam_unix(sudo:session): session closed for user root May 16 03:29:31.493019 sshd[1754]: Connection closed by 172.24.4.1 port 47688 May 16 03:29:31.493064 sshd-session[1751]: pam_unix(sshd:session): session closed for user core May 16 03:29:31.501745 systemd[1]: sshd@8-172.24.4.18:22-172.24.4.1:47688.service: Deactivated successfully. May 16 03:29:31.508827 systemd[1]: session-11.scope: Deactivated successfully. May 16 03:29:31.511253 systemd[1]: session-11.scope: Consumed 7.265s CPU time, 225M memory peak. May 16 03:29:31.516880 systemd-logind[1458]: Session 11 logged out. Waiting for processes to exit. May 16 03:29:31.520696 systemd-logind[1458]: Removed session 11. May 16 03:29:35.708522 kubelet[2715]: I0516 03:29:35.708332 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-mvmq4" podStartSLOduration=13.5260354 podStartE2EDuration="16.708087847s" podCreationTimestamp="2025-05-16 03:29:19 +0000 UTC" firstStartedPulling="2025-05-16 03:29:20.643065537 +0000 UTC m=+6.776339284" lastFinishedPulling="2025-05-16 03:29:23.825117984 +0000 UTC m=+9.958391731" observedRunningTime="2025-05-16 03:29:24.162353699 +0000 UTC m=+10.295627536" watchObservedRunningTime="2025-05-16 03:29:35.708087847 +0000 UTC m=+21.841361624" May 16 03:29:35.732613 systemd[1]: Created slice kubepods-besteffort-pod586c223d_d8b6_4715_a7c0_caa15b6abbf7.slice - libcontainer container kubepods-besteffort-pod586c223d_d8b6_4715_a7c0_caa15b6abbf7.slice. 
May 16 03:29:35.737411 kubelet[2715]: I0516 03:29:35.736762 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvhn\" (UniqueName: \"kubernetes.io/projected/586c223d-d8b6-4715-a7c0-caa15b6abbf7-kube-api-access-tjvhn\") pod \"calico-typha-7bc674f944-hvxhk\" (UID: \"586c223d-d8b6-4715-a7c0-caa15b6abbf7\") " pod="calico-system/calico-typha-7bc674f944-hvxhk" May 16 03:29:35.737411 kubelet[2715]: I0516 03:29:35.736867 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/586c223d-d8b6-4715-a7c0-caa15b6abbf7-tigera-ca-bundle\") pod \"calico-typha-7bc674f944-hvxhk\" (UID: \"586c223d-d8b6-4715-a7c0-caa15b6abbf7\") " pod="calico-system/calico-typha-7bc674f944-hvxhk" May 16 03:29:35.737411 kubelet[2715]: I0516 03:29:35.737071 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/586c223d-d8b6-4715-a7c0-caa15b6abbf7-typha-certs\") pod \"calico-typha-7bc674f944-hvxhk\" (UID: \"586c223d-d8b6-4715-a7c0-caa15b6abbf7\") " pod="calico-system/calico-typha-7bc674f944-hvxhk" May 16 03:29:36.044285 containerd[1476]: time="2025-05-16T03:29:36.043257188Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bc674f944-hvxhk,Uid:586c223d-d8b6-4715-a7c0-caa15b6abbf7,Namespace:calico-system,Attempt:0,}" May 16 03:29:36.098640 containerd[1476]: time="2025-05-16T03:29:36.098581495Z" level=info msg="connecting to shim f6795a17a1c56d2f50a2a41bda7254bc663a9503616f8c37f30a2bdff32d2a55" address="unix:///run/containerd/s/9732f3806d6417a9e6c00412a979c224d8b7ad845bfd0735702a2f7c2d8c9f66" namespace=k8s.io protocol=ttrpc version=3 May 16 03:29:36.101843 systemd[1]: Created slice kubepods-besteffort-podcfd7b224_cdc9_4a90_8c0a_d63235575388.slice - libcontainer container kubepods-besteffort-podcfd7b224_cdc9_4a90_8c0a_d63235575388.slice. 
May 16 03:29:36.139725 kubelet[2715]: I0516 03:29:36.139642 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-cni-bin-dir\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.139725 kubelet[2715]: I0516 03:29:36.139727 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-cni-net-dir\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.139972 kubelet[2715]: I0516 03:29:36.139759 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75kj\" (UniqueName: \"kubernetes.io/projected/cfd7b224-cdc9-4a90-8c0a-d63235575388-kube-api-access-j75kj\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.139972 kubelet[2715]: I0516 03:29:36.139798 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd7b224-cdc9-4a90-8c0a-d63235575388-tigera-ca-bundle\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.139972 kubelet[2715]: I0516 03:29:36.139825 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-var-run-calico\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.139972 kubelet[2715]: I0516 03:29:36.139854 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-policysync\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.139972 kubelet[2715]: I0516 03:29:36.139887 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-lib-modules\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.140295 kubelet[2715]: I0516 03:29:36.139910 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-xtables-lock\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.140295 kubelet[2715]: I0516 03:29:36.139956 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-flexvol-driver-host\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.140295 kubelet[2715]: I0516 03:29:36.140024 2715 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-var-lib-calico\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.140295 kubelet[2715]: I0516 03:29:36.140056 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/cfd7b224-cdc9-4a90-8c0a-d63235575388-cni-log-dir\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.140295 kubelet[2715]: I0516 03:29:36.140089 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/cfd7b224-cdc9-4a90-8c0a-d63235575388-node-certs\") pod \"calico-node-rjbd9\" (UID: \"cfd7b224-cdc9-4a90-8c0a-d63235575388\") " pod="calico-system/calico-node-rjbd9" May 16 03:29:36.152399 systemd[1]: Started cri-containerd-f6795a17a1c56d2f50a2a41bda7254bc663a9503616f8c37f30a2bdff32d2a55.scope - libcontainer container f6795a17a1c56d2f50a2a41bda7254bc663a9503616f8c37f30a2bdff32d2a55. May 16 03:29:36.251849 kubelet[2715]: E0516 03:29:36.250161 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.251849 kubelet[2715]: W0516 03:29:36.250223 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.251849 kubelet[2715]: E0516 03:29:36.250553 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.251849 kubelet[2715]: W0516 03:29:36.250564 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.251849 kubelet[2715]: E0516 03:29:36.250641 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.251849 kubelet[2715]: E0516 03:29:36.250693 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.265945 kubelet[2715]: E0516 03:29:36.265858 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.265945 kubelet[2715]: W0516 03:29:36.265891 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.266269 kubelet[2715]: E0516 03:29:36.265969 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.268287 containerd[1476]: time="2025-05-16T03:29:36.267802037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bc674f944-hvxhk,Uid:586c223d-d8b6-4715-a7c0-caa15b6abbf7,Namespace:calico-system,Attempt:0,} returns sandbox id \"f6795a17a1c56d2f50a2a41bda7254bc663a9503616f8c37f30a2bdff32d2a55\"" May 16 03:29:36.272532 containerd[1476]: time="2025-05-16T03:29:36.272324116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 16 03:29:36.388914 kubelet[2715]: E0516 03:29:36.388572 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:36.423906 kubelet[2715]: E0516 03:29:36.423265 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.423906 kubelet[2715]: W0516 03:29:36.423764 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.423906 kubelet[2715]: E0516 03:29:36.423793 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.424833 kubelet[2715]: E0516 03:29:36.424695 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.424833 kubelet[2715]: W0516 03:29:36.424711 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.424833 kubelet[2715]: E0516 03:29:36.424725 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.425766 kubelet[2715]: E0516 03:29:36.425503 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.425766 kubelet[2715]: W0516 03:29:36.425518 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.426532 kubelet[2715]: E0516 03:29:36.426196 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.429015 kubelet[2715]: E0516 03:29:36.428758 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.429015 kubelet[2715]: W0516 03:29:36.428783 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.430366 kubelet[2715]: E0516 03:29:36.428802 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.431111 kubelet[2715]: E0516 03:29:36.431020 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.431111 kubelet[2715]: W0516 03:29:36.431039 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.431111 kubelet[2715]: E0516 03:29:36.431056 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.432031 kubelet[2715]: E0516 03:29:36.431786 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.432031 kubelet[2715]: W0516 03:29:36.431800 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.432031 kubelet[2715]: E0516 03:29:36.431814 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.433152 kubelet[2715]: E0516 03:29:36.432885 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.433152 kubelet[2715]: W0516 03:29:36.433064 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.433152 kubelet[2715]: E0516 03:29:36.433084 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.434180 kubelet[2715]: E0516 03:29:36.433880 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.434180 kubelet[2715]: W0516 03:29:36.433893 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.434180 kubelet[2715]: E0516 03:29:36.433905 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.435305 kubelet[2715]: E0516 03:29:36.435076 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.435305 kubelet[2715]: W0516 03:29:36.435091 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.435305 kubelet[2715]: E0516 03:29:36.435119 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.436775 kubelet[2715]: E0516 03:29:36.435745 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.436775 kubelet[2715]: W0516 03:29:36.435760 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.436775 kubelet[2715]: E0516 03:29:36.436341 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.437145 containerd[1476]: time="2025-05-16T03:29:36.435969236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rjbd9,Uid:cfd7b224-cdc9-4a90-8c0a-d63235575388,Namespace:calico-system,Attempt:0,}" May 16 03:29:36.437341 kubelet[2715]: E0516 03:29:36.437222 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.437341 kubelet[2715]: W0516 03:29:36.437234 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.437595 kubelet[2715]: E0516 03:29:36.437478 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.438104 kubelet[2715]: E0516 03:29:36.438062 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.438402 kubelet[2715]: W0516 03:29:36.438258 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.438402 kubelet[2715]: E0516 03:29:36.438295 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.438954 kubelet[2715]: E0516 03:29:36.438811 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.438954 kubelet[2715]: W0516 03:29:36.438825 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.438954 kubelet[2715]: E0516 03:29:36.438868 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.439882 kubelet[2715]: E0516 03:29:36.439741 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.439882 kubelet[2715]: W0516 03:29:36.439757 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.439882 kubelet[2715]: E0516 03:29:36.439771 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.440395 kubelet[2715]: E0516 03:29:36.440372 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.440395 kubelet[2715]: W0516 03:29:36.440387 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.440630 kubelet[2715]: E0516 03:29:36.440399 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.440630 kubelet[2715]: E0516 03:29:36.440581 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.440630 kubelet[2715]: W0516 03:29:36.440591 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.440630 kubelet[2715]: E0516 03:29:36.440601 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.445610 kubelet[2715]: E0516 03:29:36.440775 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.445610 kubelet[2715]: W0516 03:29:36.440786 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.445610 kubelet[2715]: E0516 03:29:36.440795 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.445610 kubelet[2715]: E0516 03:29:36.440950 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.445610 kubelet[2715]: W0516 03:29:36.440961 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.445610 kubelet[2715]: E0516 03:29:36.440971 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.445610 kubelet[2715]: E0516 03:29:36.441200 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.445610 kubelet[2715]: W0516 03:29:36.441210 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.445610 kubelet[2715]: E0516 03:29:36.441221 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.445610 kubelet[2715]: E0516 03:29:36.441438 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.446159 kubelet[2715]: W0516 03:29:36.441448 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.446159 kubelet[2715]: E0516 03:29:36.441458 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.446159 kubelet[2715]: E0516 03:29:36.442795 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.446159 kubelet[2715]: W0516 03:29:36.442821 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.446159 kubelet[2715]: E0516 03:29:36.442843 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.446159 kubelet[2715]: I0516 03:29:36.442926 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec9adb4c-4eda-47e3-8ebd-e314c0ec0140-kubelet-dir\") pod \"csi-node-driver-5gjgv\" (UID: \"ec9adb4c-4eda-47e3-8ebd-e314c0ec0140\") " pod="calico-system/csi-node-driver-5gjgv" May 16 03:29:36.446159 kubelet[2715]: E0516 03:29:36.443294 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.446159 kubelet[2715]: W0516 03:29:36.443321 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.446159 kubelet[2715]: E0516 03:29:36.443341 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.447236 kubelet[2715]: I0516 03:29:36.443361 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ec9adb4c-4eda-47e3-8ebd-e314c0ec0140-socket-dir\") pod \"csi-node-driver-5gjgv\" (UID: \"ec9adb4c-4eda-47e3-8ebd-e314c0ec0140\") " pod="calico-system/csi-node-driver-5gjgv" May 16 03:29:36.447236 kubelet[2715]: E0516 03:29:36.443599 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.447236 kubelet[2715]: W0516 03:29:36.443611 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.447236 kubelet[2715]: E0516 03:29:36.443629 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.447236 kubelet[2715]: I0516 03:29:36.443681 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ec9adb4c-4eda-47e3-8ebd-e314c0ec0140-varrun\") pod \"csi-node-driver-5gjgv\" (UID: \"ec9adb4c-4eda-47e3-8ebd-e314c0ec0140\") " pod="calico-system/csi-node-driver-5gjgv" May 16 03:29:36.447236 kubelet[2715]: E0516 03:29:36.443884 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.447236 kubelet[2715]: W0516 03:29:36.443896 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.447236 kubelet[2715]: E0516 03:29:36.443914 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.447236 kubelet[2715]: E0516 03:29:36.444133 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.447648 kubelet[2715]: W0516 03:29:36.444144 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.447648 kubelet[2715]: E0516 03:29:36.444163 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.447648 kubelet[2715]: E0516 03:29:36.444361 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.447648 kubelet[2715]: W0516 03:29:36.444371 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.447648 kubelet[2715]: E0516 03:29:36.444389 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.447648 kubelet[2715]: E0516 03:29:36.444696 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.447648 kubelet[2715]: W0516 03:29:36.444706 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.447648 kubelet[2715]: E0516 03:29:36.444716 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.447648 kubelet[2715]: E0516 03:29:36.444883 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.447648 kubelet[2715]: W0516 03:29:36.444894 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.448369 kubelet[2715]: E0516 03:29:36.444904 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.448369 kubelet[2715]: I0516 03:29:36.444931 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxzrn\" (UniqueName: \"kubernetes.io/projected/ec9adb4c-4eda-47e3-8ebd-e314c0ec0140-kube-api-access-dxzrn\") pod \"csi-node-driver-5gjgv\" (UID: \"ec9adb4c-4eda-47e3-8ebd-e314c0ec0140\") " pod="calico-system/csi-node-driver-5gjgv" May 16 03:29:36.448369 kubelet[2715]: E0516 03:29:36.445142 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.448369 kubelet[2715]: W0516 03:29:36.445154 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.448369 kubelet[2715]: E0516 03:29:36.445317 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.448369 kubelet[2715]: W0516 03:29:36.445327 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.448369 kubelet[2715]: E0516 03:29:36.445337 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.448369 kubelet[2715]: E0516 03:29:36.445597 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.448369 kubelet[2715]: W0516 03:29:36.445611 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.448729 kubelet[2715]: E0516 03:29:36.445627 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.448729 kubelet[2715]: E0516 03:29:36.446265 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.448729 kubelet[2715]: W0516 03:29:36.446276 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.448729 kubelet[2715]: E0516 03:29:36.446297 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.448729 kubelet[2715]: E0516 03:29:36.446502 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.448729 kubelet[2715]: W0516 03:29:36.446513 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.448729 kubelet[2715]: E0516 03:29:36.446526 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.448729 kubelet[2715]: E0516 03:29:36.446541 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.448729 kubelet[2715]: I0516 03:29:36.446559 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ec9adb4c-4eda-47e3-8ebd-e314c0ec0140-registration-dir\") pod \"csi-node-driver-5gjgv\" (UID: \"ec9adb4c-4eda-47e3-8ebd-e314c0ec0140\") " pod="calico-system/csi-node-driver-5gjgv" May 16 03:29:36.448729 kubelet[2715]: E0516 03:29:36.446757 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.449198 kubelet[2715]: W0516 03:29:36.446770 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.449198 kubelet[2715]: E0516 03:29:36.446780 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.449198 kubelet[2715]: E0516 03:29:36.446924 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.449198 kubelet[2715]: W0516 03:29:36.446934 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.449198 kubelet[2715]: E0516 03:29:36.446945 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.478612 containerd[1476]: time="2025-05-16T03:29:36.478465094Z" level=info msg="connecting to shim 9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994" address="unix:///run/containerd/s/ece97cd8b27fba53a9b5cfc543dcdc139306b163de31f77d33722dd7c2566970" namespace=k8s.io protocol=ttrpc version=3 May 16 03:29:36.514313 systemd[1]: Started cri-containerd-9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994.scope - libcontainer container 9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994. 
May 16 03:29:36.547770 kubelet[2715]: E0516 03:29:36.547720 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.548182 kubelet[2715]: W0516 03:29:36.547951 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.548182 kubelet[2715]: E0516 03:29:36.548015 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.548823 kubelet[2715]: E0516 03:29:36.548757 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.548823 kubelet[2715]: W0516 03:29:36.548775 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.548823 kubelet[2715]: E0516 03:29:36.548787 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.550299 kubelet[2715]: E0516 03:29:36.550158 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.550299 kubelet[2715]: W0516 03:29:36.550173 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.550299 kubelet[2715]: E0516 03:29:36.550184 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.551201 kubelet[2715]: E0516 03:29:36.550887 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.551201 kubelet[2715]: W0516 03:29:36.550913 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.551201 kubelet[2715]: E0516 03:29:36.550936 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.551554 kubelet[2715]: E0516 03:29:36.551247 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.551554 kubelet[2715]: W0516 03:29:36.551259 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.551554 kubelet[2715]: E0516 03:29:36.551270 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.551554 kubelet[2715]: E0516 03:29:36.551547 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.551689 kubelet[2715]: W0516 03:29:36.551563 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.551689 kubelet[2715]: E0516 03:29:36.551578 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.551835 kubelet[2715]: E0516 03:29:36.551811 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.551835 kubelet[2715]: W0516 03:29:36.551826 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.551835 kubelet[2715]: E0516 03:29:36.551855 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.552257 kubelet[2715]: E0516 03:29:36.552048 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.552257 kubelet[2715]: W0516 03:29:36.552063 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.552257 kubelet[2715]: E0516 03:29:36.552129 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.552257 kubelet[2715]: E0516 03:29:36.552253 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.552257 kubelet[2715]: W0516 03:29:36.552265 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.552776 kubelet[2715]: E0516 03:29:36.552345 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.552776 kubelet[2715]: E0516 03:29:36.552456 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.552776 kubelet[2715]: W0516 03:29:36.552466 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.552776 kubelet[2715]: E0516 03:29:36.552520 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.552776 kubelet[2715]: E0516 03:29:36.552627 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.552776 kubelet[2715]: W0516 03:29:36.552641 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.552776 kubelet[2715]: E0516 03:29:36.552661 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.553200 kubelet[2715]: E0516 03:29:36.552921 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.553200 kubelet[2715]: W0516 03:29:36.552938 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.553200 kubelet[2715]: E0516 03:29:36.552956 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.553364 kubelet[2715]: E0516 03:29:36.553286 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.553364 kubelet[2715]: W0516 03:29:36.553301 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.553364 kubelet[2715]: E0516 03:29:36.553325 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.554598 kubelet[2715]: E0516 03:29:36.554572 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.554598 kubelet[2715]: W0516 03:29:36.554592 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.554839 kubelet[2715]: E0516 03:29:36.554668 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.554930 kubelet[2715]: E0516 03:29:36.554876 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.554930 kubelet[2715]: W0516 03:29:36.554892 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.555195 kubelet[2715]: E0516 03:29:36.555107 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.555195 kubelet[2715]: W0516 03:29:36.555132 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.555195 kubelet[2715]: E0516 03:29:36.555109 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.555195 kubelet[2715]: E0516 03:29:36.555165 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.555533 kubelet[2715]: E0516 03:29:36.555389 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.555533 kubelet[2715]: W0516 03:29:36.555401 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.555533 kubelet[2715]: E0516 03:29:36.555436 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.556046 kubelet[2715]: E0516 03:29:36.556025 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.556046 kubelet[2715]: W0516 03:29:36.556041 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.556407 kubelet[2715]: E0516 03:29:36.556064 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.557484 kubelet[2715]: E0516 03:29:36.557053 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.557484 kubelet[2715]: W0516 03:29:36.557072 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.557484 kubelet[2715]: E0516 03:29:36.557094 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.558019 kubelet[2715]: E0516 03:29:36.557871 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.558019 kubelet[2715]: W0516 03:29:36.557885 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.558019 kubelet[2715]: E0516 03:29:36.557953 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.558564 kubelet[2715]: E0516 03:29:36.558455 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.558564 kubelet[2715]: W0516 03:29:36.558469 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.559045 kubelet[2715]: E0516 03:29:36.558889 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.559545 kubelet[2715]: E0516 03:29:36.559383 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.559545 kubelet[2715]: W0516 03:29:36.559395 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.559545 kubelet[2715]: E0516 03:29:36.559460 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.560615 kubelet[2715]: E0516 03:29:36.560366 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.560615 kubelet[2715]: W0516 03:29:36.560382 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.560835 kubelet[2715]: E0516 03:29:36.560617 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.561409 kubelet[2715]: E0516 03:29:36.561064 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.561409 kubelet[2715]: W0516 03:29:36.561082 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.562117 kubelet[2715]: E0516 03:29:36.561566 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:36.562293 kubelet[2715]: E0516 03:29:36.562220 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.562293 kubelet[2715]: W0516 03:29:36.562237 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.562293 kubelet[2715]: E0516 03:29:36.562251 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:36.570199 containerd[1476]: time="2025-05-16T03:29:36.570064612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rjbd9,Uid:cfd7b224-cdc9-4a90-8c0a-d63235575388,Namespace:calico-system,Attempt:0,} returns sandbox id \"9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994\"" May 16 03:29:36.570347 kubelet[2715]: E0516 03:29:36.570148 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:36.570347 kubelet[2715]: W0516 03:29:36.570172 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:36.570347 kubelet[2715]: E0516 03:29:36.570236 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:38.024800 kubelet[2715]: E0516 03:29:38.020486 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:38.462734 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1671740907.mount: Deactivated successfully. 
May 16 03:29:39.817947 containerd[1476]: time="2025-05-16T03:29:39.817889399Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 03:29:39.819944 containerd[1476]: time="2025-05-16T03:29:39.819846220Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669"
May 16 03:29:39.821927 containerd[1476]: time="2025-05-16T03:29:39.821891596Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 03:29:39.825558 containerd[1476]: time="2025-05-16T03:29:39.825523216Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 03:29:39.826323 containerd[1476]: time="2025-05-16T03:29:39.826288565Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 3.553904987s"
May 16 03:29:39.826425 containerd[1476]: time="2025-05-16T03:29:39.826407508Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\""
May 16 03:29:39.828309 containerd[1476]: time="2025-05-16T03:29:39.828237310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\""
May 16 03:29:39.845311 containerd[1476]: time="2025-05-16T03:29:39.845270654Z" level=info msg="CreateContainer within sandbox \"f6795a17a1c56d2f50a2a41bda7254bc663a9503616f8c37f30a2bdff32d2a55\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
May 16 03:29:39.864248 containerd[1476]: time="2025-05-16T03:29:39.864206957Z" level=info msg="Container a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d: CDI devices from CRI Config.CDIDevices: []"
May 16 03:29:39.868584 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3049123744.mount: Deactivated successfully.
May 16 03:29:39.884006 containerd[1476]: time="2025-05-16T03:29:39.883500581Z" level=info msg="CreateContainer within sandbox \"f6795a17a1c56d2f50a2a41bda7254bc663a9503616f8c37f30a2bdff32d2a55\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d\""
May 16 03:29:39.887632 containerd[1476]: time="2025-05-16T03:29:39.886295939Z" level=info msg="StartContainer for \"a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d\""
May 16 03:29:39.888462 containerd[1476]: time="2025-05-16T03:29:39.888392943Z" level=info msg="connecting to shim a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d" address="unix:///run/containerd/s/9732f3806d6417a9e6c00412a979c224d8b7ad845bfd0735702a2f7c2d8c9f66" protocol=ttrpc version=3
May 16 03:29:39.916406 systemd[1]: Started cri-containerd-a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d.scope - libcontainer container a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d.
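The sequence above is the normal CRI path for starting calico-typha: containerd finishes pulling ghcr.io/flatcar/calico/typha:v3.30.0 (reporting the resolved image ID and repo digest), the kubelet requests a container inside the already-created pod sandbox, and containerd starts a per-container shim reachable over a ttrpc unix socket, which systemd tracks as a cri-containerd-<id>.scope unit. For reference, the pull step can be reproduced directly against the same containerd socket; the sketch below assumes the containerd v1 Go client and the "k8s.io" namespace the CRI plugin uses (file name illustrative).

    // pull_typha_sketch.go: a sketch of the PullImage step above, issued with the
    // containerd Go client instead of going through the kubelet/CRI.
    package main

    import (
        "context"
        "log"

        containerd "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // The CRI plugin keeps Kubernetes images in the "k8s.io" namespace.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/typha:v3.30.0", containerd.WithPullUnpack)
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("pulled %s (digest %s)", img.Name(), img.Target().Digest)
    }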
May 16 03:29:39.972125 containerd[1476]: time="2025-05-16T03:29:39.972085399Z" level=info msg="StartContainer for \"a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d\" returns successfully" May 16 03:29:40.020941 kubelet[2715]: E0516 03:29:40.020365 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:40.204349 kubelet[2715]: I0516 03:29:40.204162 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7bc674f944-hvxhk" podStartSLOduration=1.647642715 podStartE2EDuration="5.204144027s" podCreationTimestamp="2025-05-16 03:29:35 +0000 UTC" firstStartedPulling="2025-05-16 03:29:36.270835104 +0000 UTC m=+22.404108861" lastFinishedPulling="2025-05-16 03:29:39.827336426 +0000 UTC m=+25.960610173" observedRunningTime="2025-05-16 03:29:40.202623377 +0000 UTC m=+26.335897154" watchObservedRunningTime="2025-05-16 03:29:40.204144027 +0000 UTC m=+26.337417774" May 16 03:29:40.274102 kubelet[2715]: E0516 03:29:40.273793 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.274102 kubelet[2715]: W0516 03:29:40.273822 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.274102 kubelet[2715]: E0516 03:29:40.273877 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.275497 kubelet[2715]: E0516 03:29:40.275280 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.275497 kubelet[2715]: W0516 03:29:40.275317 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.275497 kubelet[2715]: E0516 03:29:40.275338 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.275873 kubelet[2715]: E0516 03:29:40.275769 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.275873 kubelet[2715]: W0516 03:29:40.275784 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.275873 kubelet[2715]: E0516 03:29:40.275796 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:40.277122 kubelet[2715]: E0516 03:29:40.277055 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.277122 kubelet[2715]: W0516 03:29:40.277073 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.277122 kubelet[2715]: E0516 03:29:40.277085 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.277666 kubelet[2715]: E0516 03:29:40.277571 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.277666 kubelet[2715]: W0516 03:29:40.277605 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.277666 kubelet[2715]: E0516 03:29:40.277619 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.278198 kubelet[2715]: E0516 03:29:40.278037 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.278198 kubelet[2715]: W0516 03:29:40.278051 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.278198 kubelet[2715]: E0516 03:29:40.278062 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.278684 kubelet[2715]: E0516 03:29:40.278571 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.278684 kubelet[2715]: W0516 03:29:40.278585 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.278684 kubelet[2715]: E0516 03:29:40.278597 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.279080 kubelet[2715]: E0516 03:29:40.279022 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.279080 kubelet[2715]: W0516 03:29:40.279034 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.279406 kubelet[2715]: E0516 03:29:40.279250 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:40.280300 kubelet[2715]: E0516 03:29:40.280062 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.280300 kubelet[2715]: W0516 03:29:40.280079 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.280300 kubelet[2715]: E0516 03:29:40.280091 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.281342 kubelet[2715]: E0516 03:29:40.281221 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.281342 kubelet[2715]: W0516 03:29:40.281265 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.281342 kubelet[2715]: E0516 03:29:40.281280 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.281816 kubelet[2715]: E0516 03:29:40.281689 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.281816 kubelet[2715]: W0516 03:29:40.281703 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.281816 kubelet[2715]: E0516 03:29:40.281713 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.282542 kubelet[2715]: E0516 03:29:40.282306 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.282542 kubelet[2715]: W0516 03:29:40.282320 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.282542 kubelet[2715]: E0516 03:29:40.282330 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.283055 kubelet[2715]: E0516 03:29:40.282812 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.283055 kubelet[2715]: W0516 03:29:40.282824 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.283055 kubelet[2715]: E0516 03:29:40.282850 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:40.284157 kubelet[2715]: E0516 03:29:40.284082 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.284157 kubelet[2715]: W0516 03:29:40.284097 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.284157 kubelet[2715]: E0516 03:29:40.284110 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.284727 kubelet[2715]: E0516 03:29:40.284538 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.284727 kubelet[2715]: W0516 03:29:40.284551 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.284727 kubelet[2715]: E0516 03:29:40.284572 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.285267 kubelet[2715]: E0516 03:29:40.285119 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.285267 kubelet[2715]: W0516 03:29:40.285133 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.285267 kubelet[2715]: E0516 03:29:40.285145 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.285569 kubelet[2715]: E0516 03:29:40.285465 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.285569 kubelet[2715]: W0516 03:29:40.285479 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.285569 kubelet[2715]: E0516 03:29:40.285507 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.286214 kubelet[2715]: E0516 03:29:40.285916 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.286214 kubelet[2715]: W0516 03:29:40.285929 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.286214 kubelet[2715]: E0516 03:29:40.285948 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:40.286373 kubelet[2715]: E0516 03:29:40.286314 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.286373 kubelet[2715]: W0516 03:29:40.286340 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.286449 kubelet[2715]: E0516 03:29:40.286363 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.286782 kubelet[2715]: E0516 03:29:40.286641 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.286782 kubelet[2715]: W0516 03:29:40.286658 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.286782 kubelet[2715]: E0516 03:29:40.286671 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.287397 kubelet[2715]: E0516 03:29:40.286844 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.287397 kubelet[2715]: W0516 03:29:40.286854 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.287397 kubelet[2715]: E0516 03:29:40.286871 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.287397 kubelet[2715]: E0516 03:29:40.287187 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.287397 kubelet[2715]: W0516 03:29:40.287202 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.287397 kubelet[2715]: E0516 03:29:40.287333 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.287938 kubelet[2715]: E0516 03:29:40.287463 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.287938 kubelet[2715]: W0516 03:29:40.287725 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.287938 kubelet[2715]: E0516 03:29:40.287739 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:40.289153 kubelet[2715]: E0516 03:29:40.288106 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.289153 kubelet[2715]: W0516 03:29:40.288117 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.289153 kubelet[2715]: E0516 03:29:40.288134 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.289489 kubelet[2715]: E0516 03:29:40.289476 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.289709 kubelet[2715]: W0516 03:29:40.289576 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.289709 kubelet[2715]: E0516 03:29:40.289619 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.290152 kubelet[2715]: E0516 03:29:40.290046 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.290246 kubelet[2715]: W0516 03:29:40.290231 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.291018 kubelet[2715]: E0516 03:29:40.290481 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:40.291135 kubelet[2715]: E0516 03:29:40.291119 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.291676 kubelet[2715]: W0516 03:29:40.291342 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.291805 kubelet[2715]: E0516 03:29:40.291791 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.292229 kubelet[2715]: W0516 03:29:40.291871 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.294020 kubelet[2715]: E0516 03:29:40.292743 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.294020 kubelet[2715]: W0516 03:29:40.292767 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.294020 kubelet[2715]: E0516 03:29:40.292785 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.294020 kubelet[2715]: E0516 03:29:40.292829 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.296218 kubelet[2715]: E0516 03:29:40.295407 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.296218 kubelet[2715]: W0516 03:29:40.295423 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.296218 kubelet[2715]: E0516 03:29:40.295439 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.296218 kubelet[2715]: E0516 03:29:40.295643 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.296481 kubelet[2715]: E0516 03:29:40.296466 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.296552 kubelet[2715]: W0516 03:29:40.296539 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.296658 kubelet[2715]: E0516 03:29:40.296629 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:40.297005 kubelet[2715]: E0516 03:29:40.296895 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.297005 kubelet[2715]: W0516 03:29:40.296915 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.297005 kubelet[2715]: E0516 03:29:40.296929 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:40.299342 kubelet[2715]: E0516 03:29:40.299259 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:40.299342 kubelet[2715]: W0516 03:29:40.299283 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:40.299342 kubelet[2715]: E0516 03:29:40.299296 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.191806 kubelet[2715]: E0516 03:29:41.191707 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.192887 kubelet[2715]: W0516 03:29:41.192367 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.192887 kubelet[2715]: E0516 03:29:41.192454 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.193420 kubelet[2715]: E0516 03:29:41.193354 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.193420 kubelet[2715]: W0516 03:29:41.193395 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.193420 kubelet[2715]: E0516 03:29:41.193420 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.194161 kubelet[2715]: E0516 03:29:41.194098 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.194288 kubelet[2715]: W0516 03:29:41.194142 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.194288 kubelet[2715]: E0516 03:29:41.194236 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:41.195037 kubelet[2715]: E0516 03:29:41.194940 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.195220 kubelet[2715]: W0516 03:29:41.194975 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.195220 kubelet[2715]: E0516 03:29:41.195083 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.195857 kubelet[2715]: E0516 03:29:41.195794 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.195857 kubelet[2715]: W0516 03:29:41.195834 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.196219 kubelet[2715]: E0516 03:29:41.195861 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.196483 kubelet[2715]: E0516 03:29:41.196420 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.196582 kubelet[2715]: W0516 03:29:41.196492 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.196582 kubelet[2715]: E0516 03:29:41.196520 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.197063 kubelet[2715]: E0516 03:29:41.196967 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.197063 kubelet[2715]: W0516 03:29:41.197059 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.197286 kubelet[2715]: E0516 03:29:41.197090 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.197641 kubelet[2715]: E0516 03:29:41.197569 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.197641 kubelet[2715]: W0516 03:29:41.197609 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.197641 kubelet[2715]: E0516 03:29:41.197633 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:41.198213 kubelet[2715]: E0516 03:29:41.198132 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.198213 kubelet[2715]: W0516 03:29:41.198175 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.198213 kubelet[2715]: E0516 03:29:41.198200 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.198657 kubelet[2715]: E0516 03:29:41.198583 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.198657 kubelet[2715]: W0516 03:29:41.198619 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.198657 kubelet[2715]: E0516 03:29:41.198641 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.199359 kubelet[2715]: E0516 03:29:41.199270 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.199359 kubelet[2715]: W0516 03:29:41.199307 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.199359 kubelet[2715]: E0516 03:29:41.199332 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.199814 kubelet[2715]: E0516 03:29:41.199754 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.199814 kubelet[2715]: W0516 03:29:41.199778 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.199814 kubelet[2715]: E0516 03:29:41.199804 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.200850 kubelet[2715]: E0516 03:29:41.200386 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.200850 kubelet[2715]: W0516 03:29:41.200417 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.200850 kubelet[2715]: E0516 03:29:41.200443 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:41.200850 kubelet[2715]: E0516 03:29:41.200820 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.200850 kubelet[2715]: W0516 03:29:41.200843 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.201639 kubelet[2715]: E0516 03:29:41.200866 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.201639 kubelet[2715]: E0516 03:29:41.201310 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.201639 kubelet[2715]: W0516 03:29:41.201335 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.201639 kubelet[2715]: E0516 03:29:41.201358 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.202509 kubelet[2715]: E0516 03:29:41.202226 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.202509 kubelet[2715]: W0516 03:29:41.202253 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.202509 kubelet[2715]: E0516 03:29:41.202279 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.202900 kubelet[2715]: E0516 03:29:41.202848 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.202900 kubelet[2715]: W0516 03:29:41.202876 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.202900 kubelet[2715]: E0516 03:29:41.202916 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.203688 kubelet[2715]: E0516 03:29:41.203648 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.203936 kubelet[2715]: W0516 03:29:41.203892 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.204467 kubelet[2715]: E0516 03:29:41.204093 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:41.204777 kubelet[2715]: E0516 03:29:41.204710 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.204777 kubelet[2715]: W0516 03:29:41.204752 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.204777 kubelet[2715]: E0516 03:29:41.204791 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.205273 kubelet[2715]: E0516 03:29:41.205217 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.205273 kubelet[2715]: W0516 03:29:41.205241 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.205273 kubelet[2715]: E0516 03:29:41.205264 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.205856 kubelet[2715]: E0516 03:29:41.205818 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.205856 kubelet[2715]: W0516 03:29:41.205852 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.206386 kubelet[2715]: E0516 03:29:41.206018 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.207349 kubelet[2715]: E0516 03:29:41.207129 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.207349 kubelet[2715]: W0516 03:29:41.207164 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.207782 kubelet[2715]: E0516 03:29:41.207674 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.208135 kubelet[2715]: E0516 03:29:41.208076 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.208135 kubelet[2715]: W0516 03:29:41.208115 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.208505 kubelet[2715]: E0516 03:29:41.208154 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:41.208714 kubelet[2715]: E0516 03:29:41.208652 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.208714 kubelet[2715]: W0516 03:29:41.208692 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.208875 kubelet[2715]: E0516 03:29:41.208728 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.209186 kubelet[2715]: E0516 03:29:41.209151 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.209186 kubelet[2715]: W0516 03:29:41.209182 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.209522 kubelet[2715]: E0516 03:29:41.209319 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.209795 kubelet[2715]: E0516 03:29:41.209654 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.209795 kubelet[2715]: W0516 03:29:41.209677 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.210116 kubelet[2715]: E0516 03:29:41.209813 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.210254 kubelet[2715]: E0516 03:29:41.210211 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.210254 kubelet[2715]: W0516 03:29:41.210235 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.210603 kubelet[2715]: E0516 03:29:41.210518 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.210838 kubelet[2715]: E0516 03:29:41.210800 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.210838 kubelet[2715]: W0516 03:29:41.210836 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.210838 kubelet[2715]: E0516 03:29:41.210869 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:41.211910 kubelet[2715]: E0516 03:29:41.211878 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.212275 kubelet[2715]: W0516 03:29:41.212160 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.212504 kubelet[2715]: E0516 03:29:41.212438 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.212838 kubelet[2715]: E0516 03:29:41.212770 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.212838 kubelet[2715]: W0516 03:29:41.212814 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.212838 kubelet[2715]: E0516 03:29:41.212852 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.214082 kubelet[2715]: E0516 03:29:41.213722 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.214082 kubelet[2715]: W0516 03:29:41.213755 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.214082 kubelet[2715]: E0516 03:29:41.213782 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.214746 kubelet[2715]: E0516 03:29:41.214712 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.214959 kubelet[2715]: W0516 03:29:41.214925 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.215959 kubelet[2715]: E0516 03:29:41.215291 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 03:29:41.216412 kubelet[2715]: E0516 03:29:41.216261 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 03:29:41.216412 kubelet[2715]: W0516 03:29:41.216318 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 03:29:41.216412 kubelet[2715]: E0516 03:29:41.216346 2715 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 03:29:42.022781 kubelet[2715]: E0516 03:29:42.022729 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:42.030771 containerd[1476]: time="2025-05-16T03:29:42.030705936Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:42.032913 containerd[1476]: time="2025-05-16T03:29:42.032793850Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 16 03:29:42.034775 containerd[1476]: time="2025-05-16T03:29:42.034713960Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:42.037512 containerd[1476]: time="2025-05-16T03:29:42.037462486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:42.038573 containerd[1476]: time="2025-05-16T03:29:42.038182119Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5934282\" in 2.209865169s" May 16 03:29:42.038573 containerd[1476]: time="2025-05-16T03:29:42.038228677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 16 03:29:42.041156 containerd[1476]: time="2025-05-16T03:29:42.041126123Z" level=info msg="CreateContainer within sandbox \"9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 16 03:29:42.055173 containerd[1476]: time="2025-05-16T03:29:42.055130198Z" level=info msg="Container 6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17: CDI devices from CRI Config.CDIDevices: []" May 16 03:29:42.062296 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3320889725.mount: Deactivated successfully. 
May 16 03:29:42.067872 containerd[1476]: time="2025-05-16T03:29:42.067825742Z" level=info msg="CreateContainer within sandbox \"9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17\"" May 16 03:29:42.068502 containerd[1476]: time="2025-05-16T03:29:42.068439725Z" level=info msg="StartContainer for \"6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17\"" May 16 03:29:42.070129 containerd[1476]: time="2025-05-16T03:29:42.070095488Z" level=info msg="connecting to shim 6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17" address="unix:///run/containerd/s/ece97cd8b27fba53a9b5cfc543dcdc139306b163de31f77d33722dd7c2566970" protocol=ttrpc version=3 May 16 03:29:42.099234 systemd[1]: Started cri-containerd-6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17.scope - libcontainer container 6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17. May 16 03:29:42.150530 containerd[1476]: time="2025-05-16T03:29:42.150484029Z" level=info msg="StartContainer for \"6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17\" returns successfully" May 16 03:29:42.158057 systemd[1]: cri-containerd-6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17.scope: Deactivated successfully. May 16 03:29:42.162622 containerd[1476]: time="2025-05-16T03:29:42.162525494Z" level=info msg="received exit event container_id:\"6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17\" id:\"6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17\" pid:3416 exited_at:{seconds:1747366182 nanos:159873188}" May 16 03:29:42.162850 containerd[1476]: time="2025-05-16T03:29:42.162574155Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17\" id:\"6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17\" pid:3416 exited_at:{seconds:1747366182 nanos:159873188}" May 16 03:29:42.190265 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17-rootfs.mount: Deactivated successfully. 
May 16 03:29:43.224362 containerd[1476]: time="2025-05-16T03:29:43.222447714Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 16 03:29:44.026339 kubelet[2715]: E0516 03:29:44.026129 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:46.027945 kubelet[2715]: E0516 03:29:46.026767 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:48.022181 kubelet[2715]: E0516 03:29:48.022080 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:48.459022 containerd[1476]: time="2025-05-16T03:29:48.458709022Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:48.460199 containerd[1476]: time="2025-05-16T03:29:48.460107529Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 16 03:29:48.461721 containerd[1476]: time="2025-05-16T03:29:48.461644445Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:48.464907 containerd[1476]: time="2025-05-16T03:29:48.464830110Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:29:48.465916 containerd[1476]: time="2025-05-16T03:29:48.465688762Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 5.242986591s" May 16 03:29:48.465916 containerd[1476]: time="2025-05-16T03:29:48.465737334Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 16 03:29:48.472241 containerd[1476]: time="2025-05-16T03:29:48.469304814Z" level=info msg="CreateContainer within sandbox \"9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 16 03:29:48.487842 containerd[1476]: time="2025-05-16T03:29:48.487803873Z" level=info msg="Container 2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6: CDI devices from CRI Config.CDIDevices: []" May 16 03:29:48.509332 containerd[1476]: time="2025-05-16T03:29:48.509276437Z" level=info msg="CreateContainer within sandbox 
\"9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6\"" May 16 03:29:48.511102 containerd[1476]: time="2025-05-16T03:29:48.510140900Z" level=info msg="StartContainer for \"2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6\"" May 16 03:29:48.513437 containerd[1476]: time="2025-05-16T03:29:48.513387750Z" level=info msg="connecting to shim 2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6" address="unix:///run/containerd/s/ece97cd8b27fba53a9b5cfc543dcdc139306b163de31f77d33722dd7c2566970" protocol=ttrpc version=3 May 16 03:29:48.549238 systemd[1]: Started cri-containerd-2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6.scope - libcontainer container 2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6. May 16 03:29:48.608137 containerd[1476]: time="2025-05-16T03:29:48.606762353Z" level=info msg="StartContainer for \"2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6\" returns successfully" May 16 03:29:50.021815 kubelet[2715]: E0516 03:29:50.021732 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:50.475771 containerd[1476]: time="2025-05-16T03:29:50.475471273Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 16 03:29:50.487136 systemd[1]: cri-containerd-2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6.scope: Deactivated successfully. May 16 03:29:50.487846 systemd[1]: cri-containerd-2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6.scope: Consumed 1.048s CPU time, 194.7M memory peak, 170.9M written to disk. May 16 03:29:50.494905 containerd[1476]: time="2025-05-16T03:29:50.492766975Z" level=info msg="received exit event container_id:\"2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6\" id:\"2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6\" pid:3474 exited_at:{seconds:1747366190 nanos:491359972}" May 16 03:29:50.494905 containerd[1476]: time="2025-05-16T03:29:50.493391046Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6\" id:\"2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6\" pid:3474 exited_at:{seconds:1747366190 nanos:491359972}" May 16 03:29:50.498383 kubelet[2715]: I0516 03:29:50.498326 2715 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 16 03:29:50.558591 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6-rootfs.mount: Deactivated successfully. May 16 03:29:50.575753 systemd[1]: Created slice kubepods-burstable-pod44c65729_467c_46d8_ae6e_1add08755687.slice - libcontainer container kubepods-burstable-pod44c65729_467c_46d8_ae6e_1add08755687.slice. 
May 16 03:29:50.701765 systemd[1]: Created slice kubepods-besteffort-pod73457dc8_5848_40bf_ace2_479bc6054dd2.slice - libcontainer container kubepods-besteffort-pod73457dc8_5848_40bf_ace2_479bc6054dd2.slice. May 16 03:29:50.857420 kubelet[2715]: I0516 03:29:50.703946 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx6dl\" (UniqueName: \"kubernetes.io/projected/44c65729-467c-46d8-ae6e-1add08755687-kube-api-access-gx6dl\") pod \"coredns-668d6bf9bc-mskp9\" (UID: \"44c65729-467c-46d8-ae6e-1add08755687\") " pod="kube-system/coredns-668d6bf9bc-mskp9" May 16 03:29:50.857420 kubelet[2715]: I0516 03:29:50.704105 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44c65729-467c-46d8-ae6e-1add08755687-config-volume\") pod \"coredns-668d6bf9bc-mskp9\" (UID: \"44c65729-467c-46d8-ae6e-1add08755687\") " pod="kube-system/coredns-668d6bf9bc-mskp9" May 16 03:29:50.857420 kubelet[2715]: I0516 03:29:50.704277 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/73457dc8-5848-40bf-ace2-479bc6054dd2-calico-apiserver-certs\") pod \"calico-apiserver-594f46878-gp6pw\" (UID: \"73457dc8-5848-40bf-ace2-479bc6054dd2\") " pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" May 16 03:29:50.857420 kubelet[2715]: I0516 03:29:50.704762 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7z9r\" (UniqueName: \"kubernetes.io/projected/73457dc8-5848-40bf-ace2-479bc6054dd2-kube-api-access-g7z9r\") pod \"calico-apiserver-594f46878-gp6pw\" (UID: \"73457dc8-5848-40bf-ace2-479bc6054dd2\") " pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" May 16 03:29:50.942507 systemd[1]: Created slice kubepods-besteffort-pod60ec12df_16f1_4a36_aa58_7ef6c1d545b7.slice - libcontainer container kubepods-besteffort-pod60ec12df_16f1_4a36_aa58_7ef6c1d545b7.slice. May 16 03:29:50.957780 systemd[1]: Created slice kubepods-besteffort-podc0064a9e_4be1_4ce0_a21f_9e78adbef175.slice - libcontainer container kubepods-besteffort-podc0064a9e_4be1_4ce0_a21f_9e78adbef175.slice. May 16 03:29:50.964368 systemd[1]: Created slice kubepods-burstable-podfb595fc0_7f7d_427f_87b1_bb4b1008771a.slice - libcontainer container kubepods-burstable-podfb595fc0_7f7d_427f_87b1_bb4b1008771a.slice. May 16 03:29:50.970817 systemd[1]: Created slice kubepods-besteffort-pode642d357_631b_4841_8832_f5200203d19c.slice - libcontainer container kubepods-besteffort-pode642d357_631b_4841_8832_f5200203d19c.slice. May 16 03:29:50.976803 systemd[1]: Created slice kubepods-besteffort-podc58e974e_e8d9_443d_a0c3_984047d750c6.slice - libcontainer container kubepods-besteffort-podc58e974e_e8d9_443d_a0c3_984047d750c6.slice. 
May 16 03:29:51.010733 kubelet[2715]: I0516 03:29:51.010236 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/60ec12df-16f1-4a36-aa58-7ef6c1d545b7-calico-apiserver-certs\") pod \"calico-apiserver-594f46878-mm7r4\" (UID: \"60ec12df-16f1-4a36-aa58-7ef6c1d545b7\") " pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" May 16 03:29:51.010733 kubelet[2715]: I0516 03:29:51.010310 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0064a9e-4be1-4ce0-a21f-9e78adbef175-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-4kgrm\" (UID: \"c0064a9e-4be1-4ce0-a21f-9e78adbef175\") " pod="calico-system/goldmane-78d55f7ddc-4kgrm" May 16 03:29:51.010733 kubelet[2715]: I0516 03:29:51.010335 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0064a9e-4be1-4ce0-a21f-9e78adbef175-config\") pod \"goldmane-78d55f7ddc-4kgrm\" (UID: \"c0064a9e-4be1-4ce0-a21f-9e78adbef175\") " pod="calico-system/goldmane-78d55f7ddc-4kgrm" May 16 03:29:51.010733 kubelet[2715]: I0516 03:29:51.010369 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jggn\" (UniqueName: \"kubernetes.io/projected/60ec12df-16f1-4a36-aa58-7ef6c1d545b7-kube-api-access-2jggn\") pod \"calico-apiserver-594f46878-mm7r4\" (UID: \"60ec12df-16f1-4a36-aa58-7ef6c1d545b7\") " pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" May 16 03:29:51.010733 kubelet[2715]: I0516 03:29:51.010393 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c0064a9e-4be1-4ce0-a21f-9e78adbef175-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-4kgrm\" (UID: \"c0064a9e-4be1-4ce0-a21f-9e78adbef175\") " pod="calico-system/goldmane-78d55f7ddc-4kgrm" May 16 03:29:51.011058 kubelet[2715]: I0516 03:29:51.010437 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2lrx\" (UniqueName: \"kubernetes.io/projected/c0064a9e-4be1-4ce0-a21f-9e78adbef175-kube-api-access-v2lrx\") pod \"goldmane-78d55f7ddc-4kgrm\" (UID: \"c0064a9e-4be1-4ce0-a21f-9e78adbef175\") " pod="calico-system/goldmane-78d55f7ddc-4kgrm" May 16 03:29:51.111897 kubelet[2715]: I0516 03:29:51.111194 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2zf\" (UniqueName: \"kubernetes.io/projected/fb595fc0-7f7d-427f-87b1-bb4b1008771a-kube-api-access-xg2zf\") pod \"coredns-668d6bf9bc-54gvm\" (UID: \"fb595fc0-7f7d-427f-87b1-bb4b1008771a\") " pod="kube-system/coredns-668d6bf9bc-54gvm" May 16 03:29:51.113339 kubelet[2715]: I0516 03:29:51.112636 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h72q\" (UniqueName: \"kubernetes.io/projected/c58e974e-e8d9-443d-a0c3-984047d750c6-kube-api-access-4h72q\") pod \"calico-kube-controllers-58c54ff885-qgwjf\" (UID: \"c58e974e-e8d9-443d-a0c3-984047d750c6\") " pod="calico-system/calico-kube-controllers-58c54ff885-qgwjf" May 16 03:29:51.113339 kubelet[2715]: I0516 03:29:51.112884 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/e642d357-631b-4841-8832-f5200203d19c-whisker-backend-key-pair\") pod \"whisker-6d65767b66-v2j5j\" (UID: \"e642d357-631b-4841-8832-f5200203d19c\") " pod="calico-system/whisker-6d65767b66-v2j5j" May 16 03:29:51.113339 kubelet[2715]: I0516 03:29:51.112941 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9swkk\" (UniqueName: \"kubernetes.io/projected/e642d357-631b-4841-8832-f5200203d19c-kube-api-access-9swkk\") pod \"whisker-6d65767b66-v2j5j\" (UID: \"e642d357-631b-4841-8832-f5200203d19c\") " pod="calico-system/whisker-6d65767b66-v2j5j" May 16 03:29:51.113339 kubelet[2715]: I0516 03:29:51.113075 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb595fc0-7f7d-427f-87b1-bb4b1008771a-config-volume\") pod \"coredns-668d6bf9bc-54gvm\" (UID: \"fb595fc0-7f7d-427f-87b1-bb4b1008771a\") " pod="kube-system/coredns-668d6bf9bc-54gvm" May 16 03:29:51.113339 kubelet[2715]: I0516 03:29:51.113219 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e642d357-631b-4841-8832-f5200203d19c-whisker-ca-bundle\") pod \"whisker-6d65767b66-v2j5j\" (UID: \"e642d357-631b-4841-8832-f5200203d19c\") " pod="calico-system/whisker-6d65767b66-v2j5j" May 16 03:29:51.113776 kubelet[2715]: I0516 03:29:51.113343 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c58e974e-e8d9-443d-a0c3-984047d750c6-tigera-ca-bundle\") pod \"calico-kube-controllers-58c54ff885-qgwjf\" (UID: \"c58e974e-e8d9-443d-a0c3-984047d750c6\") " pod="calico-system/calico-kube-controllers-58c54ff885-qgwjf" May 16 03:29:51.183705 containerd[1476]: time="2025-05-16T03:29:51.182944413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-gp6pw,Uid:73457dc8-5848-40bf-ace2-479bc6054dd2,Namespace:calico-apiserver,Attempt:0,}" May 16 03:29:51.184104 containerd[1476]: time="2025-05-16T03:29:51.184080106Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mskp9,Uid:44c65729-467c-46d8-ae6e-1add08755687,Namespace:kube-system,Attempt:0,}" May 16 03:29:51.256182 containerd[1476]: time="2025-05-16T03:29:51.255689490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-mm7r4,Uid:60ec12df-16f1-4a36-aa58-7ef6c1d545b7,Namespace:calico-apiserver,Attempt:0,}" May 16 03:29:51.270180 containerd[1476]: time="2025-05-16T03:29:51.270133634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-4kgrm,Uid:c0064a9e-4be1-4ce0-a21f-9e78adbef175,Namespace:calico-system,Attempt:0,}" May 16 03:29:51.274666 containerd[1476]: time="2025-05-16T03:29:51.274571108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d65767b66-v2j5j,Uid:e642d357-631b-4841-8832-f5200203d19c,Namespace:calico-system,Attempt:0,}" May 16 03:29:51.287153 containerd[1476]: time="2025-05-16T03:29:51.286965121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c54ff885-qgwjf,Uid:c58e974e-e8d9-443d-a0c3-984047d750c6,Namespace:calico-system,Attempt:0,}" May 16 03:29:51.295098 containerd[1476]: time="2025-05-16T03:29:51.294895253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 16 03:29:51.369590 containerd[1476]: 
time="2025-05-16T03:29:51.369468674Z" level=error msg="Failed to destroy network for sandbox \"7db5ec314eb26cfb9509d329176cada6f35e0218408031e76b748782eb5e6d7a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.373304 containerd[1476]: time="2025-05-16T03:29:51.373232942Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mskp9,Uid:44c65729-467c-46d8-ae6e-1add08755687,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7db5ec314eb26cfb9509d329176cada6f35e0218408031e76b748782eb5e6d7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.374389 kubelet[2715]: E0516 03:29:51.374314 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7db5ec314eb26cfb9509d329176cada6f35e0218408031e76b748782eb5e6d7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.375543 kubelet[2715]: E0516 03:29:51.375127 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7db5ec314eb26cfb9509d329176cada6f35e0218408031e76b748782eb5e6d7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mskp9" May 16 03:29:51.375543 kubelet[2715]: E0516 03:29:51.375178 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7db5ec314eb26cfb9509d329176cada6f35e0218408031e76b748782eb5e6d7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mskp9" May 16 03:29:51.375543 kubelet[2715]: E0516 03:29:51.375240 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-mskp9_kube-system(44c65729-467c-46d8-ae6e-1add08755687)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-mskp9_kube-system(44c65729-467c-46d8-ae6e-1add08755687)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7db5ec314eb26cfb9509d329176cada6f35e0218408031e76b748782eb5e6d7a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-mskp9" podUID="44c65729-467c-46d8-ae6e-1add08755687" May 16 03:29:51.439022 containerd[1476]: time="2025-05-16T03:29:51.438932187Z" level=error msg="Failed to destroy network for sandbox \"24f2c5d80207d130d8b63d86f7b92f0bb6b232b286d5656b046f6ca82798a2ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 
03:29:51.443757 containerd[1476]: time="2025-05-16T03:29:51.443273109Z" level=error msg="Failed to destroy network for sandbox \"affdf73b5527c18739e4d140fe9e8471c299b28ad39c0ef40c666dceb68687d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.443757 containerd[1476]: time="2025-05-16T03:29:51.443533448Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-mm7r4,Uid:60ec12df-16f1-4a36-aa58-7ef6c1d545b7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"24f2c5d80207d130d8b63d86f7b92f0bb6b232b286d5656b046f6ca82798a2ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.443932 kubelet[2715]: E0516 03:29:51.443790 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24f2c5d80207d130d8b63d86f7b92f0bb6b232b286d5656b046f6ca82798a2ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.443932 kubelet[2715]: E0516 03:29:51.443853 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24f2c5d80207d130d8b63d86f7b92f0bb6b232b286d5656b046f6ca82798a2ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" May 16 03:29:51.443932 kubelet[2715]: E0516 03:29:51.443879 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24f2c5d80207d130d8b63d86f7b92f0bb6b232b286d5656b046f6ca82798a2ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" May 16 03:29:51.444125 kubelet[2715]: E0516 03:29:51.443927 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-594f46878-mm7r4_calico-apiserver(60ec12df-16f1-4a36-aa58-7ef6c1d545b7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-594f46878-mm7r4_calico-apiserver(60ec12df-16f1-4a36-aa58-7ef6c1d545b7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"24f2c5d80207d130d8b63d86f7b92f0bb6b232b286d5656b046f6ca82798a2ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" podUID="60ec12df-16f1-4a36-aa58-7ef6c1d545b7" May 16 03:29:51.446012 containerd[1476]: time="2025-05-16T03:29:51.445894141Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-gp6pw,Uid:73457dc8-5848-40bf-ace2-479bc6054dd2,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"affdf73b5527c18739e4d140fe9e8471c299b28ad39c0ef40c666dceb68687d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.446699 kubelet[2715]: E0516 03:29:51.446613 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"affdf73b5527c18739e4d140fe9e8471c299b28ad39c0ef40c666dceb68687d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.446867 kubelet[2715]: E0516 03:29:51.446708 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"affdf73b5527c18739e4d140fe9e8471c299b28ad39c0ef40c666dceb68687d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" May 16 03:29:51.446867 kubelet[2715]: E0516 03:29:51.446729 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"affdf73b5527c18739e4d140fe9e8471c299b28ad39c0ef40c666dceb68687d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" May 16 03:29:51.446867 kubelet[2715]: E0516 03:29:51.446780 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-594f46878-gp6pw_calico-apiserver(73457dc8-5848-40bf-ace2-479bc6054dd2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-594f46878-gp6pw_calico-apiserver(73457dc8-5848-40bf-ace2-479bc6054dd2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"affdf73b5527c18739e4d140fe9e8471c299b28ad39c0ef40c666dceb68687d0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" podUID="73457dc8-5848-40bf-ace2-479bc6054dd2" May 16 03:29:51.481453 containerd[1476]: time="2025-05-16T03:29:51.481392713Z" level=error msg="Failed to destroy network for sandbox \"d73624faae94b9955e168c32e9ec031d98457f8a2f685338e77a7bf59eb0c5dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.484187 containerd[1476]: time="2025-05-16T03:29:51.483698613Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c54ff885-qgwjf,Uid:c58e974e-e8d9-443d-a0c3-984047d750c6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d73624faae94b9955e168c32e9ec031d98457f8a2f685338e77a7bf59eb0c5dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 16 03:29:51.485852 kubelet[2715]: E0516 03:29:51.483904 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d73624faae94b9955e168c32e9ec031d98457f8a2f685338e77a7bf59eb0c5dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.485852 kubelet[2715]: E0516 03:29:51.483959 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d73624faae94b9955e168c32e9ec031d98457f8a2f685338e77a7bf59eb0c5dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58c54ff885-qgwjf" May 16 03:29:51.485852 kubelet[2715]: E0516 03:29:51.484032 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d73624faae94b9955e168c32e9ec031d98457f8a2f685338e77a7bf59eb0c5dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58c54ff885-qgwjf" May 16 03:29:51.485967 kubelet[2715]: E0516 03:29:51.484080 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-58c54ff885-qgwjf_calico-system(c58e974e-e8d9-443d-a0c3-984047d750c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58c54ff885-qgwjf_calico-system(c58e974e-e8d9-443d-a0c3-984047d750c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d73624faae94b9955e168c32e9ec031d98457f8a2f685338e77a7bf59eb0c5dd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58c54ff885-qgwjf" podUID="c58e974e-e8d9-443d-a0c3-984047d750c6" May 16 03:29:51.503258 containerd[1476]: time="2025-05-16T03:29:51.503186208Z" level=error msg="Failed to destroy network for sandbox \"1c0701508853466773eff165e90b5ff92a727d01bfcef24fc4b368a586acdb86\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.505442 containerd[1476]: time="2025-05-16T03:29:51.505229585Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-4kgrm,Uid:c0064a9e-4be1-4ce0-a21f-9e78adbef175,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c0701508853466773eff165e90b5ff92a727d01bfcef24fc4b368a586acdb86\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.505918 kubelet[2715]: E0516 03:29:51.505711 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"1c0701508853466773eff165e90b5ff92a727d01bfcef24fc4b368a586acdb86\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.505918 kubelet[2715]: E0516 03:29:51.505788 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c0701508853466773eff165e90b5ff92a727d01bfcef24fc4b368a586acdb86\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-4kgrm" May 16 03:29:51.505918 kubelet[2715]: E0516 03:29:51.505820 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c0701508853466773eff165e90b5ff92a727d01bfcef24fc4b368a586acdb86\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-4kgrm" May 16 03:29:51.506144 kubelet[2715]: E0516 03:29:51.505885 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1c0701508853466773eff165e90b5ff92a727d01bfcef24fc4b368a586acdb86\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:29:51.507053 containerd[1476]: time="2025-05-16T03:29:51.506901094Z" level=error msg="Failed to destroy network for sandbox \"9dfdc03809f1ea67abaf2e751687344abae0b4b6e19217929dff9faded25e517\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.509478 containerd[1476]: time="2025-05-16T03:29:51.509433760Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d65767b66-v2j5j,Uid:e642d357-631b-4841-8832-f5200203d19c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dfdc03809f1ea67abaf2e751687344abae0b4b6e19217929dff9faded25e517\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.509761 kubelet[2715]: E0516 03:29:51.509670 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dfdc03809f1ea67abaf2e751687344abae0b4b6e19217929dff9faded25e517\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.509761 kubelet[2715]: E0516 03:29:51.509724 2715 kuberuntime_sandbox.go:72] "Failed 
to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dfdc03809f1ea67abaf2e751687344abae0b4b6e19217929dff9faded25e517\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6d65767b66-v2j5j" May 16 03:29:51.510008 kubelet[2715]: E0516 03:29:51.509765 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dfdc03809f1ea67abaf2e751687344abae0b4b6e19217929dff9faded25e517\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6d65767b66-v2j5j" May 16 03:29:51.510008 kubelet[2715]: E0516 03:29:51.509817 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6d65767b66-v2j5j_calico-system(e642d357-631b-4841-8832-f5200203d19c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6d65767b66-v2j5j_calico-system(e642d357-631b-4841-8832-f5200203d19c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9dfdc03809f1ea67abaf2e751687344abae0b4b6e19217929dff9faded25e517\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6d65767b66-v2j5j" podUID="e642d357-631b-4841-8832-f5200203d19c" May 16 03:29:51.568776 containerd[1476]: time="2025-05-16T03:29:51.568443804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-54gvm,Uid:fb595fc0-7f7d-427f-87b1-bb4b1008771a,Namespace:kube-system,Attempt:0,}" May 16 03:29:51.625296 containerd[1476]: time="2025-05-16T03:29:51.625182122Z" level=error msg="Failed to destroy network for sandbox \"63a6e5f2a31760e0c999b413e272abb0373a8817aa244387496129ce5754dac2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.627894 containerd[1476]: time="2025-05-16T03:29:51.627744995Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-54gvm,Uid:fb595fc0-7f7d-427f-87b1-bb4b1008771a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"63a6e5f2a31760e0c999b413e272abb0373a8817aa244387496129ce5754dac2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.628085 kubelet[2715]: E0516 03:29:51.627996 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63a6e5f2a31760e0c999b413e272abb0373a8817aa244387496129ce5754dac2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:51.628146 kubelet[2715]: E0516 03:29:51.628080 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"63a6e5f2a31760e0c999b413e272abb0373a8817aa244387496129ce5754dac2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-54gvm" May 16 03:29:51.628146 kubelet[2715]: E0516 03:29:51.628106 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63a6e5f2a31760e0c999b413e272abb0373a8817aa244387496129ce5754dac2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-54gvm" May 16 03:29:51.629586 kubelet[2715]: E0516 03:29:51.628156 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-54gvm_kube-system(fb595fc0-7f7d-427f-87b1-bb4b1008771a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-54gvm_kube-system(fb595fc0-7f7d-427f-87b1-bb4b1008771a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"63a6e5f2a31760e0c999b413e272abb0373a8817aa244387496129ce5754dac2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-54gvm" podUID="fb595fc0-7f7d-427f-87b1-bb4b1008771a" May 16 03:29:51.630730 systemd[1]: run-netns-cni\x2dde779e94\x2d3f9c\x2d49a5\x2db3ab\x2de2d49dabce0e.mount: Deactivated successfully. May 16 03:29:52.044370 systemd[1]: Created slice kubepods-besteffort-podec9adb4c_4eda_47e3_8ebd_e314c0ec0140.slice - libcontainer container kubepods-besteffort-podec9adb4c_4eda_47e3_8ebd_e314c0ec0140.slice. 
May 16 03:29:52.051518 containerd[1476]: time="2025-05-16T03:29:52.051433312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5gjgv,Uid:ec9adb4c-4eda-47e3-8ebd-e314c0ec0140,Namespace:calico-system,Attempt:0,}" May 16 03:29:52.144095 containerd[1476]: time="2025-05-16T03:29:52.144027449Z" level=error msg="Failed to destroy network for sandbox \"cc4099d00f7a762471fb6b05e4340637a2739609adc8d5003dbaf8a64bddb1c4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:52.146777 containerd[1476]: time="2025-05-16T03:29:52.146725385Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5gjgv,Uid:ec9adb4c-4eda-47e3-8ebd-e314c0ec0140,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc4099d00f7a762471fb6b05e4340637a2739609adc8d5003dbaf8a64bddb1c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:52.147594 kubelet[2715]: E0516 03:29:52.147021 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc4099d00f7a762471fb6b05e4340637a2739609adc8d5003dbaf8a64bddb1c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:29:52.147594 kubelet[2715]: E0516 03:29:52.147084 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc4099d00f7a762471fb6b05e4340637a2739609adc8d5003dbaf8a64bddb1c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5gjgv" May 16 03:29:52.147594 kubelet[2715]: E0516 03:29:52.147107 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc4099d00f7a762471fb6b05e4340637a2739609adc8d5003dbaf8a64bddb1c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5gjgv" May 16 03:29:52.147969 kubelet[2715]: E0516 03:29:52.147172 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5gjgv_calico-system(ec9adb4c-4eda-47e3-8ebd-e314c0ec0140)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-5gjgv_calico-system(ec9adb4c-4eda-47e3-8ebd-e314c0ec0140)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cc4099d00f7a762471fb6b05e4340637a2739609adc8d5003dbaf8a64bddb1c4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5gjgv" podUID="ec9adb4c-4eda-47e3-8ebd-e314c0ec0140" May 16 03:29:52.148848 systemd[1]: run-netns-cni\x2deb3bf2d4\x2d7c02\x2d6c2f\x2d0a65\x2dc9c52c4bdaee.mount: Deactivated successfully. 
May 16 03:30:01.919558 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4022650002.mount: Deactivated successfully. May 16 03:30:02.026828 containerd[1476]: time="2025-05-16T03:30:02.025183273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-gp6pw,Uid:73457dc8-5848-40bf-ace2-479bc6054dd2,Namespace:calico-apiserver,Attempt:0,}" May 16 03:30:02.028873 containerd[1476]: time="2025-05-16T03:30:02.028771376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-4kgrm,Uid:c0064a9e-4be1-4ce0-a21f-9e78adbef175,Namespace:calico-system,Attempt:0,}" May 16 03:30:02.029234 containerd[1476]: time="2025-05-16T03:30:02.029153062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-mm7r4,Uid:60ec12df-16f1-4a36-aa58-7ef6c1d545b7,Namespace:calico-apiserver,Attempt:0,}" May 16 03:30:02.029816 containerd[1476]: time="2025-05-16T03:30:02.029738281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mskp9,Uid:44c65729-467c-46d8-ae6e-1add08755687,Namespace:kube-system,Attempt:0,}" May 16 03:30:02.439426 containerd[1476]: time="2025-05-16T03:30:02.433454211Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:02.446009 containerd[1476]: time="2025-05-16T03:30:02.434785039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 16 03:30:02.446453 containerd[1476]: time="2025-05-16T03:30:02.446412349Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:02.447710 containerd[1476]: time="2025-05-16T03:30:02.447659640Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:02.448552 containerd[1476]: time="2025-05-16T03:30:02.448509766Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 11.153189604s" May 16 03:30:02.448622 containerd[1476]: time="2025-05-16T03:30:02.448568586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 16 03:30:02.511520 containerd[1476]: time="2025-05-16T03:30:02.511462908Z" level=info msg="CreateContainer within sandbox \"9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 16 03:30:02.548775 containerd[1476]: time="2025-05-16T03:30:02.548725657Z" level=info msg="Container bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21: CDI devices from CRI Config.CDIDevices: []" May 16 03:30:02.586034 containerd[1476]: time="2025-05-16T03:30:02.585963471Z" level=info msg="CreateContainer within sandbox \"9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id 
\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\"" May 16 03:30:02.589701 containerd[1476]: time="2025-05-16T03:30:02.588812116Z" level=info msg="StartContainer for \"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\"" May 16 03:30:02.605953 containerd[1476]: time="2025-05-16T03:30:02.605860911Z" level=info msg="connecting to shim bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21" address="unix:///run/containerd/s/ece97cd8b27fba53a9b5cfc543dcdc139306b163de31f77d33722dd7c2566970" protocol=ttrpc version=3 May 16 03:30:02.621196 containerd[1476]: time="2025-05-16T03:30:02.621133513Z" level=error msg="Failed to destroy network for sandbox \"cf6977e67aad840fd9716850f6e0c7076d201529011da1d0f61766c1f7d5143a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.629499 containerd[1476]: time="2025-05-16T03:30:02.629426256Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mskp9,Uid:44c65729-467c-46d8-ae6e-1add08755687,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6977e67aad840fd9716850f6e0c7076d201529011da1d0f61766c1f7d5143a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.631671 kubelet[2715]: E0516 03:30:02.631611 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6977e67aad840fd9716850f6e0c7076d201529011da1d0f61766c1f7d5143a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.633750 kubelet[2715]: E0516 03:30:02.631732 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6977e67aad840fd9716850f6e0c7076d201529011da1d0f61766c1f7d5143a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mskp9" May 16 03:30:02.633750 kubelet[2715]: E0516 03:30:02.631778 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6977e67aad840fd9716850f6e0c7076d201529011da1d0f61766c1f7d5143a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mskp9" May 16 03:30:02.633750 kubelet[2715]: E0516 03:30:02.631863 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-mskp9_kube-system(44c65729-467c-46d8-ae6e-1add08755687)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-mskp9_kube-system(44c65729-467c-46d8-ae6e-1add08755687)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cf6977e67aad840fd9716850f6e0c7076d201529011da1d0f61766c1f7d5143a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-mskp9" podUID="44c65729-467c-46d8-ae6e-1add08755687" May 16 03:30:02.638818 containerd[1476]: time="2025-05-16T03:30:02.638272948Z" level=error msg="Failed to destroy network for sandbox \"ee52e2397ebab0ee418a1a74387a9f216a6c8a572992466b86cd2cc47ae1cc01\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.642622 containerd[1476]: time="2025-05-16T03:30:02.642574631Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-mm7r4,Uid:60ec12df-16f1-4a36-aa58-7ef6c1d545b7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee52e2397ebab0ee418a1a74387a9f216a6c8a572992466b86cd2cc47ae1cc01\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.645016 kubelet[2715]: E0516 03:30:02.644936 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee52e2397ebab0ee418a1a74387a9f216a6c8a572992466b86cd2cc47ae1cc01\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.645155 kubelet[2715]: E0516 03:30:02.645049 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee52e2397ebab0ee418a1a74387a9f216a6c8a572992466b86cd2cc47ae1cc01\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" May 16 03:30:02.645155 kubelet[2715]: E0516 03:30:02.645076 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee52e2397ebab0ee418a1a74387a9f216a6c8a572992466b86cd2cc47ae1cc01\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" May 16 03:30:02.645155 kubelet[2715]: E0516 03:30:02.645133 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-594f46878-mm7r4_calico-apiserver(60ec12df-16f1-4a36-aa58-7ef6c1d545b7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-594f46878-mm7r4_calico-apiserver(60ec12df-16f1-4a36-aa58-7ef6c1d545b7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ee52e2397ebab0ee418a1a74387a9f216a6c8a572992466b86cd2cc47ae1cc01\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" podUID="60ec12df-16f1-4a36-aa58-7ef6c1d545b7" May 16 03:30:02.647363 containerd[1476]: time="2025-05-16T03:30:02.647097669Z" level=error 
msg="Failed to destroy network for sandbox \"db1312536c4601e48d451f8a7551d81b0d77371adb6e44b24d7faae99d3c2714\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.650333 containerd[1476]: time="2025-05-16T03:30:02.650181777Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-4kgrm,Uid:c0064a9e-4be1-4ce0-a21f-9e78adbef175,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"db1312536c4601e48d451f8a7551d81b0d77371adb6e44b24d7faae99d3c2714\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.650747 kubelet[2715]: E0516 03:30:02.650624 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db1312536c4601e48d451f8a7551d81b0d77371adb6e44b24d7faae99d3c2714\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.650747 kubelet[2715]: E0516 03:30:02.650707 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db1312536c4601e48d451f8a7551d81b0d77371adb6e44b24d7faae99d3c2714\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-4kgrm" May 16 03:30:02.652257 kubelet[2715]: E0516 03:30:02.650743 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db1312536c4601e48d451f8a7551d81b0d77371adb6e44b24d7faae99d3c2714\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-4kgrm" May 16 03:30:02.652257 kubelet[2715]: E0516 03:30:02.650806 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"db1312536c4601e48d451f8a7551d81b0d77371adb6e44b24d7faae99d3c2714\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:30:02.660612 containerd[1476]: time="2025-05-16T03:30:02.660245353Z" level=error msg="Failed to destroy network for sandbox \"14e4d1f7924b407ee70f2f19e9802d1a9abed55871186b28ae7ac145a82d4f24\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.662049 containerd[1476]: 
time="2025-05-16T03:30:02.661969178Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-gp6pw,Uid:73457dc8-5848-40bf-ace2-479bc6054dd2,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"14e4d1f7924b407ee70f2f19e9802d1a9abed55871186b28ae7ac145a82d4f24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.662550 kubelet[2715]: E0516 03:30:02.662501 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14e4d1f7924b407ee70f2f19e9802d1a9abed55871186b28ae7ac145a82d4f24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 03:30:02.662727 kubelet[2715]: E0516 03:30:02.662587 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14e4d1f7924b407ee70f2f19e9802d1a9abed55871186b28ae7ac145a82d4f24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" May 16 03:30:02.662727 kubelet[2715]: E0516 03:30:02.662615 2715 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14e4d1f7924b407ee70f2f19e9802d1a9abed55871186b28ae7ac145a82d4f24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" May 16 03:30:02.662727 kubelet[2715]: E0516 03:30:02.662668 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-594f46878-gp6pw_calico-apiserver(73457dc8-5848-40bf-ace2-479bc6054dd2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-594f46878-gp6pw_calico-apiserver(73457dc8-5848-40bf-ace2-479bc6054dd2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"14e4d1f7924b407ee70f2f19e9802d1a9abed55871186b28ae7ac145a82d4f24\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" podUID="73457dc8-5848-40bf-ace2-479bc6054dd2" May 16 03:30:02.700186 systemd[1]: Started cri-containerd-bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21.scope - libcontainer container bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21. May 16 03:30:02.762188 containerd[1476]: time="2025-05-16T03:30:02.762128323Z" level=info msg="StartContainer for \"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" returns successfully" May 16 03:30:02.866819 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 16 03:30:02.866962 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 16 03:30:02.920854 systemd[1]: run-netns-cni\x2d12047600\x2dc973\x2d18f6\x2db63c\x2d7a2f929a9218.mount: Deactivated successfully. May 16 03:30:02.921354 systemd[1]: run-netns-cni\x2df57749d5\x2d6274\x2d8f2f\x2de131\x2deb86c68f9383.mount: Deactivated successfully. May 16 03:30:02.921463 systemd[1]: run-netns-cni\x2d0e4dbd38\x2d69a9\x2d79ba\x2dbb6e\x2dfa7096a01d2d.mount: Deactivated successfully. May 16 03:30:02.921546 systemd[1]: run-netns-cni\x2d9a38e7de\x2d8d43\x2dd4f7\x2d78b4\x2d965e12ba5563.mount: Deactivated successfully. May 16 03:30:03.117100 kubelet[2715]: I0516 03:30:03.117036 2715 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e642d357-631b-4841-8832-f5200203d19c-whisker-ca-bundle\") pod \"e642d357-631b-4841-8832-f5200203d19c\" (UID: \"e642d357-631b-4841-8832-f5200203d19c\") " May 16 03:30:03.117100 kubelet[2715]: I0516 03:30:03.117100 2715 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e642d357-631b-4841-8832-f5200203d19c-whisker-backend-key-pair\") pod \"e642d357-631b-4841-8832-f5200203d19c\" (UID: \"e642d357-631b-4841-8832-f5200203d19c\") " May 16 03:30:03.117328 kubelet[2715]: I0516 03:30:03.117136 2715 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9swkk\" (UniqueName: \"kubernetes.io/projected/e642d357-631b-4841-8832-f5200203d19c-kube-api-access-9swkk\") pod \"e642d357-631b-4841-8832-f5200203d19c\" (UID: \"e642d357-631b-4841-8832-f5200203d19c\") " May 16 03:30:03.119038 kubelet[2715]: I0516 03:30:03.118075 2715 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e642d357-631b-4841-8832-f5200203d19c-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e642d357-631b-4841-8832-f5200203d19c" (UID: "e642d357-631b-4841-8832-f5200203d19c"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 16 03:30:03.133124 kubelet[2715]: I0516 03:30:03.132913 2715 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e642d357-631b-4841-8832-f5200203d19c-kube-api-access-9swkk" (OuterVolumeSpecName: "kube-api-access-9swkk") pod "e642d357-631b-4841-8832-f5200203d19c" (UID: "e642d357-631b-4841-8832-f5200203d19c"). InnerVolumeSpecName "kube-api-access-9swkk". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 16 03:30:03.134010 kubelet[2715]: I0516 03:30:03.133413 2715 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e642d357-631b-4841-8832-f5200203d19c-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e642d357-631b-4841-8832-f5200203d19c" (UID: "e642d357-631b-4841-8832-f5200203d19c"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 16 03:30:03.134553 systemd[1]: var-lib-kubelet-pods-e642d357\x2d631b\x2d4841\x2d8832\x2df5200203d19c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 16 03:30:03.143013 systemd[1]: var-lib-kubelet-pods-e642d357\x2d631b\x2d4841\x2d8832\x2df5200203d19c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9swkk.mount: Deactivated successfully. 
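The run-netns and volume mount unit names above only look garbled: systemd escapes characters that are not valid in unit names, so every \x2d is an escaped "-" from the original path. A small sketch that reverses the \xNN form, assuming that is the only escaping present (systemd itself ships systemd-escape for the same job):

// Decode systemd-style \xNN escapes in unit names such as
// "run-netns-cni\x2d12047600\x2d...". Illustrative only.
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

var hexEscape = regexp.MustCompile(`\\x([0-9a-fA-F]{2})`)

func unescapeUnit(name string) string {
	return hexEscape.ReplaceAllStringFunc(name, func(m string) string {
		n, _ := strconv.ParseUint(m[2:], 16, 8)
		return string(rune(n))
	})
}

func main() {
	fmt.Println(unescapeUnit(`run-netns-cni\x2d12047600\x2dc973\x2d18f6\x2db63c\x2d7a2f929a9218.mount`))
	// prints: run-netns-cni-12047600-c973-18f6-b63c-7a2f929a9218.mount
}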
May 16 03:30:03.218482 kubelet[2715]: I0516 03:30:03.218394 2715 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e642d357-631b-4841-8832-f5200203d19c-whisker-ca-bundle\") on node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" DevicePath \"\"" May 16 03:30:03.218482 kubelet[2715]: I0516 03:30:03.218432 2715 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e642d357-631b-4841-8832-f5200203d19c-whisker-backend-key-pair\") on node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" DevicePath \"\"" May 16 03:30:03.218482 kubelet[2715]: I0516 03:30:03.218443 2715 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9swkk\" (UniqueName: \"kubernetes.io/projected/e642d357-631b-4841-8832-f5200203d19c-kube-api-access-9swkk\") on node \"ci-4284-0-0-n-34cf5e3c62.novalocal\" DevicePath \"\"" May 16 03:30:03.347707 systemd[1]: Removed slice kubepods-besteffort-pode642d357_631b_4841_8832_f5200203d19c.slice - libcontainer container kubepods-besteffort-pode642d357_631b_4841_8832_f5200203d19c.slice. May 16 03:30:03.377257 kubelet[2715]: I0516 03:30:03.376258 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-rjbd9" podStartSLOduration=1.493723234 podStartE2EDuration="27.376220935s" podCreationTimestamp="2025-05-16 03:29:36 +0000 UTC" firstStartedPulling="2025-05-16 03:29:36.571565196 +0000 UTC m=+22.704838943" lastFinishedPulling="2025-05-16 03:30:02.454062896 +0000 UTC m=+48.587336644" observedRunningTime="2025-05-16 03:30:03.373559921 +0000 UTC m=+49.506833678" watchObservedRunningTime="2025-05-16 03:30:03.376220935 +0000 UTC m=+49.509494682" May 16 03:30:03.478076 systemd[1]: Created slice kubepods-besteffort-pod32870edc_65ff_47a6_9110_9ba1fe628ed6.slice - libcontainer container kubepods-besteffort-pod32870edc_65ff_47a6_9110_9ba1fe628ed6.slice. 
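From the timestamps in the pod_startup_latency_tracker entry above, podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the time spent pulling images (lastFinishedPulling minus firstStartedPulling). That reading is inferred from the logged numbers, not from kubelet source; a quick check of the arithmetic with the values in the entry:

// Recompute the two durations from the timestamps logged above.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-05-16 03:29:36 +0000 UTC")
	firstPull := mustParse("2025-05-16 03:29:36.571565196 +0000 UTC")
	lastPull := mustParse("2025-05-16 03:30:02.454062896 +0000 UTC")
	running := mustParse("2025-05-16 03:30:03.376220935 +0000 UTC")

	e2e := running.Sub(created)          // 27.376220935s, matching podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 1.493723235s; the log shows 1.493723234s, a 1 ns rounding difference
	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo)
}

In other words, the calico-node pod spent roughly 25.9 of its 27.4 seconds of startup just pulling images.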
May 16 03:30:03.621493 kubelet[2715]: I0516 03:30:03.621411 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/32870edc-65ff-47a6-9110-9ba1fe628ed6-whisker-backend-key-pair\") pod \"whisker-79cdbd85dd-m22sm\" (UID: \"32870edc-65ff-47a6-9110-9ba1fe628ed6\") " pod="calico-system/whisker-79cdbd85dd-m22sm" May 16 03:30:03.621493 kubelet[2715]: I0516 03:30:03.621473 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32870edc-65ff-47a6-9110-9ba1fe628ed6-whisker-ca-bundle\") pod \"whisker-79cdbd85dd-m22sm\" (UID: \"32870edc-65ff-47a6-9110-9ba1fe628ed6\") " pod="calico-system/whisker-79cdbd85dd-m22sm" May 16 03:30:03.621493 kubelet[2715]: I0516 03:30:03.621501 2715 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfk9k\" (UniqueName: \"kubernetes.io/projected/32870edc-65ff-47a6-9110-9ba1fe628ed6-kube-api-access-gfk9k\") pod \"whisker-79cdbd85dd-m22sm\" (UID: \"32870edc-65ff-47a6-9110-9ba1fe628ed6\") " pod="calico-system/whisker-79cdbd85dd-m22sm" May 16 03:30:03.782537 containerd[1476]: time="2025-05-16T03:30:03.782312524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79cdbd85dd-m22sm,Uid:32870edc-65ff-47a6-9110-9ba1fe628ed6,Namespace:calico-system,Attempt:0,}" May 16 03:30:04.028429 kubelet[2715]: I0516 03:30:04.027520 2715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e642d357-631b-4841-8832-f5200203d19c" path="/var/lib/kubelet/pods/e642d357-631b-4841-8832-f5200203d19c/volumes" May 16 03:30:04.081233 systemd-networkd[1395]: cali5c843a0002d: Link UP May 16 03:30:04.081496 systemd-networkd[1395]: cali5c843a0002d: Gained carrier May 16 03:30:04.111999 containerd[1476]: 2025-05-16 03:30:03.838 [INFO][3905] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 16 03:30:04.111999 containerd[1476]: 2025-05-16 03:30:03.944 [INFO][3905] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0 whisker-79cdbd85dd- calico-system 32870edc-65ff-47a6-9110-9ba1fe628ed6 925 0 2025-05-16 03:30:03 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:79cdbd85dd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4284-0-0-n-34cf5e3c62.novalocal whisker-79cdbd85dd-m22sm eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali5c843a0002d [] [] }} ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Namespace="calico-system" Pod="whisker-79cdbd85dd-m22sm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-" May 16 03:30:04.111999 containerd[1476]: 2025-05-16 03:30:03.944 [INFO][3905] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Namespace="calico-system" Pod="whisker-79cdbd85dd-m22sm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" May 16 03:30:04.111999 containerd[1476]: 2025-05-16 03:30:04.001 [INFO][3917] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" HandleID="k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.001 [INFO][3917] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" HandleID="k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002b3a50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-34cf5e3c62.novalocal", "pod":"whisker-79cdbd85dd-m22sm", "timestamp":"2025-05-16 03:30:04.001487727 +0000 UTC"}, Hostname:"ci-4284-0-0-n-34cf5e3c62.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.001 [INFO][3917] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.002 [INFO][3917] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.002 [INFO][3917] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-34cf5e3c62.novalocal' May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.017 [INFO][3917] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.033 [INFO][3917] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.040 [INFO][3917] ipam/ipam.go 511: Trying affinity for 192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.043 [INFO][3917] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.112444 containerd[1476]: 2025-05-16 03:30:04.047 [INFO][3917] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.115571 containerd[1476]: 2025-05-16 03:30:04.047 [INFO][3917] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.0/26 handle="k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.115571 containerd[1476]: 2025-05-16 03:30:04.050 [INFO][3917] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819 May 16 03:30:04.115571 containerd[1476]: 2025-05-16 03:30:04.057 [INFO][3917] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.0/26 handle="k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.115571 containerd[1476]: 2025-05-16 03:30:04.065 [INFO][3917] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.1/26] 
block=192.168.116.0/26 handle="k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.115571 containerd[1476]: 2025-05-16 03:30:04.065 [INFO][3917] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.1/26] handle="k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:04.115571 containerd[1476]: 2025-05-16 03:30:04.065 [INFO][3917] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 03:30:04.115571 containerd[1476]: 2025-05-16 03:30:04.065 [INFO][3917] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.1/26] IPv6=[] ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" HandleID="k8s-pod-network.f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" May 16 03:30:04.115795 containerd[1476]: 2025-05-16 03:30:04.068 [INFO][3905] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Namespace="calico-system" Pod="whisker-79cdbd85dd-m22sm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0", GenerateName:"whisker-79cdbd85dd-", Namespace:"calico-system", SelfLink:"", UID:"32870edc-65ff-47a6-9110-9ba1fe628ed6", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 30, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79cdbd85dd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"", Pod:"whisker-79cdbd85dd-m22sm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.116.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5c843a0002d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:04.115895 containerd[1476]: 2025-05-16 03:30:04.068 [INFO][3905] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.1/32] ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Namespace="calico-system" Pod="whisker-79cdbd85dd-m22sm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" May 16 03:30:04.115895 containerd[1476]: 2025-05-16 03:30:04.068 [INFO][3905] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5c843a0002d ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Namespace="calico-system" Pod="whisker-79cdbd85dd-m22sm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" May 
16 03:30:04.115895 containerd[1476]: 2025-05-16 03:30:04.082 [INFO][3905] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Namespace="calico-system" Pod="whisker-79cdbd85dd-m22sm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" May 16 03:30:04.116045 containerd[1476]: 2025-05-16 03:30:04.082 [INFO][3905] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Namespace="calico-system" Pod="whisker-79cdbd85dd-m22sm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0", GenerateName:"whisker-79cdbd85dd-", Namespace:"calico-system", SelfLink:"", UID:"32870edc-65ff-47a6-9110-9ba1fe628ed6", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 30, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79cdbd85dd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819", Pod:"whisker-79cdbd85dd-m22sm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.116.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5c843a0002d", MAC:"a6:46:26:56:b1:6b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:04.117464 containerd[1476]: 2025-05-16 03:30:04.104 [INFO][3905] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" Namespace="calico-system" Pod="whisker-79cdbd85dd-m22sm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-whisker--79cdbd85dd--m22sm-eth0" May 16 03:30:04.213511 containerd[1476]: time="2025-05-16T03:30:04.212891472Z" level=info msg="connecting to shim f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819" address="unix:///run/containerd/s/8681d75e43f5f115167543e63822ad29dda6b27c7c4831cf3bae6efa97436fba" namespace=k8s.io protocol=ttrpc version=3 May 16 03:30:04.255173 systemd[1]: Started cri-containerd-f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819.scope - libcontainer container f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819. 
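The IPAM trace above shows the node holding an affinity for the block 192.168.116.0/26 and handing its first address, 192.168.116.1, to the whisker pod; the csi-node-driver pod further down receives 192.168.116.2 from the same block. A /26 gives the node 64 pod addresses. A short sketch, standard library only, confirming the logged assignments fall inside that block:

// Check the assignments from the IPAM log lines against the node's affine block.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.116.0/26") // node's affine block from the log
	for _, a := range []string{"192.168.116.1", "192.168.116.2"} {
		addr := netip.MustParseAddr(a)
		fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr))
	}
	fmt.Println("addresses per /26 block:", 1<<(32-block.Bits())) // 64
}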
May 16 03:30:04.339424 containerd[1476]: time="2025-05-16T03:30:04.339126095Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79cdbd85dd-m22sm,Uid:32870edc-65ff-47a6-9110-9ba1fe628ed6,Namespace:calico-system,Attempt:0,} returns sandbox id \"f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819\"" May 16 03:30:04.344053 containerd[1476]: time="2025-05-16T03:30:04.342939641Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 03:30:04.891659 containerd[1476]: time="2025-05-16T03:30:04.891003969Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:30:05.020742 containerd[1476]: time="2025-05-16T03:30:05.020506903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5gjgv,Uid:ec9adb4c-4eda-47e3-8ebd-e314c0ec0140,Namespace:calico-system,Attempt:0,}" May 16 03:30:05.138304 containerd[1476]: time="2025-05-16T03:30:05.138138836Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:30:05.138484 containerd[1476]: time="2025-05-16T03:30:05.138282175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 03:30:05.138585 kubelet[2715]: E0516 03:30:05.138533 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:30:05.139145 kubelet[2715]: E0516 03:30:05.138606 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:30:05.139192 kubelet[2715]: E0516 03:30:05.138854 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b39ead48ab84d5997218c5ff179c936,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:30:05.141416 containerd[1476]: time="2025-05-16T03:30:05.141370309Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 03:30:05.388688 kernel: bpftool[4116]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 16 03:30:05.434263 systemd-networkd[1395]: cali75ed29da897: Link UP May 16 03:30:05.440112 systemd-networkd[1395]: cali75ed29da897: Gained carrier May 16 03:30:05.459574 containerd[1476]: 2025-05-16 03:30:05.272 [INFO][4089] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 16 03:30:05.459574 containerd[1476]: 2025-05-16 03:30:05.304 [INFO][4089] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0 csi-node-driver- calico-system ec9adb4c-4eda-47e3-8ebd-e314c0ec0140 723 0 2025-05-16 03:29:36 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-n-34cf5e3c62.novalocal csi-node-driver-5gjgv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali75ed29da897 [] [] }} ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Namespace="calico-system" Pod="csi-node-driver-5gjgv" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-" May 16 
03:30:05.459574 containerd[1476]: 2025-05-16 03:30:05.304 [INFO][4089] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Namespace="calico-system" Pod="csi-node-driver-5gjgv" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" May 16 03:30:05.459574 containerd[1476]: 2025-05-16 03:30:05.377 [INFO][4103] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" HandleID="k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.377 [INFO][4103] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" HandleID="k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f870), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-34cf5e3c62.novalocal", "pod":"csi-node-driver-5gjgv", "timestamp":"2025-05-16 03:30:05.377643758 +0000 UTC"}, Hostname:"ci-4284-0-0-n-34cf5e3c62.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.378 [INFO][4103] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.378 [INFO][4103] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.378 [INFO][4103] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-34cf5e3c62.novalocal' May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.391 [INFO][4103] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.398 [INFO][4103] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.403 [INFO][4103] ipam/ipam.go 511: Trying affinity for 192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.406 [INFO][4103] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.459909 containerd[1476]: 2025-05-16 03:30:05.409 [INFO][4103] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.460350 containerd[1476]: 2025-05-16 03:30:05.411 [INFO][4103] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.0/26 handle="k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.460350 containerd[1476]: 2025-05-16 03:30:05.413 [INFO][4103] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7 May 16 03:30:05.460350 containerd[1476]: 2025-05-16 03:30:05.419 [INFO][4103] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.0/26 handle="k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.460350 containerd[1476]: 2025-05-16 03:30:05.428 [INFO][4103] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.2/26] block=192.168.116.0/26 handle="k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.460350 containerd[1476]: 2025-05-16 03:30:05.428 [INFO][4103] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.2/26] handle="k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:05.460350 containerd[1476]: 2025-05-16 03:30:05.428 [INFO][4103] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 16 03:30:05.460350 containerd[1476]: 2025-05-16 03:30:05.428 [INFO][4103] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.2/26] IPv6=[] ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" HandleID="k8s-pod-network.ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" May 16 03:30:05.460584 containerd[1476]: 2025-05-16 03:30:05.430 [INFO][4089] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Namespace="calico-system" Pod="csi-node-driver-5gjgv" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ec9adb4c-4eda-47e3-8ebd-e314c0ec0140", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"", Pod:"csi-node-driver-5gjgv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.116.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali75ed29da897", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:05.461437 containerd[1476]: 2025-05-16 03:30:05.430 [INFO][4089] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.2/32] ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Namespace="calico-system" Pod="csi-node-driver-5gjgv" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" May 16 03:30:05.461437 containerd[1476]: 2025-05-16 03:30:05.430 [INFO][4089] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali75ed29da897 ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Namespace="calico-system" Pod="csi-node-driver-5gjgv" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" May 16 03:30:05.461437 containerd[1476]: 2025-05-16 03:30:05.436 [INFO][4089] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Namespace="calico-system" Pod="csi-node-driver-5gjgv" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" May 16 03:30:05.461531 containerd[1476]: 2025-05-16 03:30:05.437 [INFO][4089] cni-plugin/k8s.go 446: Added Mac, interface 
name, and active container ID to endpoint ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Namespace="calico-system" Pod="csi-node-driver-5gjgv" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ec9adb4c-4eda-47e3-8ebd-e314c0ec0140", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7", Pod:"csi-node-driver-5gjgv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.116.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali75ed29da897", MAC:"26:50:2b:8e:88:9d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:05.461607 containerd[1476]: 2025-05-16 03:30:05.454 [INFO][4089] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" Namespace="calico-system" Pod="csi-node-driver-5gjgv" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-csi--node--driver--5gjgv-eth0" May 16 03:30:05.480222 systemd-networkd[1395]: cali5c843a0002d: Gained IPv6LL May 16 03:30:05.509013 containerd[1476]: time="2025-05-16T03:30:05.508943347Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:30:05.511128 containerd[1476]: time="2025-05-16T03:30:05.510691527Z" level=info msg="connecting to shim ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7" address="unix:///run/containerd/s/e15187788fc978fc9172499e65b584a91abc96c3e79f7428ca324e5a782606bc" namespace=k8s.io protocol=ttrpc version=3 May 16 03:30:05.511790 containerd[1476]: time="2025-05-16T03:30:05.511745094Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 
Forbidden" May 16 03:30:05.512067 containerd[1476]: time="2025-05-16T03:30:05.511900095Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 03:30:05.512568 kubelet[2715]: E0516 03:30:05.512482 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:30:05.512688 kubelet[2715]: E0516 03:30:05.512566 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:30:05.512805 kubelet[2715]: E0516 03:30:05.512719 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:30:05.514595 kubelet[2715]: E0516 03:30:05.514533 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:30:05.552154 systemd[1]: Started cri-containerd-ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7.scope - libcontainer container ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7. May 16 03:30:05.601653 containerd[1476]: time="2025-05-16T03:30:05.601521065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5gjgv,Uid:ec9adb4c-4eda-47e3-8ebd-e314c0ec0140,Namespace:calico-system,Attempt:0,} returns sandbox id \"ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7\"" May 16 03:30:05.605672 containerd[1476]: time="2025-05-16T03:30:05.605620297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 16 03:30:05.823967 systemd-networkd[1395]: vxlan.calico: Link UP May 16 03:30:05.826081 systemd-networkd[1395]: vxlan.calico: Gained carrier May 16 03:30:06.363125 kubelet[2715]: E0516 03:30:06.361845 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:30:07.021963 containerd[1476]: time="2025-05-16T03:30:07.021794603Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-54gvm,Uid:fb595fc0-7f7d-427f-87b1-bb4b1008771a,Namespace:kube-system,Attempt:0,}" May 16 03:30:07.023237 containerd[1476]: time="2025-05-16T03:30:07.022521267Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c54ff885-qgwjf,Uid:c58e974e-e8d9-443d-a0c3-984047d750c6,Namespace:calico-system,Attempt:0,}" May 16 03:30:07.143169 systemd-networkd[1395]: cali75ed29da897: Gained IPv6LL May 16 03:30:07.255205 systemd-networkd[1395]: cali83984adc8a4: Link UP May 16 03:30:07.255485 systemd-networkd[1395]: cali83984adc8a4: Gained carrier May 16 03:30:07.278245 containerd[1476]: 2025-05-16 03:30:07.133 [INFO][4245] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0 coredns-668d6bf9bc- kube-system fb595fc0-7f7d-427f-87b1-bb4b1008771a 841 0 2025-05-16 03:29:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-34cf5e3c62.novalocal coredns-668d6bf9bc-54gvm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali83984adc8a4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Namespace="kube-system" Pod="coredns-668d6bf9bc-54gvm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-" May 16 03:30:07.278245 containerd[1476]: 2025-05-16 03:30:07.133 [INFO][4245] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Namespace="kube-system" Pod="coredns-668d6bf9bc-54gvm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" May 16 03:30:07.278245 containerd[1476]: 2025-05-16 03:30:07.183 [INFO][4268] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" HandleID="k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.183 [INFO][4268] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" HandleID="k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d3240), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-34cf5e3c62.novalocal", "pod":"coredns-668d6bf9bc-54gvm", "timestamp":"2025-05-16 03:30:07.1833513 +0000 UTC"}, Hostname:"ci-4284-0-0-n-34cf5e3c62.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.183 [INFO][4268] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.183 [INFO][4268] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.183 [INFO][4268] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-34cf5e3c62.novalocal' May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.197 [INFO][4268] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.204 [INFO][4268] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.210 [INFO][4268] ipam/ipam.go 511: Trying affinity for 192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.213 [INFO][4268] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279278 containerd[1476]: 2025-05-16 03:30:07.218 [INFO][4268] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279608 containerd[1476]: 2025-05-16 03:30:07.218 [INFO][4268] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.0/26 handle="k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279608 containerd[1476]: 2025-05-16 03:30:07.222 [INFO][4268] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e May 16 03:30:07.279608 containerd[1476]: 2025-05-16 03:30:07.229 [INFO][4268] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.0/26 handle="k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279608 containerd[1476]: 2025-05-16 03:30:07.237 [INFO][4268] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.3/26] block=192.168.116.0/26 handle="k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279608 containerd[1476]: 2025-05-16 03:30:07.238 [INFO][4268] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.3/26] handle="k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.279608 containerd[1476]: 2025-05-16 03:30:07.238 [INFO][4268] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 16 03:30:07.279608 containerd[1476]: 2025-05-16 03:30:07.238 [INFO][4268] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.3/26] IPv6=[] ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" HandleID="k8s-pod-network.89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" May 16 03:30:07.279973 containerd[1476]: 2025-05-16 03:30:07.240 [INFO][4245] cni-plugin/k8s.go 418: Populated endpoint ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Namespace="kube-system" Pod="coredns-668d6bf9bc-54gvm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"fb595fc0-7f7d-427f-87b1-bb4b1008771a", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"", Pod:"coredns-668d6bf9bc-54gvm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali83984adc8a4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:07.279973 containerd[1476]: 2025-05-16 03:30:07.242 [INFO][4245] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.3/32] ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Namespace="kube-system" Pod="coredns-668d6bf9bc-54gvm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" May 16 03:30:07.279973 containerd[1476]: 2025-05-16 03:30:07.243 [INFO][4245] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali83984adc8a4 ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Namespace="kube-system" Pod="coredns-668d6bf9bc-54gvm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" May 16 03:30:07.279973 containerd[1476]: 2025-05-16 03:30:07.256 [INFO][4245] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Namespace="kube-system" Pod="coredns-668d6bf9bc-54gvm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" May 16 03:30:07.279973 containerd[1476]: 2025-05-16 03:30:07.257 [INFO][4245] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Namespace="kube-system" Pod="coredns-668d6bf9bc-54gvm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"fb595fc0-7f7d-427f-87b1-bb4b1008771a", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e", Pod:"coredns-668d6bf9bc-54gvm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali83984adc8a4", MAC:"1a:07:ac:68:4b:63", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:07.279973 containerd[1476]: 2025-05-16 03:30:07.275 [INFO][4245] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" Namespace="kube-system" Pod="coredns-668d6bf9bc-54gvm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--54gvm-eth0" May 16 03:30:07.327442 containerd[1476]: time="2025-05-16T03:30:07.327375419Z" level=info msg="connecting to shim 89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e" address="unix:///run/containerd/s/f34510f9138b4f752147449ab9b7abcf1a223bb721f83c0ef78325d39f934d1b" namespace=k8s.io protocol=ttrpc version=3 May 16 03:30:07.365160 systemd[1]: Started cri-containerd-89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e.scope - libcontainer container 89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e. 
May 16 03:30:07.377163 systemd-networkd[1395]: califc35af0786d: Link UP May 16 03:30:07.377442 systemd-networkd[1395]: califc35af0786d: Gained carrier May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.148 [INFO][4251] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0 calico-kube-controllers-58c54ff885- calico-system c58e974e-e8d9-443d-a0c3-984047d750c6 849 0 2025-05-16 03:29:36 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:58c54ff885 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-34cf5e3c62.novalocal calico-kube-controllers-58c54ff885-qgwjf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] califc35af0786d [] [] }} ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Namespace="calico-system" Pod="calico-kube-controllers-58c54ff885-qgwjf" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.148 [INFO][4251] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Namespace="calico-system" Pod="calico-kube-controllers-58c54ff885-qgwjf" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.199 [INFO][4273] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" HandleID="k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.199 [INFO][4273] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" HandleID="k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000233020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-34cf5e3c62.novalocal", "pod":"calico-kube-controllers-58c54ff885-qgwjf", "timestamp":"2025-05-16 03:30:07.199499699 +0000 UTC"}, Hostname:"ci-4284-0-0-n-34cf5e3c62.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.199 [INFO][4273] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.238 [INFO][4273] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
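Note the interleaving with the previous allocation: handler [4273] logged "About to acquire host-wide IPAM lock" at 03:30:07.199 but only acquired it at 03:30:07.238, the instant handler [4268] released it. The host-wide lock serializes concurrent CNI ADDs so two pods are never handed the same address. A toy Go illustration of that serialization (assumed names and addresses, not Calico code):

```go
// Toy illustration of the "About to acquire / Acquired / Released host-wide
// IPAM lock" pattern above: two concurrent CNI ADDs serialize on one lock so
// each gets a distinct address. Output order is nondeterministic.
package main

import (
	"fmt"
	"sync"
)

func main() {
	var (
		mu   sync.Mutex
		next = 3 // pretend .3 is the next free host index in the /26
		wg   sync.WaitGroup
	)
	assign := func(pod string) {
		defer wg.Done()
		mu.Lock() // "Acquired host-wide IPAM lock."
		ip := fmt.Sprintf("192.168.116.%d/26", next)
		next++
		mu.Unlock() // "Released host-wide IPAM lock."
		fmt.Println(pod, "->", ip)
	}
	wg.Add(2)
	go assign("coredns-668d6bf9bc-54gvm")
	go assign("calico-kube-controllers-58c54ff885-qgwjf")
	wg.Wait()
}
```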
May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.238 [INFO][4273] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-34cf5e3c62.novalocal' May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.298 [INFO][4273] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.309 [INFO][4273] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.318 [INFO][4273] ipam/ipam.go 511: Trying affinity for 192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.322 [INFO][4273] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.325 [INFO][4273] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.325 [INFO][4273] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.0/26 handle="k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.331 [INFO][4273] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92 May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.343 [INFO][4273] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.0/26 handle="k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.367 [INFO][4273] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.4/26] block=192.168.116.0/26 handle="k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.367 [INFO][4273] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.4/26] handle="k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.367 [INFO][4273] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 16 03:30:07.405525 containerd[1476]: 2025-05-16 03:30:07.367 [INFO][4273] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.4/26] IPv6=[] ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" HandleID="k8s-pod-network.4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" May 16 03:30:07.406774 containerd[1476]: 2025-05-16 03:30:07.370 [INFO][4251] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Namespace="calico-system" Pod="calico-kube-controllers-58c54ff885-qgwjf" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0", GenerateName:"calico-kube-controllers-58c54ff885-", Namespace:"calico-system", SelfLink:"", UID:"c58e974e-e8d9-443d-a0c3-984047d750c6", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58c54ff885", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"", Pod:"calico-kube-controllers-58c54ff885-qgwjf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.116.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califc35af0786d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:07.406774 containerd[1476]: 2025-05-16 03:30:07.370 [INFO][4251] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.4/32] ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Namespace="calico-system" Pod="calico-kube-controllers-58c54ff885-qgwjf" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" May 16 03:30:07.406774 containerd[1476]: 2025-05-16 03:30:07.370 [INFO][4251] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc35af0786d ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Namespace="calico-system" Pod="calico-kube-controllers-58c54ff885-qgwjf" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" May 16 03:30:07.406774 containerd[1476]: 2025-05-16 03:30:07.376 [INFO][4251] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Namespace="calico-system" Pod="calico-kube-controllers-58c54ff885-qgwjf" 
WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" May 16 03:30:07.406774 containerd[1476]: 2025-05-16 03:30:07.376 [INFO][4251] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Namespace="calico-system" Pod="calico-kube-controllers-58c54ff885-qgwjf" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0", GenerateName:"calico-kube-controllers-58c54ff885-", Namespace:"calico-system", SelfLink:"", UID:"c58e974e-e8d9-443d-a0c3-984047d750c6", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58c54ff885", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92", Pod:"calico-kube-controllers-58c54ff885-qgwjf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.116.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califc35af0786d", MAC:"de:2d:da:bc:ec:d6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:07.406774 containerd[1476]: 2025-05-16 03:30:07.398 [INFO][4251] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" Namespace="calico-system" Pod="calico-kube-controllers-58c54ff885-qgwjf" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--kube--controllers--58c54ff885--qgwjf-eth0" May 16 03:30:07.457954 containerd[1476]: time="2025-05-16T03:30:07.457770184Z" level=info msg="connecting to shim 4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92" address="unix:///run/containerd/s/5451c429303577c633e62f0db18456c83efc1bfe11acb25cef4b99cab3754ff5" namespace=k8s.io protocol=ttrpc version=3 May 16 03:30:07.494492 containerd[1476]: time="2025-05-16T03:30:07.493964126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-54gvm,Uid:fb595fc0-7f7d-427f-87b1-bb4b1008771a,Namespace:kube-system,Attempt:0,} returns sandbox id \"89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e\"" May 16 03:30:07.504200 containerd[1476]: time="2025-05-16T03:30:07.504145461Z" level=info msg="CreateContainer within sandbox \"89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 16 03:30:07.526618 containerd[1476]: 
time="2025-05-16T03:30:07.526566310Z" level=info msg="Container f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3: CDI devices from CRI Config.CDIDevices: []" May 16 03:30:07.527242 systemd[1]: Started cri-containerd-4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92.scope - libcontainer container 4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92. May 16 03:30:07.541910 containerd[1476]: time="2025-05-16T03:30:07.541781970Z" level=info msg="CreateContainer within sandbox \"89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3\"" May 16 03:30:07.542864 containerd[1476]: time="2025-05-16T03:30:07.542832571Z" level=info msg="StartContainer for \"f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3\"" May 16 03:30:07.543825 containerd[1476]: time="2025-05-16T03:30:07.543794375Z" level=info msg="connecting to shim f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3" address="unix:///run/containerd/s/f34510f9138b4f752147449ab9b7abcf1a223bb721f83c0ef78325d39f934d1b" protocol=ttrpc version=3 May 16 03:30:07.571198 systemd[1]: Started cri-containerd-f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3.scope - libcontainer container f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3. May 16 03:30:07.621129 containerd[1476]: time="2025-05-16T03:30:07.621062497Z" level=info msg="StartContainer for \"f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3\" returns successfully" May 16 03:30:07.684014 containerd[1476]: time="2025-05-16T03:30:07.682658683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c54ff885-qgwjf,Uid:c58e974e-e8d9-443d-a0c3-984047d750c6,Namespace:calico-system,Attempt:0,} returns sandbox id \"4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92\"" May 16 03:30:07.783959 systemd-networkd[1395]: vxlan.calico: Gained IPv6LL May 16 03:30:08.139079 containerd[1476]: time="2025-05-16T03:30:08.139010830Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:08.140123 containerd[1476]: time="2025-05-16T03:30:08.140030955Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390" May 16 03:30:08.141793 containerd[1476]: time="2025-05-16T03:30:08.141706658Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:08.145061 containerd[1476]: time="2025-05-16T03:30:08.144905551Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:08.146764 containerd[1476]: time="2025-05-16T03:30:08.146657067Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 2.540989752s" May 16 03:30:08.146764 containerd[1476]: time="2025-05-16T03:30:08.146691702Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\"" May 16 03:30:08.150451 containerd[1476]: time="2025-05-16T03:30:08.148474737Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 16 03:30:08.153163 containerd[1476]: time="2025-05-16T03:30:08.153066653Z" level=info msg="CreateContainer within sandbox \"ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 16 03:30:08.167553 containerd[1476]: time="2025-05-16T03:30:08.167295430Z" level=info msg="Container d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c: CDI devices from CRI Config.CDIDevices: []" May 16 03:30:08.188802 containerd[1476]: time="2025-05-16T03:30:08.188763171Z" level=info msg="CreateContainer within sandbox \"ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c\"" May 16 03:30:08.190713 containerd[1476]: time="2025-05-16T03:30:08.190382880Z" level=info msg="StartContainer for \"d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c\"" May 16 03:30:08.193132 containerd[1476]: time="2025-05-16T03:30:08.193103805Z" level=info msg="connecting to shim d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c" address="unix:///run/containerd/s/e15187788fc978fc9172499e65b584a91abc96c3e79f7428ca324e5a782606bc" protocol=ttrpc version=3 May 16 03:30:08.223164 systemd[1]: Started cri-containerd-d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c.scope - libcontainer container d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c. 
May 16 03:30:08.345055 containerd[1476]: time="2025-05-16T03:30:08.344812926Z" level=info msg="StartContainer for \"d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c\" returns successfully" May 16 03:30:08.400921 kubelet[2715]: I0516 03:30:08.399179 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-54gvm" podStartSLOduration=49.399155288 podStartE2EDuration="49.399155288s" podCreationTimestamp="2025-05-16 03:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 03:30:08.397021744 +0000 UTC m=+54.530295511" watchObservedRunningTime="2025-05-16 03:30:08.399155288 +0000 UTC m=+54.532429035" May 16 03:30:08.616325 systemd-networkd[1395]: califc35af0786d: Gained IPv6LL May 16 03:30:08.871369 systemd-networkd[1395]: cali83984adc8a4: Gained IPv6LL May 16 03:30:13.020897 containerd[1476]: time="2025-05-16T03:30:13.020805441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-mm7r4,Uid:60ec12df-16f1-4a36-aa58-7ef6c1d545b7,Namespace:calico-apiserver,Attempt:0,}" May 16 03:30:13.218522 systemd-networkd[1395]: calif884da9678c: Link UP May 16 03:30:13.220390 systemd-networkd[1395]: calif884da9678c: Gained carrier May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.090 [INFO][4472] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0 calico-apiserver-594f46878- calico-apiserver 60ec12df-16f1-4a36-aa58-7ef6c1d545b7 848 0 2025-05-16 03:29:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:594f46878 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-34cf5e3c62.novalocal calico-apiserver-594f46878-mm7r4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif884da9678c [] [] }} ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-mm7r4" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.091 [INFO][4472] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-mm7r4" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.150 [INFO][4485] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" HandleID="k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.150 [INFO][4485] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" HandleID="k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" 
Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fb00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-34cf5e3c62.novalocal", "pod":"calico-apiserver-594f46878-mm7r4", "timestamp":"2025-05-16 03:30:13.15061915 +0000 UTC"}, Hostname:"ci-4284-0-0-n-34cf5e3c62.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.151 [INFO][4485] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.151 [INFO][4485] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.151 [INFO][4485] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-34cf5e3c62.novalocal' May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.162 [INFO][4485] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.170 [INFO][4485] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.178 [INFO][4485] ipam/ipam.go 511: Trying affinity for 192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.182 [INFO][4485] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.186 [INFO][4485] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.186 [INFO][4485] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.0/26 handle="k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.189 [INFO][4485] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.195 [INFO][4485] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.0/26 handle="k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.207 [INFO][4485] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.5/26] block=192.168.116.0/26 handle="k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.207 [INFO][4485] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.5/26] handle="k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.209 [INFO][4485] ipam/ipam_plugin.go 374: 
Released host-wide IPAM lock. May 16 03:30:13.257115 containerd[1476]: 2025-05-16 03:30:13.209 [INFO][4485] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.5/26] IPv6=[] ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" HandleID="k8s-pod-network.679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" May 16 03:30:13.259348 containerd[1476]: 2025-05-16 03:30:13.210 [INFO][4472] cni-plugin/k8s.go 418: Populated endpoint ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-mm7r4" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0", GenerateName:"calico-apiserver-594f46878-", Namespace:"calico-apiserver", SelfLink:"", UID:"60ec12df-16f1-4a36-aa58-7ef6c1d545b7", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"594f46878", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"", Pod:"calico-apiserver-594f46878-mm7r4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif884da9678c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:13.259348 containerd[1476]: 2025-05-16 03:30:13.211 [INFO][4472] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.5/32] ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-mm7r4" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" May 16 03:30:13.259348 containerd[1476]: 2025-05-16 03:30:13.211 [INFO][4472] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif884da9678c ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-mm7r4" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" May 16 03:30:13.259348 containerd[1476]: 2025-05-16 03:30:13.221 [INFO][4472] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-mm7r4" 
WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" May 16 03:30:13.259348 containerd[1476]: 2025-05-16 03:30:13.222 [INFO][4472] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-mm7r4" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0", GenerateName:"calico-apiserver-594f46878-", Namespace:"calico-apiserver", SelfLink:"", UID:"60ec12df-16f1-4a36-aa58-7ef6c1d545b7", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"594f46878", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa", Pod:"calico-apiserver-594f46878-mm7r4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif884da9678c", MAC:"4a:58:87:64:f0:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:13.259348 containerd[1476]: 2025-05-16 03:30:13.252 [INFO][4472] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-mm7r4" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--mm7r4-eth0" May 16 03:30:13.316205 containerd[1476]: time="2025-05-16T03:30:13.315354566Z" level=info msg="connecting to shim 679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa" address="unix:///run/containerd/s/03f84751ae005c2e721dcbabc304e25cf2539fc14c2b470199aee1b2d97906f1" namespace=k8s.io protocol=ttrpc version=3 May 16 03:30:13.377141 systemd[1]: Started cri-containerd-679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa.scope - libcontainer container 679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa. 
May 16 03:30:13.587376 containerd[1476]: time="2025-05-16T03:30:13.586242241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-mm7r4,Uid:60ec12df-16f1-4a36-aa58-7ef6c1d545b7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa\"" May 16 03:30:14.375379 systemd-networkd[1395]: calif884da9678c: Gained IPv6LL May 16 03:30:14.888762 containerd[1476]: time="2025-05-16T03:30:14.888709876Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:14.890218 containerd[1476]: time="2025-05-16T03:30:14.889901943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512" May 16 03:30:14.891437 containerd[1476]: time="2025-05-16T03:30:14.891399793Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:14.894865 containerd[1476]: time="2025-05-16T03:30:14.894825689Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:14.896065 containerd[1476]: time="2025-05-16T03:30:14.895869989Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 6.747361699s" May 16 03:30:14.896065 containerd[1476]: time="2025-05-16T03:30:14.895903161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\"" May 16 03:30:14.898579 containerd[1476]: time="2025-05-16T03:30:14.897824816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 16 03:30:14.915037 containerd[1476]: time="2025-05-16T03:30:14.914776337Z" level=info msg="CreateContainer within sandbox \"4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 16 03:30:14.929146 containerd[1476]: time="2025-05-16T03:30:14.929104648Z" level=info msg="Container 6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02: CDI devices from CRI Config.CDIDevices: []" May 16 03:30:14.941649 containerd[1476]: time="2025-05-16T03:30:14.941611311Z" level=info msg="CreateContainer within sandbox \"4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\"" May 16 03:30:14.942963 containerd[1476]: time="2025-05-16T03:30:14.942747552Z" level=info msg="StartContainer for \"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\"" May 16 03:30:14.944300 containerd[1476]: time="2025-05-16T03:30:14.944103727Z" level=info msg="connecting to shim 6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02" 
address="unix:///run/containerd/s/5451c429303577c633e62f0db18456c83efc1bfe11acb25cef4b99cab3754ff5" protocol=ttrpc version=3 May 16 03:30:14.971138 systemd[1]: Started cri-containerd-6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02.scope - libcontainer container 6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02. May 16 03:30:15.033593 containerd[1476]: time="2025-05-16T03:30:15.033450434Z" level=info msg="StartContainer for \"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" returns successfully" May 16 03:30:15.473306 kubelet[2715]: I0516 03:30:15.471917 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-58c54ff885-qgwjf" podStartSLOduration=32.260767163 podStartE2EDuration="39.471812363s" podCreationTimestamp="2025-05-16 03:29:36 +0000 UTC" firstStartedPulling="2025-05-16 03:30:07.686445268 +0000 UTC m=+53.819719016" lastFinishedPulling="2025-05-16 03:30:14.897490459 +0000 UTC m=+61.030764216" observedRunningTime="2025-05-16 03:30:15.465389985 +0000 UTC m=+61.598663792" watchObservedRunningTime="2025-05-16 03:30:15.471812363 +0000 UTC m=+61.605086160" May 16 03:30:15.517505 containerd[1476]: time="2025-05-16T03:30:15.517461942Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"17bb7a4d8df2df909991c04cf189e949d36bacd705274e44737a56165a647008\" pid:4608 exited_at:{seconds:1747366215 nanos:516106449}" May 16 03:30:17.022638 containerd[1476]: time="2025-05-16T03:30:17.022552306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-gp6pw,Uid:73457dc8-5848-40bf-ace2-479bc6054dd2,Namespace:calico-apiserver,Attempt:0,}" May 16 03:30:17.047001 containerd[1476]: time="2025-05-16T03:30:17.045615492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mskp9,Uid:44c65729-467c-46d8-ae6e-1add08755687,Namespace:kube-system,Attempt:0,}" May 16 03:30:17.298605 systemd-networkd[1395]: cali8321628bb50: Link UP May 16 03:30:17.301475 systemd-networkd[1395]: cali8321628bb50: Gained carrier May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.144 [INFO][4623] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0 coredns-668d6bf9bc- kube-system 44c65729-467c-46d8-ae6e-1add08755687 837 0 2025-05-16 03:29:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-34cf5e3c62.novalocal coredns-668d6bf9bc-mskp9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8321628bb50 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Namespace="kube-system" Pod="coredns-668d6bf9bc-mskp9" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.144 [INFO][4623] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Namespace="kube-system" Pod="coredns-668d6bf9bc-mskp9" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" May 16 03:30:17.333893 containerd[1476]: 
2025-05-16 03:30:17.201 [INFO][4649] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" HandleID="k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.201 [INFO][4649] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" HandleID="k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9150), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-34cf5e3c62.novalocal", "pod":"coredns-668d6bf9bc-mskp9", "timestamp":"2025-05-16 03:30:17.201718311 +0000 UTC"}, Hostname:"ci-4284-0-0-n-34cf5e3c62.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.201 [INFO][4649] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.201 [INFO][4649] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.201 [INFO][4649] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-34cf5e3c62.novalocal' May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.228 [INFO][4649] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.238 [INFO][4649] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.245 [INFO][4649] ipam/ipam.go 511: Trying affinity for 192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.248 [INFO][4649] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.253 [INFO][4649] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.253 [INFO][4649] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.0/26 handle="k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.256 [INFO][4649] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.263 [INFO][4649] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.0/26 handle="k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 
03:30:17.280 [INFO][4649] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.6/26] block=192.168.116.0/26 handle="k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.280 [INFO][4649] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.6/26] handle="k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.280 [INFO][4649] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 03:30:17.333893 containerd[1476]: 2025-05-16 03:30:17.280 [INFO][4649] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.6/26] IPv6=[] ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" HandleID="k8s-pod-network.d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" May 16 03:30:17.334620 containerd[1476]: 2025-05-16 03:30:17.285 [INFO][4623] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Namespace="kube-system" Pod="coredns-668d6bf9bc-mskp9" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"44c65729-467c-46d8-ae6e-1add08755687", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"", Pod:"coredns-668d6bf9bc-mskp9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8321628bb50", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:17.334620 containerd[1476]: 2025-05-16 03:30:17.285 [INFO][4623] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.6/32] ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Namespace="kube-system" Pod="coredns-668d6bf9bc-mskp9" 
WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" May 16 03:30:17.334620 containerd[1476]: 2025-05-16 03:30:17.285 [INFO][4623] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8321628bb50 ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Namespace="kube-system" Pod="coredns-668d6bf9bc-mskp9" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" May 16 03:30:17.334620 containerd[1476]: 2025-05-16 03:30:17.300 [INFO][4623] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Namespace="kube-system" Pod="coredns-668d6bf9bc-mskp9" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" May 16 03:30:17.334620 containerd[1476]: 2025-05-16 03:30:17.303 [INFO][4623] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Namespace="kube-system" Pod="coredns-668d6bf9bc-mskp9" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"44c65729-467c-46d8-ae6e-1add08755687", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e", Pod:"coredns-668d6bf9bc-mskp9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8321628bb50", MAC:"ee:f3:93:69:36:1b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:17.334620 containerd[1476]: 2025-05-16 03:30:17.328 [INFO][4623] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" Namespace="kube-system" Pod="coredns-668d6bf9bc-mskp9" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-coredns--668d6bf9bc--mskp9-eth0" May 16 03:30:17.412577 containerd[1476]: 
time="2025-05-16T03:30:17.412102992Z" level=info msg="connecting to shim d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e" address="unix:///run/containerd/s/382e3f7d09e8371a607b418183ef622d6173a2fe0404efbc2f8d77a43ef1242b" namespace=k8s.io protocol=ttrpc version=3 May 16 03:30:17.440042 systemd-networkd[1395]: calie5a3a3f06db: Link UP May 16 03:30:17.441841 systemd-networkd[1395]: calie5a3a3f06db: Gained carrier May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.148 [INFO][4625] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0 calico-apiserver-594f46878- calico-apiserver 73457dc8-5848-40bf-ace2-479bc6054dd2 846 0 2025-05-16 03:29:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:594f46878 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-34cf5e3c62.novalocal calico-apiserver-594f46878-gp6pw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie5a3a3f06db [] [] }} ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-gp6pw" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.148 [INFO][4625] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-gp6pw" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.336 [INFO][4656] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" HandleID="k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.336 [INFO][4656] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" HandleID="k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e6120), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-34cf5e3c62.novalocal", "pod":"calico-apiserver-594f46878-gp6pw", "timestamp":"2025-05-16 03:30:17.336350334 +0000 UTC"}, Hostname:"ci-4284-0-0-n-34cf5e3c62.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.336 [INFO][4656] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.336 [INFO][4656] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.336 [INFO][4656] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-34cf5e3c62.novalocal' May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.350 [INFO][4656] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.359 [INFO][4656] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.367 [INFO][4656] ipam/ipam.go 511: Trying affinity for 192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.374 [INFO][4656] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.388 [INFO][4656] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.389 [INFO][4656] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.0/26 handle="k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.393 [INFO][4656] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91 May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.401 [INFO][4656] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.0/26 handle="k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.425 [INFO][4656] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.7/26] block=192.168.116.0/26 handle="k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.425 [INFO][4656] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.7/26] handle="k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.425 [INFO][4656] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
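With this claim the log has now handed out 192.168.116.3 through 192.168.116.7, in order, from the same affine block. A /26 such as 192.168.116.0/26 spans 64 addresses (192.168.116.0 through 192.168.116.63), which bounds how many workload endpoints the node can serve from this block before it would need another; the short sketch below computes that span:

```go
// Size and span of the affine block 192.168.116.0/26 that every assignment in
// this stretch of the log draws from: 64 addresses, .0 through .63.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	p := netip.MustParsePrefix("192.168.116.0/26")
	n := 1 << (32 - p.Bits()) // 64 addresses in a /26
	last := p.Addr()
	for i := 0; i < n-1; i++ {
		last = last.Next()
	}
	fmt.Printf("%v spans %d addresses: %v .. %v\n", p, n, p.Addr(), last)
}
```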
May 16 03:30:17.475165 containerd[1476]: 2025-05-16 03:30:17.425 [INFO][4656] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.7/26] IPv6=[] ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" HandleID="k8s-pod-network.8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" May 16 03:30:17.475938 containerd[1476]: 2025-05-16 03:30:17.433 [INFO][4625] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-gp6pw" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0", GenerateName:"calico-apiserver-594f46878-", Namespace:"calico-apiserver", SelfLink:"", UID:"73457dc8-5848-40bf-ace2-479bc6054dd2", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"594f46878", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"", Pod:"calico-apiserver-594f46878-gp6pw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie5a3a3f06db", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:17.475938 containerd[1476]: 2025-05-16 03:30:17.433 [INFO][4625] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.7/32] ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-gp6pw" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" May 16 03:30:17.475938 containerd[1476]: 2025-05-16 03:30:17.433 [INFO][4625] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie5a3a3f06db ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-gp6pw" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" May 16 03:30:17.475938 containerd[1476]: 2025-05-16 03:30:17.443 [INFO][4625] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-gp6pw" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" May 16 
03:30:17.475938 containerd[1476]: 2025-05-16 03:30:17.449 [INFO][4625] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-gp6pw" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0", GenerateName:"calico-apiserver-594f46878-", Namespace:"calico-apiserver", SelfLink:"", UID:"73457dc8-5848-40bf-ace2-479bc6054dd2", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"594f46878", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91", Pod:"calico-apiserver-594f46878-gp6pw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie5a3a3f06db", MAC:"f6:f1:36:67:0f:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:17.475938 containerd[1476]: 2025-05-16 03:30:17.469 [INFO][4625] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" Namespace="calico-apiserver" Pod="calico-apiserver-594f46878-gp6pw" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-calico--apiserver--594f46878--gp6pw-eth0" May 16 03:30:17.513083 systemd[1]: Started cri-containerd-d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e.scope - libcontainer container d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e. May 16 03:30:17.545613 containerd[1476]: time="2025-05-16T03:30:17.544966726Z" level=info msg="connecting to shim 8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91" address="unix:///run/containerd/s/13781bad4f618cea6d91aa0405a9ca59ccf527e6976e796171756dbc9e240e5f" namespace=k8s.io protocol=ttrpc version=3 May 16 03:30:17.618515 systemd[1]: Started cri-containerd-8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91.scope - libcontainer container 8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91. 
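The IPAM exchange above loads the node's affinity block 192.168.116.0/26, claims 192.168.116.7 from it under the host-wide lock, and writes the resulting WorkloadEndpoint (interface calie5a3a3f06db) back to the datastore. A minimal Go sketch, using only the standard library and the values shown in the log rather than any Calico code, that checks the claimed address against that block:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // Values taken from the IPAM log lines above.
        _, block, err := net.ParseCIDR("192.168.116.0/26")
        if err != nil {
            panic(err)
        }
        claimed := net.ParseIP("192.168.116.7")

        ones, bits := block.Mask.Size()
        fmt.Printf("block %s holds %d addresses\n", block, 1<<(bits-ones)) // 64
        fmt.Printf("%s inside block: %v\n", claimed, block.Contains(claimed)) // true
    }

A /26 leaves 64 addresses in this block, which is why later workloads on the node (goldmane below) keep drawing from the same 192.168.116.0/26 range.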
May 16 03:30:17.635776 containerd[1476]: time="2025-05-16T03:30:17.635289717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mskp9,Uid:44c65729-467c-46d8-ae6e-1add08755687,Namespace:kube-system,Attempt:0,} returns sandbox id \"d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e\"" May 16 03:30:17.666536 containerd[1476]: time="2025-05-16T03:30:17.666183001Z" level=info msg="CreateContainer within sandbox \"d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 16 03:30:17.690908 containerd[1476]: time="2025-05-16T03:30:17.690856538Z" level=info msg="Container 5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452: CDI devices from CRI Config.CDIDevices: []" May 16 03:30:17.712242 containerd[1476]: time="2025-05-16T03:30:17.712194576Z" level=info msg="CreateContainer within sandbox \"d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452\"" May 16 03:30:17.712947 containerd[1476]: time="2025-05-16T03:30:17.712921981Z" level=info msg="StartContainer for \"5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452\"" May 16 03:30:17.716076 containerd[1476]: time="2025-05-16T03:30:17.716020003Z" level=info msg="connecting to shim 5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452" address="unix:///run/containerd/s/382e3f7d09e8371a607b418183ef622d6173a2fe0404efbc2f8d77a43ef1242b" protocol=ttrpc version=3 May 16 03:30:17.729010 containerd[1476]: time="2025-05-16T03:30:17.725740307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594f46878-gp6pw,Uid:73457dc8-5848-40bf-ace2-479bc6054dd2,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91\"" May 16 03:30:17.751338 systemd[1]: Started cri-containerd-5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452.scope - libcontainer container 5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452. 
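Each containerd entry carries an RFC 3339 timestamp with nanosecond precision, so the time spent bringing up a sandbox can be read directly from adjacent lines. A small sketch, with the two timestamps copied from the coredns entries above (the shim connection at 03:30:17.412 and the RunPodSandbox return at 03:30:17.635), that computes the elapsed time:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied from the containerd entries for the coredns sandbox above:
        // "connecting to shim d5ce531e..." and "RunPodSandbox ... returns sandbox id".
        start, err := time.Parse(time.RFC3339Nano, "2025-05-16T03:30:17.412102992Z")
        if err != nil {
            panic(err)
        }
        done, err := time.Parse(time.RFC3339Nano, "2025-05-16T03:30:17.635289717Z")
        if err != nil {
            panic(err)
        }
        fmt.Println("sandbox setup:", done.Sub(start)) // ~223ms
    }

That works out to roughly 223 ms between the shim connection and the sandbox id being returned.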
May 16 03:30:17.805501 containerd[1476]: time="2025-05-16T03:30:17.805457974Z" level=info msg="StartContainer for \"5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452\" returns successfully" May 16 03:30:17.906204 containerd[1476]: time="2025-05-16T03:30:17.906082759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:17.907605 containerd[1476]: time="2025-05-16T03:30:17.907540214Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639" May 16 03:30:17.909700 containerd[1476]: time="2025-05-16T03:30:17.909604616Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:17.913139 containerd[1476]: time="2025-05-16T03:30:17.913105585Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:17.913867 containerd[1476]: time="2025-05-16T03:30:17.913829693Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 3.015970061s" May 16 03:30:17.913928 containerd[1476]: time="2025-05-16T03:30:17.913868576Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\"" May 16 03:30:17.916008 containerd[1476]: time="2025-05-16T03:30:17.915787676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 16 03:30:17.917557 containerd[1476]: time="2025-05-16T03:30:17.917148618Z" level=info msg="CreateContainer within sandbox \"ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 16 03:30:17.930823 containerd[1476]: time="2025-05-16T03:30:17.930762849Z" level=info msg="Container dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14: CDI devices from CRI Config.CDIDevices: []" May 16 03:30:17.943577 containerd[1476]: time="2025-05-16T03:30:17.943482560Z" level=info msg="CreateContainer within sandbox \"ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14\"" May 16 03:30:17.944751 containerd[1476]: time="2025-05-16T03:30:17.944720242Z" level=info msg="StartContainer for \"dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14\"" May 16 03:30:17.946935 containerd[1476]: time="2025-05-16T03:30:17.946895853Z" level=info msg="connecting to shim dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14" address="unix:///run/containerd/s/e15187788fc978fc9172499e65b584a91abc96c3e79f7428ca324e5a782606bc" protocol=ttrpc version=3 May 16 03:30:17.970168 systemd[1]: Started 
cri-containerd-dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14.scope - libcontainer container dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14. May 16 03:30:18.045074 containerd[1476]: time="2025-05-16T03:30:18.044936751Z" level=info msg="StartContainer for \"dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14\" returns successfully" May 16 03:30:18.069884 containerd[1476]: time="2025-05-16T03:30:18.069589146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-4kgrm,Uid:c0064a9e-4be1-4ce0-a21f-9e78adbef175,Namespace:calico-system,Attempt:0,}" May 16 03:30:18.137076 kubelet[2715]: I0516 03:30:18.136890 2715 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 16 03:30:18.137076 kubelet[2715]: I0516 03:30:18.136940 2715 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 16 03:30:18.255403 systemd-networkd[1395]: cali1325279613c: Link UP May 16 03:30:18.255621 systemd-networkd[1395]: cali1325279613c: Gained carrier May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.141 [INFO][4843] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0 goldmane-78d55f7ddc- calico-system c0064a9e-4be1-4ce0-a21f-9e78adbef175 847 0 2025-05-16 03:29:35 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4284-0-0-n-34cf5e3c62.novalocal goldmane-78d55f7ddc-4kgrm eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali1325279613c [] [] }} ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-4kgrm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.142 [INFO][4843] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-4kgrm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.193 [INFO][4853] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" HandleID="k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.193 [INFO][4853] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" HandleID="k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9890), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-34cf5e3c62.novalocal", 
"pod":"goldmane-78d55f7ddc-4kgrm", "timestamp":"2025-05-16 03:30:18.193796499 +0000 UTC"}, Hostname:"ci-4284-0-0-n-34cf5e3c62.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.194 [INFO][4853] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.194 [INFO][4853] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.194 [INFO][4853] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-34cf5e3c62.novalocal' May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.210 [INFO][4853] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.217 [INFO][4853] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.224 [INFO][4853] ipam/ipam.go 511: Trying affinity for 192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.227 [INFO][4853] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.230 [INFO][4853] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.0/26 host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.230 [INFO][4853] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.0/26 handle="k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.232 [INFO][4853] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1 May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.239 [INFO][4853] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.0/26 handle="k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.247 [INFO][4853] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.8/26] block=192.168.116.0/26 handle="k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.247 [INFO][4853] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.8/26] handle="k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" host="ci-4284-0-0-n-34cf5e3c62.novalocal" May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.247 [INFO][4853] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 16 03:30:18.281074 containerd[1476]: 2025-05-16 03:30:18.247 [INFO][4853] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.8/26] IPv6=[] ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" HandleID="k8s-pod-network.87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Workload="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" May 16 03:30:18.282662 containerd[1476]: 2025-05-16 03:30:18.251 [INFO][4843] cni-plugin/k8s.go 418: Populated endpoint ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-4kgrm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"c0064a9e-4be1-4ce0-a21f-9e78adbef175", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"", Pod:"goldmane-78d55f7ddc-4kgrm", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.116.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali1325279613c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:18.282662 containerd[1476]: 2025-05-16 03:30:18.251 [INFO][4843] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.8/32] ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-4kgrm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" May 16 03:30:18.282662 containerd[1476]: 2025-05-16 03:30:18.251 [INFO][4843] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1325279613c ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-4kgrm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" May 16 03:30:18.282662 containerd[1476]: 2025-05-16 03:30:18.254 [INFO][4843] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-4kgrm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" May 16 03:30:18.282662 containerd[1476]: 2025-05-16 03:30:18.256 [INFO][4843] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-4kgrm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"c0064a9e-4be1-4ce0-a21f-9e78adbef175", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 3, 29, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-34cf5e3c62.novalocal", ContainerID:"87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1", Pod:"goldmane-78d55f7ddc-4kgrm", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.116.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali1325279613c", MAC:"86:59:b1:99:b7:e8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 03:30:18.282662 containerd[1476]: 2025-05-16 03:30:18.277 [INFO][4843] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-4kgrm" WorkloadEndpoint="ci--4284--0--0--n--34cf5e3c62.novalocal-k8s-goldmane--78d55f7ddc--4kgrm-eth0" May 16 03:30:18.331422 containerd[1476]: time="2025-05-16T03:30:18.330818941Z" level=info msg="connecting to shim 87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1" address="unix:///run/containerd/s/445de5076fbe59567191529d42a3cfcdc2c277003ba5cb03e05a40791d5abd35" namespace=k8s.io protocol=ttrpc version=3 May 16 03:30:18.360170 systemd[1]: Started cri-containerd-87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1.scope - libcontainer container 87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1. 
May 16 03:30:18.416205 containerd[1476]: time="2025-05-16T03:30:18.416119509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-4kgrm,Uid:c0064a9e-4be1-4ce0-a21f-9e78adbef175,Namespace:calico-system,Attempt:0,} returns sandbox id \"87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1\"" May 16 03:30:18.467343 kubelet[2715]: I0516 03:30:18.467084 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5gjgv" podStartSLOduration=30.155761316 podStartE2EDuration="42.467064557s" podCreationTimestamp="2025-05-16 03:29:36 +0000 UTC" firstStartedPulling="2025-05-16 03:30:05.603792397 +0000 UTC m=+51.737066144" lastFinishedPulling="2025-05-16 03:30:17.915095628 +0000 UTC m=+64.048369385" observedRunningTime="2025-05-16 03:30:18.466930065 +0000 UTC m=+64.600203812" watchObservedRunningTime="2025-05-16 03:30:18.467064557 +0000 UTC m=+64.600338304" May 16 03:30:18.489680 kubelet[2715]: I0516 03:30:18.489597 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-mskp9" podStartSLOduration=59.489580015 podStartE2EDuration="59.489580015s" podCreationTimestamp="2025-05-16 03:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 03:30:18.488632727 +0000 UTC m=+64.621906484" watchObservedRunningTime="2025-05-16 03:30:18.489580015 +0000 UTC m=+64.622853762" May 16 03:30:18.599199 systemd-networkd[1395]: cali8321628bb50: Gained IPv6LL May 16 03:30:19.047792 systemd-networkd[1395]: calie5a3a3f06db: Gained IPv6LL May 16 03:30:19.879502 systemd-networkd[1395]: cali1325279613c: Gained IPv6LL May 16 03:30:23.064343 containerd[1476]: time="2025-05-16T03:30:23.064176622Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:23.067313 containerd[1476]: time="2025-05-16T03:30:23.067082804Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431" May 16 03:30:23.096911 containerd[1476]: time="2025-05-16T03:30:23.095923273Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:23.127267 containerd[1476]: time="2025-05-16T03:30:23.127155628Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:23.129822 containerd[1476]: time="2025-05-16T03:30:23.129721943Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 5.213888762s" May 16 03:30:23.129822 containerd[1476]: time="2025-05-16T03:30:23.129801522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 16 03:30:23.135187 containerd[1476]: time="2025-05-16T03:30:23.134652241Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 16 03:30:23.143138 containerd[1476]: time="2025-05-16T03:30:23.142371691Z" level=info msg="CreateContainer within sandbox \"679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 16 03:30:23.163432 containerd[1476]: time="2025-05-16T03:30:23.163351254Z" level=info msg="Container 36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6: CDI devices from CRI Config.CDIDevices: []" May 16 03:30:23.187297 containerd[1476]: time="2025-05-16T03:30:23.187175464Z" level=info msg="CreateContainer within sandbox \"679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6\"" May 16 03:30:23.190777 containerd[1476]: time="2025-05-16T03:30:23.189246268Z" level=info msg="StartContainer for \"36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6\"" May 16 03:30:23.192558 containerd[1476]: time="2025-05-16T03:30:23.192494091Z" level=info msg="connecting to shim 36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6" address="unix:///run/containerd/s/03f84751ae005c2e721dcbabc304e25cf2539fc14c2b470199aee1b2d97906f1" protocol=ttrpc version=3 May 16 03:30:23.251196 systemd[1]: Started cri-containerd-36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6.scope - libcontainer container 36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6. May 16 03:30:23.380416 containerd[1476]: time="2025-05-16T03:30:23.380279579Z" level=info msg="StartContainer for \"36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6\" returns successfully" May 16 03:30:23.681003 containerd[1476]: time="2025-05-16T03:30:23.680800115Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 03:30:23.684624 containerd[1476]: time="2025-05-16T03:30:23.684508011Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 16 03:30:23.694625 containerd[1476]: time="2025-05-16T03:30:23.694460120Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 559.656755ms" May 16 03:30:23.694890 containerd[1476]: time="2025-05-16T03:30:23.694750725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 16 03:30:23.698708 containerd[1476]: time="2025-05-16T03:30:23.698658124Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 03:30:23.703363 containerd[1476]: time="2025-05-16T03:30:23.703308877Z" level=info msg="CreateContainer within sandbox \"8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 16 03:30:23.744603 containerd[1476]: time="2025-05-16T03:30:23.744542464Z" level=info msg="Container 449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699: CDI devices from CRI Config.CDIDevices: []" May 16 03:30:23.763634 
systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2225001525.mount: Deactivated successfully. May 16 03:30:23.767438 containerd[1476]: time="2025-05-16T03:30:23.767391815Z" level=info msg="CreateContainer within sandbox \"8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699\"" May 16 03:30:23.770154 containerd[1476]: time="2025-05-16T03:30:23.770110714Z" level=info msg="StartContainer for \"449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699\"" May 16 03:30:23.774756 containerd[1476]: time="2025-05-16T03:30:23.774493266Z" level=info msg="connecting to shim 449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699" address="unix:///run/containerd/s/13781bad4f618cea6d91aa0405a9ca59ccf527e6976e796171756dbc9e240e5f" protocol=ttrpc version=3 May 16 03:30:23.834201 systemd[1]: Started cri-containerd-449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699.scope - libcontainer container 449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699. May 16 03:30:23.950650 containerd[1476]: time="2025-05-16T03:30:23.950390514Z" level=info msg="StartContainer for \"449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699\" returns successfully" May 16 03:30:24.139339 containerd[1476]: time="2025-05-16T03:30:24.139269751Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:30:24.142096 containerd[1476]: time="2025-05-16T03:30:24.142028235Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:30:24.142247 containerd[1476]: time="2025-05-16T03:30:24.142158109Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 03:30:24.142440 kubelet[2715]: E0516 03:30:24.142359 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:30:24.143183 kubelet[2715]: E0516 03:30:24.142438 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:30:24.143238 containerd[1476]: 
time="2025-05-16T03:30:24.142889080Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 03:30:24.144939 kubelet[2715]: E0516 03:30:24.144865 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b39ead48ab84d5997218c5ff179c936,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:30:24.504166 kubelet[2715]: I0516 03:30:24.504006 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 16 03:30:24.508101 containerd[1476]: time="2025-05-16T03:30:24.508035826Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:30:24.510145 containerd[1476]: time="2025-05-16T03:30:24.510101460Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:30:24.511069 containerd[1476]: time="2025-05-16T03:30:24.511017840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 03:30:24.511462 kubelet[2715]: E0516 03:30:24.511388 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:30:24.512215 kubelet[2715]: E0516 03:30:24.511440 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:30:24.512215 kubelet[2715]: E0516 03:30:24.511933 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:30:24.512771 containerd[1476]: time="2025-05-16T03:30:24.512745640Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 03:30:24.514398 kubelet[2715]: E0516 03:30:24.514348 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:30:24.529153 kubelet[2715]: I0516 03:30:24.528534 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-594f46878-gp6pw" podStartSLOduration=46.562573433 podStartE2EDuration="52.528512216s" podCreationTimestamp="2025-05-16 03:29:32 +0000 UTC" firstStartedPulling="2025-05-16 03:30:17.73149325 +0000 UTC m=+63.864766997" lastFinishedPulling="2025-05-16 03:30:23.697431973 +0000 UTC m=+69.830705780" observedRunningTime="2025-05-16 03:30:24.528102065 +0000 UTC m=+70.661375812" watchObservedRunningTime="2025-05-16 03:30:24.528512216 +0000 UTC m=+70.661785963" May 16 03:30:24.530198 kubelet[2715]: I0516 03:30:24.529554 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-594f46878-mm7r4" podStartSLOduration=42.985447372 podStartE2EDuration="52.529540614s" podCreationTimestamp="2025-05-16 03:29:32 +0000 UTC" firstStartedPulling="2025-05-16 03:30:13.589164042 +0000 UTC m=+59.722437799" lastFinishedPulling="2025-05-16 03:30:23.133257244 +0000 UTC m=+69.266531041" observedRunningTime="2025-05-16 03:30:23.504884723 +0000 UTC m=+69.638158470" watchObservedRunningTime="2025-05-16 03:30:24.529540614 +0000 UTC m=+70.662814371" May 16 03:30:24.867166 containerd[1476]: time="2025-05-16T03:30:24.867073592Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status 
from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:30:24.869347 containerd[1476]: time="2025-05-16T03:30:24.869264071Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:30:24.869590 containerd[1476]: time="2025-05-16T03:30:24.869499062Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 03:30:24.870840 kubelet[2715]: E0516 03:30:24.869767 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:30:24.870840 kubelet[2715]: E0516 03:30:24.869822 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:30:24.870840 kubelet[2715]: E0516 03:30:24.869940 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:30:24.871479 kubelet[2715]: E0516 03:30:24.871443 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:30:25.508622 kubelet[2715]: I0516 03:30:25.508522 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 16 
03:30:25.515054 kubelet[2715]: E0516 03:30:25.515004 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:30:33.509570 containerd[1476]: time="2025-05-16T03:30:33.509525680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"abc3a590ffd617eb4e7c143bb65da1954b651cc7eef490c4829a8a010af8aef9\" pid:5032 exited_at:{seconds:1747366233 nanos:509016067}" May 16 03:30:33.642321 containerd[1476]: time="2025-05-16T03:30:33.642042204Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"40ae0dbf7775d98982381cce7332b44e54dd03d03437fa123bc13804abb88b30\" pid:5055 exited_at:{seconds:1747366233 nanos:641312424}" May 16 03:30:36.053462 kubelet[2715]: E0516 03:30:36.052885 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:30:37.025782 containerd[1476]: time="2025-05-16T03:30:37.025481963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 03:30:37.394671 containerd[1476]: time="2025-05-16T03:30:37.394498860Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:30:37.398446 containerd[1476]: time="2025-05-16T03:30:37.398160340Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:30:37.398446 containerd[1476]: time="2025-05-16T03:30:37.398324350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 03:30:37.398995 kubelet[2715]: E0516 03:30:37.398764 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:30:37.398995 kubelet[2715]: E0516 03:30:37.398850 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:30:37.400326 kubelet[2715]: E0516 03:30:37.399670 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:30:37.401596 kubelet[2715]: E0516 03:30:37.401541 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:30:40.008998 kubelet[2715]: I0516 03:30:40.008251 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 16 03:30:40.715180 kubelet[2715]: I0516 03:30:40.713269 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 16 03:30:45.521874 containerd[1476]: time="2025-05-16T03:30:45.521132799Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"578ff112b40df9b3cef499d5952ac9d7a5304e4bd9d533eac098639026ff9ab1\" pid:5087 exited_at:{seconds:1747366245 nanos:519939377}" May 16 03:30:47.024055 containerd[1476]: time="2025-05-16T03:30:47.022283607Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 03:30:47.259950 containerd[1476]: time="2025-05-16T03:30:47.259887733Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"57f1f1bccb91dcb5dd72e4f5e7a68003e62c67210a09c7d079593f30f90a9472\" pid:5114 exited_at:{seconds:1747366247 nanos:256808534}" May 16 03:30:47.416708 containerd[1476]: time="2025-05-16T03:30:47.416617602Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:30:47.418550 containerd[1476]: time="2025-05-16T03:30:47.417952119Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" 
error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:30:47.420025 containerd[1476]: time="2025-05-16T03:30:47.418200317Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 03:30:47.420109 kubelet[2715]: E0516 03:30:47.418896 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:30:47.420109 kubelet[2715]: E0516 03:30:47.419017 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:30:47.422598 kubelet[2715]: E0516 03:30:47.422513 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b39ead48ab84d5997218c5ff179c936,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET 
request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:30:47.425232 containerd[1476]: time="2025-05-16T03:30:47.425169274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 03:30:47.768112 containerd[1476]: time="2025-05-16T03:30:47.765686434Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:30:47.768112 containerd[1476]: time="2025-05-16T03:30:47.767455181Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:30:47.768112 containerd[1476]: time="2025-05-16T03:30:47.767493774Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 03:30:47.768435 kubelet[2715]: E0516 03:30:47.767849 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:30:47.768435 kubelet[2715]: E0516 03:30:47.767915 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:30:47.768435 kubelet[2715]: E0516 03:30:47.768083 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:30:47.769850 kubelet[2715]: E0516 03:30:47.769781 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:30:50.026554 kubelet[2715]: E0516 03:30:50.025551 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:31:00.034743 kubelet[2715]: E0516 03:31:00.034226 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:31:03.830399 containerd[1476]: time="2025-05-16T03:31:03.830223190Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"ae9d5ba5082acabdc71fa7eaa4b26caee87b34bd8bac9891e6a665f962cadcdc\" pid:5139 exited_at:{seconds:1747366263 nanos:829050312}" May 16 03:31:04.025002 containerd[1476]: time="2025-05-16T03:31:04.024230608Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 03:31:04.392460 containerd[1476]: time="2025-05-16T03:31:04.392404579Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:31:04.395366 containerd[1476]: time="2025-05-16T03:31:04.394952877Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:31:04.395366 containerd[1476]: time="2025-05-16T03:31:04.394963166Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 03:31:04.396309 kubelet[2715]: E0516 03:31:04.395666 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:31:04.396309 kubelet[2715]: E0516 03:31:04.395876 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:31:04.396309 kubelet[2715]: E0516 03:31:04.396202 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:31:04.397633 kubelet[2715]: E0516 03:31:04.397592 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:31:12.027114 kubelet[2715]: E0516 03:31:12.026788 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:31:15.504754 containerd[1476]: time="2025-05-16T03:31:15.504699396Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"7f46070c8f8f712b358d848657054df316729774cad47ae87ef11c4d6d5ad168\" pid:5168 exited_at:{seconds:1747366275 
nanos:504117171}" May 16 03:31:19.033150 kubelet[2715]: E0516 03:31:19.032922 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:31:25.031830 kubelet[2715]: E0516 03:31:25.030925 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:31:33.722096 containerd[1476]: time="2025-05-16T03:31:33.721696708Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"f97a3c36ec406f7e33bf9e229bbbb4617282bc99e6d222f48443b202d3ff47c7\" pid:5198 exited_at:{seconds:1747366293 nanos:714714695}" May 16 03:31:34.039150 kubelet[2715]: E0516 03:31:34.038328 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:31:40.035325 containerd[1476]: time="2025-05-16T03:31:40.033601396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 03:31:40.437357 containerd[1476]: time="2025-05-16T03:31:40.436793196Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 
Forbidden" host=ghcr.io May 16 03:31:40.439556 containerd[1476]: time="2025-05-16T03:31:40.439183929Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:31:40.439964 containerd[1476]: time="2025-05-16T03:31:40.439272114Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 03:31:40.441163 kubelet[2715]: E0516 03:31:40.440550 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:31:40.441163 kubelet[2715]: E0516 03:31:40.440925 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:31:40.444036 kubelet[2715]: E0516 03:31:40.443526 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b39ead48ab84d5997218c5ff179c936,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:31:40.449969 containerd[1476]: time="2025-05-16T03:31:40.449696654Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 03:31:40.843486 containerd[1476]: time="2025-05-16T03:31:40.843380296Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:31:40.846922 containerd[1476]: time="2025-05-16T03:31:40.845400814Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:31:40.846922 containerd[1476]: time="2025-05-16T03:31:40.845521500Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 03:31:40.847402 kubelet[2715]: E0516 03:31:40.845931 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:31:40.847402 kubelet[2715]: E0516 03:31:40.846084 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:31:40.847402 kubelet[2715]: E0516 03:31:40.846327 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:31:40.848419 kubelet[2715]: E0516 03:31:40.848231 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:31:44.700193 update_engine[1461]: I20250516 03:31:44.699563 1461 prefs.cc:52] certificate-report-to-send-update not present in 
/var/lib/update_engine/prefs May 16 03:31:44.700193 update_engine[1461]: I20250516 03:31:44.699895 1461 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 16 03:31:44.704060 update_engine[1461]: I20250516 03:31:44.703933 1461 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 16 03:31:44.708383 update_engine[1461]: I20250516 03:31:44.707584 1461 omaha_request_params.cc:62] Current group set to alpha May 16 03:31:44.709102 update_engine[1461]: I20250516 03:31:44.708642 1461 update_attempter.cc:499] Already updated boot flags. Skipping. May 16 03:31:44.709102 update_engine[1461]: I20250516 03:31:44.708725 1461 update_attempter.cc:643] Scheduling an action processor start. May 16 03:31:44.709102 update_engine[1461]: I20250516 03:31:44.708830 1461 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 16 03:31:44.709537 update_engine[1461]: I20250516 03:31:44.709160 1461 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 16 03:31:44.709537 update_engine[1461]: I20250516 03:31:44.709381 1461 omaha_request_action.cc:271] Posting an Omaha request to disabled May 16 03:31:44.709537 update_engine[1461]: I20250516 03:31:44.709420 1461 omaha_request_action.cc:272] Request: May 16 03:31:44.709537 update_engine[1461]: May 16 03:31:44.709537 update_engine[1461]: May 16 03:31:44.709537 update_engine[1461]: May 16 03:31:44.709537 update_engine[1461]: May 16 03:31:44.709537 update_engine[1461]: May 16 03:31:44.709537 update_engine[1461]: May 16 03:31:44.709537 update_engine[1461]: May 16 03:31:44.709537 update_engine[1461]: May 16 03:31:44.709537 update_engine[1461]: I20250516 03:31:44.709447 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 16 03:31:44.719027 locksmithd[1489]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 16 03:31:44.722623 update_engine[1461]: I20250516 03:31:44.722543 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 16 03:31:44.723912 update_engine[1461]: I20250516 03:31:44.723768 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 16 03:31:44.729395 update_engine[1461]: E20250516 03:31:44.729300 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 16 03:31:44.729605 update_engine[1461]: I20250516 03:31:44.729539 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 16 03:31:45.025666 containerd[1476]: time="2025-05-16T03:31:45.025261640Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 03:31:45.466102 containerd[1476]: time="2025-05-16T03:31:45.466022392Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:31:45.468107 containerd[1476]: time="2025-05-16T03:31:45.467970723Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:31:45.469644 containerd[1476]: time="2025-05-16T03:31:45.468249606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 03:31:45.471357 kubelet[2715]: E0516 03:31:45.468472 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:31:45.471357 kubelet[2715]: E0516 03:31:45.468558 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:31:45.471357 kubelet[2715]: E0516 03:31:45.469134 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:31:45.471357 kubelet[2715]: E0516 03:31:45.470857 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:31:45.517736 containerd[1476]: time="2025-05-16T03:31:45.517683361Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"348ec299773fbf2f7424617b2cd7f8a6a00c4b1a8f8a62b2f048370a240a393d\" pid:5245 exited_at:{seconds:1747366305 nanos:516877136}" May 16 03:31:47.279687 containerd[1476]: time="2025-05-16T03:31:47.279547937Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"ce1e5c2ea826a230ce60227a77e7e80e5d20f839c49bcbfa797ab7a4a022856d\" pid:5267 exited_at:{seconds:1747366307 nanos:278522130}" May 16 03:31:53.027243 kubelet[2715]: E0516 03:31:53.026613 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:31:54.626975 update_engine[1461]: I20250516 03:31:54.626821 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 16 03:31:54.627973 update_engine[1461]: I20250516 03:31:54.627352 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 16 03:31:54.627973 update_engine[1461]: I20250516 03:31:54.627907 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 16 03:31:54.633542 update_engine[1461]: E20250516 03:31:54.633445 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 16 03:31:54.633743 update_engine[1461]: I20250516 03:31:54.633615 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 16 03:31:58.025059 kubelet[2715]: E0516 03:31:58.023929 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:32:03.683334 containerd[1476]: time="2025-05-16T03:32:03.683270028Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"e22c2a2837a60dc0bdee104517306f27d06d4d6abc1241e5274388f9acf06fd5\" pid:5291 exited_at:{seconds:1747366323 nanos:681810356}" May 16 03:32:04.627967 update_engine[1461]: I20250516 03:32:04.627804 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 16 03:32:04.629063 update_engine[1461]: I20250516 03:32:04.628410 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 16 03:32:04.629063 update_engine[1461]: I20250516 03:32:04.629017 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 16 03:32:04.634752 update_engine[1461]: E20250516 03:32:04.634640 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 16 03:32:04.634929 update_engine[1461]: I20250516 03:32:04.634789 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 16 03:32:07.028454 kubelet[2715]: E0516 03:32:07.028074 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:32:10.027700 kubelet[2715]: E0516 03:32:10.027537 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:32:14.627816 update_engine[1461]: I20250516 03:32:14.627640 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 16 03:32:14.628771 update_engine[1461]: I20250516 03:32:14.628221 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 16 03:32:14.628866 update_engine[1461]: I20250516 03:32:14.628788 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 16 03:32:14.634142 update_engine[1461]: E20250516 03:32:14.634053 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 16 03:32:14.634347 update_engine[1461]: I20250516 03:32:14.634198 1461 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 16 03:32:14.634347 update_engine[1461]: I20250516 03:32:14.634225 1461 omaha_request_action.cc:617] Omaha request response: May 16 03:32:14.634589 update_engine[1461]: E20250516 03:32:14.634474 1461 omaha_request_action.cc:636] Omaha request network transfer failed. May 16 03:32:14.634748 update_engine[1461]: I20250516 03:32:14.634582 1461 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. 
May 16 03:32:14.634748 update_engine[1461]: I20250516 03:32:14.634601 1461 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 16 03:32:14.634748 update_engine[1461]: I20250516 03:32:14.634649 1461 update_attempter.cc:306] Processing Done. May 16 03:32:14.634748 update_engine[1461]: E20250516 03:32:14.634706 1461 update_attempter.cc:619] Update failed. May 16 03:32:14.634748 update_engine[1461]: I20250516 03:32:14.634738 1461 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 16 03:32:14.635278 update_engine[1461]: I20250516 03:32:14.634754 1461 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 16 03:32:14.635278 update_engine[1461]: I20250516 03:32:14.634768 1461 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. May 16 03:32:14.635278 update_engine[1461]: I20250516 03:32:14.634919 1461 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 16 03:32:14.635278 update_engine[1461]: I20250516 03:32:14.634969 1461 omaha_request_action.cc:271] Posting an Omaha request to disabled May 16 03:32:14.635278 update_engine[1461]: I20250516 03:32:14.635042 1461 omaha_request_action.cc:272] Request: May 16 03:32:14.635278 update_engine[1461]: May 16 03:32:14.635278 update_engine[1461]: May 16 03:32:14.635278 update_engine[1461]: May 16 03:32:14.635278 update_engine[1461]: May 16 03:32:14.635278 update_engine[1461]: May 16 03:32:14.635278 update_engine[1461]: May 16 03:32:14.635278 update_engine[1461]: I20250516 03:32:14.635059 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 16 03:32:14.636938 update_engine[1461]: I20250516 03:32:14.635664 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 16 03:32:14.636938 update_engine[1461]: I20250516 03:32:14.636200 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 16 03:32:14.637314 locksmithd[1489]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 16 03:32:14.641535 update_engine[1461]: E20250516 03:32:14.641388 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 16 03:32:14.641700 update_engine[1461]: I20250516 03:32:14.641575 1461 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 16 03:32:14.641700 update_engine[1461]: I20250516 03:32:14.641605 1461 omaha_request_action.cc:617] Omaha request response: May 16 03:32:14.641700 update_engine[1461]: I20250516 03:32:14.641620 1461 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 16 03:32:14.641700 update_engine[1461]: I20250516 03:32:14.641633 1461 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 16 03:32:14.641700 update_engine[1461]: I20250516 03:32:14.641644 1461 update_attempter.cc:306] Processing Done. May 16 03:32:14.641700 update_engine[1461]: I20250516 03:32:14.641659 1461 update_attempter.cc:310] Error event sent. 
May 16 03:32:14.642558 update_engine[1461]: I20250516 03:32:14.641696 1461 update_check_scheduler.cc:74] Next update check in 45m41s May 16 03:32:14.643064 locksmithd[1489]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 16 03:32:15.537443 containerd[1476]: time="2025-05-16T03:32:15.537392388Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"b1c1a1fcc93bc08946bf7a1f0d78ae058cfc47b86a5dd163459dca5740508df3\" pid:5318 exited_at:{seconds:1747366335 nanos:536949466}" May 16 03:32:21.026832 kubelet[2715]: E0516 03:32:21.026432 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:32:22.022951 kubelet[2715]: E0516 03:32:22.022649 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:32:33.684582 containerd[1476]: time="2025-05-16T03:32:33.683930112Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"36989933d15f063c385dd4b0b0e3d17067664176b70c29844340132828efcbae\" pid:5342 exited_at:{seconds:1747366353 nanos:681850416}" May 16 03:32:34.032196 kubelet[2715]: E0516 03:32:34.028911 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:32:36.029323 kubelet[2715]: E0516 03:32:36.029172 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:32:45.557397 containerd[1476]: time="2025-05-16T03:32:45.557107905Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"d87ca5bca7d0c69c834b5cb4052afa922fcc41e72cb86720eb63efc337fb75a4\" pid:5367 exited_at:{seconds:1747366365 nanos:555490238}" May 16 03:32:47.024467 kubelet[2715]: E0516 03:32:47.022464 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:32:47.256281 containerd[1476]: time="2025-05-16T03:32:47.256203818Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"c9269c38e05d1a068912f0470e81d876d3d31c6b9618e28fa9f5fd35dc5870ae\" pid:5395 exited_at:{seconds:1747366367 nanos:255807213}" May 16 03:32:49.023734 kubelet[2715]: E0516 03:32:49.022948 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:32:58.027064 kubelet[2715]: E0516 03:32:58.025739 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:33:03.024085 containerd[1476]: time="2025-05-16T03:33:03.023591093Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 03:33:03.408417 containerd[1476]: time="2025-05-16T03:33:03.408072659Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:33:03.410359 containerd[1476]: time="2025-05-16T03:33:03.410065190Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:33:03.410359 containerd[1476]: time="2025-05-16T03:33:03.410081341Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 03:33:03.411087 kubelet[2715]: E0516 03:33:03.410834 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:33:03.412500 kubelet[2715]: E0516 03:33:03.411823 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:33:03.412500 kubelet[2715]: E0516 03:33:03.412303 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b39ead48ab84d5997218c5ff179c936,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:33:03.416206 containerd[1476]: time="2025-05-16T03:33:03.415758421Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 03:33:03.670418 containerd[1476]: time="2025-05-16T03:33:03.670239981Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"6cca8909836bede8783e50c8f388c3c0909b81c210420833b20e578c0d50bba0\" pid:5422 exited_at:{seconds:1747366383 nanos:669592054}" May 16 03:33:03.841314 containerd[1476]: time="2025-05-16T03:33:03.840943164Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:33:03.843176 containerd[1476]: time="2025-05-16T03:33:03.843014884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 03:33:03.843176 containerd[1476]: time="2025-05-16T03:33:03.843022017Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to 
resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:33:03.843734 kubelet[2715]: E0516 03:33:03.843623 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:33:03.843944 kubelet[2715]: E0516 03:33:03.843736 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:33:03.844195 kubelet[2715]: E0516 03:33:03.844032 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:33:03.846114 kubelet[2715]: E0516 03:33:03.845928 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:33:11.023752 containerd[1476]: time="2025-05-16T03:33:11.023609927Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 03:33:11.420850 containerd[1476]: time="2025-05-16T03:33:11.420415353Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:33:11.422473 containerd[1476]: time="2025-05-16T03:33:11.422276928Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:33:11.422473 containerd[1476]: time="2025-05-16T03:33:11.422377657Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 03:33:11.423204 kubelet[2715]: E0516 03:33:11.423087 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:33:11.423905 kubelet[2715]: E0516 03:33:11.423215 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:33:11.423905 kubelet[2715]: E0516 03:33:11.423525 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:33:11.424959 kubelet[2715]: E0516 03:33:11.424890 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with 
ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:33:14.048789 kubelet[2715]: E0516 03:33:14.048481 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:33:15.548230 containerd[1476]: time="2025-05-16T03:33:15.547316014Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"c30edd7ed4d5d7938da2589e97831fa11c6dec1a3a9bd999a4e5fa4b332fc84b\" pid:5472 exited_at:{seconds:1747366395 nanos:543974100}" May 16 03:33:26.028724 kubelet[2715]: E0516 03:33:26.026823 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:33:26.028724 kubelet[2715]: E0516 03:33:26.029173 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:33:33.763090 containerd[1476]: time="2025-05-16T03:33:33.762917475Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"f47fe75fd28d50130b674b2370ce23b7d38e67b7100f14e3065f6b2743313981\" pid:5495 exited_at:{seconds:1747366413 nanos:761476850}" May 16 03:33:37.023739 kubelet[2715]: E0516 03:33:37.023552 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:33:39.021580 kubelet[2715]: E0516 03:33:39.021459 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:33:45.532461 containerd[1476]: time="2025-05-16T03:33:45.532369071Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"7d96f4bf3316b77d463caf6cf612f519580dafc088b807d0252a6b9ae189e472\" pid:5517 exited_at:{seconds:1747366425 nanos:531878570}" May 16 03:33:47.294698 containerd[1476]: time="2025-05-16T03:33:47.294383143Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"91442c85a86e7cf428f49d1c3a01581319c0f5a34451b1b104ad437dc77bceaf\" pid:5538 
exited_at:{seconds:1747366427 nanos:294071678}" May 16 03:33:51.024588 kubelet[2715]: E0516 03:33:51.024450 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:33:52.021968 kubelet[2715]: E0516 03:33:52.021227 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:34:03.024368 kubelet[2715]: E0516 03:34:03.024082 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:34:03.697742 containerd[1476]: time="2025-05-16T03:34:03.697676719Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" 
id:\"d1695ff1de73b30912f062f5e4f92b533c4a3007228f2c83f8c2da6be624005a\" pid:5561 exited_at:{seconds:1747366443 nanos:696857290}" May 16 03:34:04.029029 kubelet[2715]: E0516 03:34:04.028815 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:34:06.784754 systemd[1]: Started sshd@9-172.24.4.18:22-172.24.4.1:57190.service - OpenSSH per-connection server daemon (172.24.4.1:57190). May 16 03:34:07.897958 containerd[1476]: time="2025-05-16T03:34:07.897557226Z" level=warning msg="container event discarded" container=a47a87742e348af182193a202bbc43ab5bb7c21e022981a556586551fe0a357b type=CONTAINER_CREATED_EVENT May 16 03:34:07.897958 containerd[1476]: time="2025-05-16T03:34:07.897878700Z" level=warning msg="container event discarded" container=a47a87742e348af182193a202bbc43ab5bb7c21e022981a556586551fe0a357b type=CONTAINER_STARTED_EVENT May 16 03:34:07.933488 containerd[1476]: time="2025-05-16T03:34:07.933305526Z" level=warning msg="container event discarded" container=63f733cbadf97bdb24f727cc66f98d8df03b43fc083df73575dabb724babc0b6 type=CONTAINER_CREATED_EVENT May 16 03:34:07.933488 containerd[1476]: time="2025-05-16T03:34:07.933453694Z" level=warning msg="container event discarded" container=63f733cbadf97bdb24f727cc66f98d8df03b43fc083df73575dabb724babc0b6 type=CONTAINER_STARTED_EVENT May 16 03:34:07.933488 containerd[1476]: time="2025-05-16T03:34:07.933479643Z" level=warning msg="container event discarded" container=98c4672d3bb7acca896d4d379a5d6b6b1f0f6b51f81e2888fa9b6bc1ebfaa969 type=CONTAINER_CREATED_EVENT May 16 03:34:07.933955 containerd[1476]: time="2025-05-16T03:34:07.933505381Z" level=warning msg="container event discarded" container=98c4672d3bb7acca896d4d379a5d6b6b1f0f6b51f81e2888fa9b6bc1ebfaa969 type=CONTAINER_STARTED_EVENT May 16 03:34:07.961899 containerd[1476]: time="2025-05-16T03:34:07.961325997Z" level=warning msg="container event discarded" container=99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f type=CONTAINER_CREATED_EVENT May 16 03:34:07.994091 containerd[1476]: time="2025-05-16T03:34:07.993889496Z" level=warning msg="container event discarded" container=aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f type=CONTAINER_CREATED_EVENT May 16 03:34:08.010596 containerd[1476]: time="2025-05-16T03:34:08.010467763Z" level=warning msg="container event discarded" container=6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe type=CONTAINER_CREATED_EVENT May 16 03:34:08.101050 containerd[1476]: time="2025-05-16T03:34:08.100782794Z" level=warning msg="container event discarded" container=99cd49291a632c73a3bb28eb263acaae782c0a71272fc4ef6c10472240dd5b7f type=CONTAINER_STARTED_EVENT May 16 03:34:08.147275 containerd[1476]: time="2025-05-16T03:34:08.147010259Z" level=warning msg="container event discarded" container=aff0d092e2b167c222f66195ca6e32e5a809b76d6f43472bbc4761df8e2d4e4f type=CONTAINER_STARTED_EVENT May 16 03:34:08.183900 
containerd[1476]: time="2025-05-16T03:34:08.183661173Z" level=warning msg="container event discarded" container=6879405abcf075161fec3b31d386f72b0af81309229a6e5015336d82b5e1cbfe type=CONTAINER_STARTED_EVENT May 16 03:34:08.220895 sshd[5578]: Accepted publickey for core from 172.24.4.1 port 57190 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:08.224831 sshd-session[5578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:08.233882 systemd-logind[1458]: New session 12 of user core. May 16 03:34:08.241214 systemd[1]: Started session-12.scope - Session 12 of User core. May 16 03:34:09.045104 sshd[5580]: Connection closed by 172.24.4.1 port 57190 May 16 03:34:09.045475 sshd-session[5578]: pam_unix(sshd:session): session closed for user core May 16 03:34:09.053401 systemd-logind[1458]: Session 12 logged out. Waiting for processes to exit. May 16 03:34:09.053856 systemd[1]: sshd@9-172.24.4.18:22-172.24.4.1:57190.service: Deactivated successfully. May 16 03:34:09.058164 systemd[1]: session-12.scope: Deactivated successfully. May 16 03:34:09.063065 systemd-logind[1458]: Removed session 12. May 16 03:34:14.070313 systemd[1]: Started sshd@10-172.24.4.18:22-172.24.4.1:34630.service - OpenSSH per-connection server daemon (172.24.4.1:34630). May 16 03:34:15.024097 kubelet[2715]: E0516 03:34:15.024029 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:34:15.281554 sshd[5599]: Accepted publickey for core from 172.24.4.1 port 34630 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:15.282861 sshd-session[5599]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:15.292627 systemd-logind[1458]: New session 13 of user core. May 16 03:34:15.298238 systemd[1]: Started session-13.scope - Session 13 of User core. 
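Every pull failure above reduces to the same step: before any layer is fetched, containerd performs an anonymous GET against ghcr.io's token endpoint with a repository-scoped pull request, and that GET is answered with 403 Forbidden, so the pull never proceeds. A small sketch that reproduces just the token request for one of the repositories named in the log; the helper name and error handling are assumptions:

```python
# Hypothetical helper that issues the same anonymous token request containerd
# reports above; the function name and error handling are assumptions.
import json
import urllib.error
import urllib.parse
import urllib.request

def fetch_anonymous_pull_token(repository: str) -> str:
    """Request a registry bearer token for pulling `repository` from ghcr.io."""
    query = urllib.parse.urlencode({
        "scope": f"repository:{repository}:pull",
        "service": "ghcr.io",
    })
    url = f"https://ghcr.io/token?{query}"
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            return json.load(response).get("token", "")
    except urllib.error.HTTPError as exc:
        # A 403 here matches the "failed to fetch anonymous token" errors above.
        raise RuntimeError(f"token request for {repository} failed: {exc.code}") from exc

# Example, using one of the images from the log:
# fetch_anonymous_pull_token("flatcar/calico/whisker")
```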
May 16 03:34:15.470070 containerd[1476]: time="2025-05-16T03:34:15.469895110Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"95629f20fbc8ff6bef858c3bd8b15f19bc42c8dca33f799bf01762e910b9010b\" pid:5615 exited_at:{seconds:1747366455 nanos:469337353}" May 16 03:34:15.921994 sshd[5601]: Connection closed by 172.24.4.1 port 34630 May 16 03:34:15.922771 sshd-session[5599]: pam_unix(sshd:session): session closed for user core May 16 03:34:15.929058 systemd[1]: sshd@10-172.24.4.18:22-172.24.4.1:34630.service: Deactivated successfully. May 16 03:34:15.929203 systemd-logind[1458]: Session 13 logged out. Waiting for processes to exit. May 16 03:34:15.932802 systemd[1]: session-13.scope: Deactivated successfully. May 16 03:34:15.935141 systemd-logind[1458]: Removed session 13. May 16 03:34:16.022058 kubelet[2715]: E0516 03:34:16.021452 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:34:20.616372 containerd[1476]: time="2025-05-16T03:34:20.616113476Z" level=warning msg="container event discarded" container=a7f8859a55b063a3acb7093d16a6a59d6b5c03b9605f67d514c845737d73b55e type=CONTAINER_CREATED_EVENT May 16 03:34:20.616372 containerd[1476]: time="2025-05-16T03:34:20.616280780Z" level=warning msg="container event discarded" container=a7f8859a55b063a3acb7093d16a6a59d6b5c03b9605f67d514c845737d73b55e type=CONTAINER_STARTED_EVENT May 16 03:34:20.651907 containerd[1476]: time="2025-05-16T03:34:20.651731431Z" level=warning msg="container event discarded" container=f349462f6a0b9ef104fc1c8970945e4b47ee633b02eab79ef6023dd5333eed65 type=CONTAINER_CREATED_EVENT May 16 03:34:20.651907 containerd[1476]: time="2025-05-16T03:34:20.651831849Z" level=warning msg="container event discarded" container=f349462f6a0b9ef104fc1c8970945e4b47ee633b02eab79ef6023dd5333eed65 type=CONTAINER_STARTED_EVENT May 16 03:34:20.670249 containerd[1476]: time="2025-05-16T03:34:20.670105128Z" level=warning msg="container event discarded" container=5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113 type=CONTAINER_CREATED_EVENT May 16 03:34:20.749857 containerd[1476]: time="2025-05-16T03:34:20.749699351Z" level=warning msg="container event discarded" container=5bb6edd27c8ac51a6714cf7714cecba4ff89be9f092cb60030cbc0305aa7e113 type=CONTAINER_STARTED_EVENT May 16 03:34:20.955515 systemd[1]: Started sshd@11-172.24.4.18:22-172.24.4.1:34636.service - OpenSSH per-connection server daemon (172.24.4.1:34636). May 16 03:34:22.147088 sshd[5637]: Accepted publickey for core from 172.24.4.1 port 34636 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:22.151204 sshd-session[5637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:22.166492 systemd-logind[1458]: New session 14 of user core. 
May 16 03:34:22.174353 systemd[1]: Started session-14.scope - Session 14 of User core. May 16 03:34:22.917671 sshd[5639]: Connection closed by 172.24.4.1 port 34636 May 16 03:34:22.918489 sshd-session[5637]: pam_unix(sshd:session): session closed for user core May 16 03:34:22.931374 systemd[1]: sshd@11-172.24.4.18:22-172.24.4.1:34636.service: Deactivated successfully. May 16 03:34:22.943530 systemd[1]: session-14.scope: Deactivated successfully. May 16 03:34:22.944746 systemd-logind[1458]: Session 14 logged out. Waiting for processes to exit. May 16 03:34:22.948764 systemd[1]: Started sshd@12-172.24.4.18:22-172.24.4.1:34642.service - OpenSSH per-connection server daemon (172.24.4.1:34642). May 16 03:34:22.950898 systemd-logind[1458]: Removed session 14. May 16 03:34:23.866202 containerd[1476]: time="2025-05-16T03:34:23.866089319Z" level=warning msg="container event discarded" container=ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b type=CONTAINER_CREATED_EVENT May 16 03:34:23.942546 containerd[1476]: time="2025-05-16T03:34:23.942411538Z" level=warning msg="container event discarded" container=ab9310c0525b83ab54deab403850dace40e4ddd7abbd5faccfede54695980a7b type=CONTAINER_STARTED_EVENT May 16 03:34:24.290812 sshd[5651]: Accepted publickey for core from 172.24.4.1 port 34642 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:24.293855 sshd-session[5651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:24.306353 systemd-logind[1458]: New session 15 of user core. May 16 03:34:24.316346 systemd[1]: Started session-15.scope - Session 15 of User core. May 16 03:34:25.023015 sshd[5654]: Connection closed by 172.24.4.1 port 34642 May 16 03:34:25.025267 sshd-session[5651]: pam_unix(sshd:session): session closed for user core May 16 03:34:25.042334 systemd[1]: sshd@12-172.24.4.18:22-172.24.4.1:34642.service: Deactivated successfully. May 16 03:34:25.048532 systemd[1]: session-15.scope: Deactivated successfully. May 16 03:34:25.050551 systemd-logind[1458]: Session 15 logged out. Waiting for processes to exit. May 16 03:34:25.058566 systemd[1]: Started sshd@13-172.24.4.18:22-172.24.4.1:38256.service - OpenSSH per-connection server daemon (172.24.4.1:38256). May 16 03:34:25.059708 systemd-logind[1458]: Removed session 15. May 16 03:34:26.225237 sshd[5664]: Accepted publickey for core from 172.24.4.1 port 38256 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:26.228851 sshd-session[5664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:26.245194 systemd-logind[1458]: New session 16 of user core. May 16 03:34:26.251383 systemd[1]: Started session-16.scope - Session 16 of User core. May 16 03:34:26.978318 sshd[5667]: Connection closed by 172.24.4.1 port 38256 May 16 03:34:26.979308 sshd-session[5664]: pam_unix(sshd:session): session closed for user core May 16 03:34:26.984280 systemd-logind[1458]: Session 16 logged out. Waiting for processes to exit. May 16 03:34:26.984775 systemd[1]: sshd@13-172.24.4.18:22-172.24.4.1:38256.service: Deactivated successfully. May 16 03:34:26.989471 systemd[1]: session-16.scope: Deactivated successfully. May 16 03:34:26.993411 systemd-logind[1458]: Removed session 16. 
May 16 03:34:27.022297 kubelet[2715]: E0516 03:34:27.021528 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:34:28.031448 kubelet[2715]: E0516 03:34:28.031250 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:34:31.998620 systemd[1]: Started sshd@14-172.24.4.18:22-172.24.4.1:38262.service - OpenSSH per-connection server daemon (172.24.4.1:38262). May 16 03:34:33.374460 sshd[5679]: Accepted publickey for core from 172.24.4.1 port 38262 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:33.376352 sshd-session[5679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:33.382118 systemd-logind[1458]: New session 17 of user core. May 16 03:34:33.389180 systemd[1]: Started session-17.scope - Session 17 of User core. May 16 03:34:33.673787 containerd[1476]: time="2025-05-16T03:34:33.673641952Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"6b8721530830112919d89a17ae7f5339a40219a94a3285113f9fcd8bc80d18c3\" pid:5700 exit_status:1 exited_at:{seconds:1747366473 nanos:673153926}" May 16 03:34:34.122277 sshd[5684]: Connection closed by 172.24.4.1 port 38262 May 16 03:34:34.121599 sshd-session[5679]: pam_unix(sshd:session): session closed for user core May 16 03:34:34.128727 systemd[1]: sshd@14-172.24.4.18:22-172.24.4.1:38262.service: Deactivated successfully. May 16 03:34:34.132476 systemd[1]: session-17.scope: Deactivated successfully. May 16 03:34:34.134013 systemd-logind[1458]: Session 17 logged out. Waiting for processes to exit. May 16 03:34:34.135844 systemd-logind[1458]: Removed session 17. 
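Between the actual pull attempts (the containerd PullImage / "fetch failed" pairs), the kubelet entries are all ImagePullBackOff: the pod worker re-syncs, sees the image still cannot be pulled, and waits out an exponentially growing delay before retrying, which is why the ErrImagePull lines get further apart while the Back-off lines keep repeating. A toy sketch of that capped exponential backoff; the 10 s start, 2x factor, and 5 min cap are kubelet's documented defaults and do not appear literally in this log:

```python
# Toy sketch of a capped exponential image-pull backoff; 10s/2x/300s are
# kubelet's documented defaults and are not printed anywhere in this log.
def backoff_delays(initial_s: float = 10.0, factor: float = 2.0, cap_s: float = 300.0):
    """Yield successive retry delays until they saturate at the cap."""
    delay = initial_s
    while True:
        yield min(delay, cap_s)
        delay = min(delay * factor, cap_s)

delays = backoff_delays()
print([next(delays) for _ in range(7)])  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]
```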
May 16 03:34:36.278879 containerd[1476]: time="2025-05-16T03:34:36.278657719Z" level=warning msg="container event discarded" container=f6795a17a1c56d2f50a2a41bda7254bc663a9503616f8c37f30a2bdff32d2a55 type=CONTAINER_CREATED_EVENT May 16 03:34:36.278879 containerd[1476]: time="2025-05-16T03:34:36.278827667Z" level=warning msg="container event discarded" container=f6795a17a1c56d2f50a2a41bda7254bc663a9503616f8c37f30a2bdff32d2a55 type=CONTAINER_STARTED_EVENT May 16 03:34:36.581143 containerd[1476]: time="2025-05-16T03:34:36.580722801Z" level=warning msg="container event discarded" container=9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994 type=CONTAINER_CREATED_EVENT May 16 03:34:36.581143 containerd[1476]: time="2025-05-16T03:34:36.580811196Z" level=warning msg="container event discarded" container=9d7f50032733af504fe12d3345a33b7d3ff2d238a9ef730c1a5afbd6fe910994 type=CONTAINER_STARTED_EVENT May 16 03:34:39.023739 kubelet[2715]: E0516 03:34:39.023583 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:34:39.152212 systemd[1]: Started sshd@15-172.24.4.18:22-172.24.4.1:42332.service - OpenSSH per-connection server daemon (172.24.4.1:42332). May 16 03:34:39.891334 containerd[1476]: time="2025-05-16T03:34:39.891209714Z" level=warning msg="container event discarded" container=a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d type=CONTAINER_CREATED_EVENT May 16 03:34:39.981951 containerd[1476]: time="2025-05-16T03:34:39.981755678Z" level=warning msg="container event discarded" container=a834d1e63b749457e1c042f6592e21caa32ae027aad8bbdd6eec369cb961444d type=CONTAINER_STARTED_EVENT May 16 03:34:40.464340 sshd[5724]: Accepted publickey for core from 172.24.4.1 port 42332 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:40.469581 sshd-session[5724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:40.492453 systemd-logind[1458]: New session 18 of user core. May 16 03:34:40.503373 systemd[1]: Started session-18.scope - Session 18 of User core. 
May 16 03:34:41.024527 kubelet[2715]: E0516 03:34:41.024287 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:34:41.452333 sshd[5726]: Connection closed by 172.24.4.1 port 42332 May 16 03:34:41.453270 sshd-session[5724]: pam_unix(sshd:session): session closed for user core May 16 03:34:41.458426 systemd[1]: sshd@15-172.24.4.18:22-172.24.4.1:42332.service: Deactivated successfully. May 16 03:34:41.461515 systemd[1]: session-18.scope: Deactivated successfully. May 16 03:34:41.462591 systemd-logind[1458]: Session 18 logged out. Waiting for processes to exit. May 16 03:34:41.464709 systemd-logind[1458]: Removed session 18. May 16 03:34:42.077157 containerd[1476]: time="2025-05-16T03:34:42.077070493Z" level=warning msg="container event discarded" container=6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17 type=CONTAINER_CREATED_EVENT May 16 03:34:42.158463 containerd[1476]: time="2025-05-16T03:34:42.158363455Z" level=warning msg="container event discarded" container=6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17 type=CONTAINER_STARTED_EVENT May 16 03:34:42.915579 containerd[1476]: time="2025-05-16T03:34:42.915374800Z" level=warning msg="container event discarded" container=6642d847616d2fe9164bd9eda6b5294edc68db8ce126d1ae18f081a31c8f7e17 type=CONTAINER_STOPPED_EVENT May 16 03:34:45.550237 containerd[1476]: time="2025-05-16T03:34:45.550137856Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"bb2a0840402c88d11563734692546aaf2feb044ab0e5d34aca59bcdafb5e5dfd\" pid:5757 exited_at:{seconds:1747366485 nanos:549279564}" May 16 03:34:46.493763 systemd[1]: Started sshd@16-172.24.4.18:22-172.24.4.1:50002.service - OpenSSH per-connection server daemon (172.24.4.1:50002). 
May 16 03:34:47.297641 containerd[1476]: time="2025-05-16T03:34:47.297555245Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"6d902ac647aedbc02b665b637b92ca8634d57c21d7abdddbadbafd3c0ca198b3\" pid:5782 exited_at:{seconds:1747366487 nanos:296752679}" May 16 03:34:47.717918 sshd[5767]: Accepted publickey for core from 172.24.4.1 port 50002 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:47.720213 sshd-session[5767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:47.765497 systemd-logind[1458]: New session 19 of user core. May 16 03:34:47.778346 systemd[1]: Started session-19.scope - Session 19 of User core. May 16 03:34:48.417629 sshd[5791]: Connection closed by 172.24.4.1 port 50002 May 16 03:34:48.419165 sshd-session[5767]: pam_unix(sshd:session): session closed for user core May 16 03:34:48.432251 systemd[1]: Started sshd@17-172.24.4.18:22-172.24.4.1:50008.service - OpenSSH per-connection server daemon (172.24.4.1:50008). May 16 03:34:48.432789 systemd[1]: sshd@16-172.24.4.18:22-172.24.4.1:50002.service: Deactivated successfully. May 16 03:34:48.436813 systemd[1]: session-19.scope: Deactivated successfully. May 16 03:34:48.442110 systemd-logind[1458]: Session 19 logged out. Waiting for processes to exit. May 16 03:34:48.447134 systemd-logind[1458]: Removed session 19. May 16 03:34:48.518795 containerd[1476]: time="2025-05-16T03:34:48.518608221Z" level=warning msg="container event discarded" container=2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6 type=CONTAINER_CREATED_EVENT May 16 03:34:48.615306 containerd[1476]: time="2025-05-16T03:34:48.615209544Z" level=warning msg="container event discarded" container=2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6 type=CONTAINER_STARTED_EVENT May 16 03:34:49.796716 sshd[5813]: Accepted publickey for core from 172.24.4.1 port 50008 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:49.799735 sshd-session[5813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:49.807100 systemd-logind[1458]: New session 20 of user core. May 16 03:34:49.815155 systemd[1]: Started session-20.scope - Session 20 of User core. May 16 03:34:51.186241 containerd[1476]: time="2025-05-16T03:34:51.186116644Z" level=warning msg="container event discarded" container=2863aa1bbfbdccccd428381c2c96b6406b966cd4a6d6960b4ca5e27fff1518f6 type=CONTAINER_STOPPED_EVENT May 16 03:34:51.190836 sshd[5818]: Connection closed by 172.24.4.1 port 50008 May 16 03:34:51.192325 sshd-session[5813]: pam_unix(sshd:session): session closed for user core May 16 03:34:51.214699 systemd[1]: sshd@17-172.24.4.18:22-172.24.4.1:50008.service: Deactivated successfully. May 16 03:34:51.221598 systemd[1]: session-20.scope: Deactivated successfully. May 16 03:34:51.224612 systemd-logind[1458]: Session 20 logged out. Waiting for processes to exit. May 16 03:34:51.232556 systemd[1]: Started sshd@18-172.24.4.18:22-172.24.4.1:50022.service - OpenSSH per-connection server daemon (172.24.4.1:50022). May 16 03:34:51.236672 systemd-logind[1458]: Removed session 20. 
May 16 03:34:52.029050 kubelet[2715]: E0516 03:34:52.028453 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:34:52.351452 sshd[5828]: Accepted publickey for core from 172.24.4.1 port 50022 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:52.354639 sshd-session[5828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:52.369130 systemd-logind[1458]: New session 21 of user core. May 16 03:34:52.377356 systemd[1]: Started session-21.scope - Session 21 of User core. May 16 03:34:54.530189 sshd[5831]: Connection closed by 172.24.4.1 port 50022 May 16 03:34:54.531222 sshd-session[5828]: pam_unix(sshd:session): session closed for user core May 16 03:34:54.549683 systemd[1]: sshd@18-172.24.4.18:22-172.24.4.1:50022.service: Deactivated successfully. May 16 03:34:54.553349 systemd[1]: session-21.scope: Deactivated successfully. May 16 03:34:54.559546 systemd-logind[1458]: Session 21 logged out. Waiting for processes to exit. May 16 03:34:54.563221 systemd-logind[1458]: Removed session 21. May 16 03:34:54.567361 systemd[1]: Started sshd@19-172.24.4.18:22-172.24.4.1:51212.service - OpenSSH per-connection server daemon (172.24.4.1:51212). May 16 03:34:55.023625 kubelet[2715]: E0516 03:34:55.023405 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:34:55.838171 sshd[5847]: Accepted publickey for core from 172.24.4.1 port 51212 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:55.842693 sshd-session[5847]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:55.858360 systemd-logind[1458]: New session 22 of user core. May 16 03:34:55.869373 systemd[1]: Started session-22.scope - Session 22 of User core. 
May 16 03:34:56.847754 sshd[5850]: Connection closed by 172.24.4.1 port 51212 May 16 03:34:56.848142 sshd-session[5847]: pam_unix(sshd:session): session closed for user core May 16 03:34:56.868147 systemd[1]: sshd@19-172.24.4.18:22-172.24.4.1:51212.service: Deactivated successfully. May 16 03:34:56.873446 systemd[1]: session-22.scope: Deactivated successfully. May 16 03:34:56.875080 systemd-logind[1458]: Session 22 logged out. Waiting for processes to exit. May 16 03:34:56.880866 systemd[1]: Started sshd@20-172.24.4.18:22-172.24.4.1:51214.service - OpenSSH per-connection server daemon (172.24.4.1:51214). May 16 03:34:56.885587 systemd-logind[1458]: Removed session 22. May 16 03:34:58.062138 sshd[5859]: Accepted publickey for core from 172.24.4.1 port 51214 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:34:58.066320 sshd-session[5859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:34:58.102579 systemd-logind[1458]: New session 23 of user core. May 16 03:34:58.113507 systemd[1]: Started session-23.scope - Session 23 of User core. May 16 03:34:58.840140 sshd[5862]: Connection closed by 172.24.4.1 port 51214 May 16 03:34:58.841467 sshd-session[5859]: pam_unix(sshd:session): session closed for user core May 16 03:34:58.849973 systemd[1]: sshd@20-172.24.4.18:22-172.24.4.1:51214.service: Deactivated successfully. May 16 03:34:58.858113 systemd[1]: session-23.scope: Deactivated successfully. May 16 03:34:58.862205 systemd-logind[1458]: Session 23 logged out. Waiting for processes to exit. May 16 03:34:58.866367 systemd-logind[1458]: Removed session 23. May 16 03:35:02.595474 containerd[1476]: time="2025-05-16T03:35:02.595318964Z" level=warning msg="container event discarded" container=bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21 type=CONTAINER_CREATED_EVENT May 16 03:35:02.770606 containerd[1476]: time="2025-05-16T03:35:02.770465239Z" level=warning msg="container event discarded" container=bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21 type=CONTAINER_STARTED_EVENT May 16 03:35:03.023368 kubelet[2715]: E0516 03:35:03.022935 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:35:03.707421 containerd[1476]: time="2025-05-16T03:35:03.707240347Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"6bdb1844f6d7146a3f4122968f170b1d8999c7601d342decebfdc566cab378ea\" pid:5888 exited_at:{seconds:1747366503 nanos:706691937}" May 16 03:35:03.862897 systemd[1]: Started sshd@21-172.24.4.18:22-172.24.4.1:38808.service - OpenSSH per-connection server daemon (172.24.4.1:38808). 
May 16 03:35:04.349483 containerd[1476]: time="2025-05-16T03:35:04.349298971Z" level=warning msg="container event discarded" container=f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819 type=CONTAINER_CREATED_EVENT May 16 03:35:04.349483 containerd[1476]: time="2025-05-16T03:35:04.349466997Z" level=warning msg="container event discarded" container=f0b57d3614033efd8038a87931246568a29029ab91f3d461aa31eb4ff9d25819 type=CONTAINER_STARTED_EVENT May 16 03:35:05.176162 sshd[5901]: Accepted publickey for core from 172.24.4.1 port 38808 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:35:05.181570 sshd-session[5901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:35:05.200091 systemd-logind[1458]: New session 24 of user core. May 16 03:35:05.204342 systemd[1]: Started session-24.scope - Session 24 of User core. May 16 03:35:05.612889 containerd[1476]: time="2025-05-16T03:35:05.612761700Z" level=warning msg="container event discarded" container=ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7 type=CONTAINER_CREATED_EVENT May 16 03:35:05.612889 containerd[1476]: time="2025-05-16T03:35:05.612864834Z" level=warning msg="container event discarded" container=ff98b2fe86c442a3a9379b4a188d1ee56461b0f791b81415e941cbf4f3f447c7 type=CONTAINER_STARTED_EVENT May 16 03:35:05.963667 sshd[5903]: Connection closed by 172.24.4.1 port 38808 May 16 03:35:05.963261 sshd-session[5901]: pam_unix(sshd:session): session closed for user core May 16 03:35:05.969885 systemd[1]: sshd@21-172.24.4.18:22-172.24.4.1:38808.service: Deactivated successfully. May 16 03:35:05.978153 systemd[1]: session-24.scope: Deactivated successfully. May 16 03:35:05.980087 systemd-logind[1458]: Session 24 logged out. Waiting for processes to exit. May 16 03:35:05.981767 systemd-logind[1458]: Removed session 24. 
May 16 03:35:07.504541 containerd[1476]: time="2025-05-16T03:35:07.504434492Z" level=warning msg="container event discarded" container=89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e type=CONTAINER_CREATED_EVENT May 16 03:35:07.504541 containerd[1476]: time="2025-05-16T03:35:07.504488042Z" level=warning msg="container event discarded" container=89319fc2b7460564f188070034b32bc5c60dd981de9d83a86aa10306960bdf9e type=CONTAINER_STARTED_EVENT May 16 03:35:07.550755 containerd[1476]: time="2025-05-16T03:35:07.550667597Z" level=warning msg="container event discarded" container=f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3 type=CONTAINER_CREATED_EVENT May 16 03:35:07.630270 containerd[1476]: time="2025-05-16T03:35:07.630180817Z" level=warning msg="container event discarded" container=f284e2aee57741212f6bed34f6be980acef236f021794573e1eb1456c2e37df3 type=CONTAINER_STARTED_EVENT May 16 03:35:07.693725 containerd[1476]: time="2025-05-16T03:35:07.693536570Z" level=warning msg="container event discarded" container=4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92 type=CONTAINER_CREATED_EVENT May 16 03:35:07.693725 containerd[1476]: time="2025-05-16T03:35:07.693648921Z" level=warning msg="container event discarded" container=4f1b34423daa5a3882034f1235b0240a641a8d8e5ebec4a4e4f1ec9d6f523e92 type=CONTAINER_STARTED_EVENT May 16 03:35:08.198622 containerd[1476]: time="2025-05-16T03:35:08.198475005Z" level=warning msg="container event discarded" container=d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c type=CONTAINER_CREATED_EVENT May 16 03:35:08.351251 containerd[1476]: time="2025-05-16T03:35:08.351103713Z" level=warning msg="container event discarded" container=d4cfa518291ececa5c63eec8c09a8956a638541603167ea6e6f565977c71924c type=CONTAINER_STARTED_EVENT May 16 03:35:09.025434 kubelet[2715]: E0516 03:35:09.025187 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:35:10.990233 systemd[1]: Started sshd@22-172.24.4.18:22-172.24.4.1:38820.service - OpenSSH per-connection server daemon (172.24.4.1:38820). 
May 16 03:35:12.164558 sshd[5915]: Accepted publickey for core from 172.24.4.1 port 38820 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:35:12.167718 sshd-session[5915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:35:12.177588 systemd-logind[1458]: New session 25 of user core. May 16 03:35:12.184388 systemd[1]: Started session-25.scope - Session 25 of User core. May 16 03:35:12.940903 sshd[5917]: Connection closed by 172.24.4.1 port 38820 May 16 03:35:12.944646 sshd-session[5915]: pam_unix(sshd:session): session closed for user core May 16 03:35:12.953599 systemd-logind[1458]: Session 25 logged out. Waiting for processes to exit. May 16 03:35:12.955736 systemd[1]: sshd@22-172.24.4.18:22-172.24.4.1:38820.service: Deactivated successfully. May 16 03:35:12.968339 systemd[1]: session-25.scope: Deactivated successfully. May 16 03:35:12.977309 systemd-logind[1458]: Removed session 25. May 16 03:35:13.599022 containerd[1476]: time="2025-05-16T03:35:13.596947009Z" level=warning msg="container event discarded" container=679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa type=CONTAINER_CREATED_EVENT May 16 03:35:13.599022 containerd[1476]: time="2025-05-16T03:35:13.597266089Z" level=warning msg="container event discarded" container=679133f3240aee94fe5883fc0b549f3bf1dac321a2351be0b59329b229ff63aa type=CONTAINER_STARTED_EVENT May 16 03:35:14.951045 containerd[1476]: time="2025-05-16T03:35:14.950911107Z" level=warning msg="container event discarded" container=6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02 type=CONTAINER_CREATED_EVENT May 16 03:35:15.043127 containerd[1476]: time="2025-05-16T03:35:15.042941990Z" level=warning msg="container event discarded" container=6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02 type=CONTAINER_STARTED_EVENT May 16 03:35:15.718166 containerd[1476]: time="2025-05-16T03:35:15.718102764Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"820636056cb156bcc5f44bbcc87fc133ded6995f2e890970c4fe3d7226c953b2\" pid:5942 exited_at:{seconds:1747366515 nanos:717355271}" May 16 03:35:17.647019 containerd[1476]: time="2025-05-16T03:35:17.646298088Z" level=warning msg="container event discarded" container=d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e type=CONTAINER_CREATED_EVENT May 16 03:35:17.647019 containerd[1476]: time="2025-05-16T03:35:17.646381334Z" level=warning msg="container event discarded" container=d5ce531e2f5c104d0a1f74faaa157e52fd9bcb87afeb22efaef2be2d063f006e type=CONTAINER_STARTED_EVENT May 16 03:35:17.719719 containerd[1476]: time="2025-05-16T03:35:17.719628649Z" level=warning msg="container event discarded" container=5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452 type=CONTAINER_CREATED_EVENT May 16 03:35:17.736230 containerd[1476]: time="2025-05-16T03:35:17.736130151Z" level=warning msg="container event discarded" container=8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91 type=CONTAINER_CREATED_EVENT May 16 03:35:17.736230 containerd[1476]: time="2025-05-16T03:35:17.736193710Z" level=warning msg="container event discarded" container=8d3e527b79ace0ab0defaab4f9ca05dab406a9930b366f078c502e5f703e9c91 type=CONTAINER_STARTED_EVENT May 16 03:35:17.814416 containerd[1476]: time="2025-05-16T03:35:17.814330706Z" level=warning msg="container event discarded" 
container=5ef3d88d8f21ebc9d486f3e825296892495869ace657a0c6c86a26b00aaf2452 type=CONTAINER_STARTED_EVENT May 16 03:35:17.952799 containerd[1476]: time="2025-05-16T03:35:17.952657470Z" level=warning msg="container event discarded" container=dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14 type=CONTAINER_CREATED_EVENT May 16 03:35:17.957765 systemd[1]: Started sshd@23-172.24.4.18:22-172.24.4.1:46238.service - OpenSSH per-connection server daemon (172.24.4.1:46238). May 16 03:35:18.023575 kubelet[2715]: E0516 03:35:18.023455 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:35:18.050822 containerd[1476]: time="2025-05-16T03:35:18.050726549Z" level=warning msg="container event discarded" container=dca1141edafcd1c7ace4d7026130bbf55e1579a0ebf1d02bc97df400c598ce14 type=CONTAINER_STARTED_EVENT May 16 03:35:18.427242 containerd[1476]: time="2025-05-16T03:35:18.427066285Z" level=warning msg="container event discarded" container=87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1 type=CONTAINER_CREATED_EVENT May 16 03:35:18.427242 containerd[1476]: time="2025-05-16T03:35:18.427178465Z" level=warning msg="container event discarded" container=87a2f2f11d54e0f84661b1b5d1129fc2eac9fc41452d149e3de5960bcf6554b1 type=CONTAINER_STARTED_EVENT May 16 03:35:19.296453 sshd[5952]: Accepted publickey for core from 172.24.4.1 port 46238 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:35:19.300135 sshd-session[5952]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:35:19.317153 systemd-logind[1458]: New session 26 of user core. May 16 03:35:19.323360 systemd[1]: Started session-26.scope - Session 26 of User core. May 16 03:35:20.009618 sshd[5954]: Connection closed by 172.24.4.1 port 46238 May 16 03:35:20.008864 sshd-session[5952]: pam_unix(sshd:session): session closed for user core May 16 03:35:20.015380 systemd[1]: sshd@23-172.24.4.18:22-172.24.4.1:46238.service: Deactivated successfully. May 16 03:35:20.022498 systemd[1]: session-26.scope: Deactivated successfully. May 16 03:35:20.025477 systemd-logind[1458]: Session 26 logged out. Waiting for processes to exit. May 16 03:35:20.029754 systemd-logind[1458]: Removed session 26. 
May 16 03:35:23.195752 containerd[1476]: time="2025-05-16T03:35:23.195606944Z" level=warning msg="container event discarded" container=36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6 type=CONTAINER_CREATED_EVENT May 16 03:35:23.384417 containerd[1476]: time="2025-05-16T03:35:23.384178815Z" level=warning msg="container event discarded" container=36426fd7141f7dc2d0eea003e008e0e3bfa8bd97c8bc5f17110baba688bd19c6 type=CONTAINER_STARTED_EVENT May 16 03:35:23.775240 containerd[1476]: time="2025-05-16T03:35:23.775049163Z" level=warning msg="container event discarded" container=449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699 type=CONTAINER_CREATED_EVENT May 16 03:35:23.955871 containerd[1476]: time="2025-05-16T03:35:23.955573956Z" level=warning msg="container event discarded" container=449db808f5f65fbaa071976d3b6fb2bd0185103764eccf1c2e64b98215702699 type=CONTAINER_STARTED_EVENT May 16 03:35:24.031450 kubelet[2715]: E0516 03:35:24.030767 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:35:25.047648 systemd[1]: Started sshd@24-172.24.4.18:22-172.24.4.1:52618.service - OpenSSH per-connection server daemon (172.24.4.1:52618). May 16 03:35:26.224069 sshd[5968]: Accepted publickey for core from 172.24.4.1 port 52618 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:35:26.228672 sshd-session[5968]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:35:26.253379 systemd-logind[1458]: New session 27 of user core. May 16 03:35:26.259433 systemd[1]: Started session-27.scope - Session 27 of User core. May 16 03:35:26.982218 sshd[5976]: Connection closed by 172.24.4.1 port 52618 May 16 03:35:26.980505 sshd-session[5968]: pam_unix(sshd:session): session closed for user core May 16 03:35:27.000261 systemd[1]: sshd@24-172.24.4.18:22-172.24.4.1:52618.service: Deactivated successfully. May 16 03:35:27.009967 systemd[1]: session-27.scope: Deactivated successfully. May 16 03:35:27.012546 systemd-logind[1458]: Session 27 logged out. Waiting for processes to exit. May 16 03:35:27.017688 systemd-logind[1458]: Removed session 27. 
May 16 03:35:29.026232 kubelet[2715]: E0516 03:35:29.025908 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:35:32.016201 systemd[1]: Started sshd@25-172.24.4.18:22-172.24.4.1:52622.service - OpenSSH per-connection server daemon (172.24.4.1:52622). May 16 03:35:33.180007 sshd[5988]: Accepted publickey for core from 172.24.4.1 port 52622 ssh2: RSA SHA256:owno7cXPe7mCZ8El09DavZ3D/t1OlRFkGW4Z9BoK2co May 16 03:35:33.183566 sshd-session[5988]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 03:35:33.192840 systemd-logind[1458]: New session 28 of user core. May 16 03:35:33.197392 systemd[1]: Started session-28.scope - Session 28 of User core. May 16 03:35:33.725499 containerd[1476]: time="2025-05-16T03:35:33.725120984Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"b2e9df13ee7227c7a18bea28d26518acf1e1857008aa8ebf8fdec54e215fdb0e\" pid:6002 exited_at:{seconds:1747366533 nanos:724353413}" May 16 03:35:33.969695 sshd[5990]: Connection closed by 172.24.4.1 port 52622 May 16 03:35:33.972031 sshd-session[5988]: pam_unix(sshd:session): session closed for user core May 16 03:35:33.982666 systemd[1]: sshd@25-172.24.4.18:22-172.24.4.1:52622.service: Deactivated successfully. May 16 03:35:33.982942 systemd-logind[1458]: Session 28 logged out. Waiting for processes to exit. May 16 03:35:33.990876 systemd[1]: session-28.scope: Deactivated successfully. May 16 03:35:33.998974 systemd-logind[1458]: Removed session 28. 
May 16 03:35:39.025213 kubelet[2715]: E0516 03:35:39.024515 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:35:43.022727 kubelet[2715]: E0516 03:35:43.022516 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:35:45.536196 containerd[1476]: time="2025-05-16T03:35:45.536041480Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"f2c93c1d06b0437fd1b9cea7713a8ee187d1f93f9d5469ed70b41273ec0d4c62\" pid:6036 exited_at:{seconds:1747366545 nanos:535256295}" May 16 03:35:47.258596 containerd[1476]: time="2025-05-16T03:35:47.258539375Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"4374aad5c2f91aee051adab55cfd21637de1e05999fd6f8509883e2755edcae8\" pid:6059 exited_at:{seconds:1747366547 nanos:258174720}" May 16 03:35:50.023840 containerd[1476]: time="2025-05-16T03:35:50.023329210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 03:35:50.426577 containerd[1476]: time="2025-05-16T03:35:50.426143622Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:35:50.428759 containerd[1476]: time="2025-05-16T03:35:50.428524922Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 03:35:50.428936 containerd[1476]: time="2025-05-16T03:35:50.428540392Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc 
= failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:35:50.429888 kubelet[2715]: E0516 03:35:50.429664 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:35:50.432090 kubelet[2715]: E0516 03:35:50.430046 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 03:35:50.432090 kubelet[2715]: E0516 03:35:50.430861 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b39ead48ab84d5997218c5ff179c936,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:35:50.434791 containerd[1476]: time="2025-05-16T03:35:50.434292240Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 03:35:50.789697 containerd[1476]: time="2025-05-16T03:35:50.789240551Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:35:50.792577 containerd[1476]: time="2025-05-16T03:35:50.792237488Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:35:50.792577 containerd[1476]: time="2025-05-16T03:35:50.792291110Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 03:35:50.793930 kubelet[2715]: E0516 03:35:50.793275 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:35:50.793930 kubelet[2715]: E0516 03:35:50.793410 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 03:35:50.793930 kubelet[2715]: E0516 03:35:50.793721 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79cdbd85dd-m22sm_calico-system(32870edc-65ff-47a6-9110-9ba1fe628ed6): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:35:50.795631 kubelet[2715]: E0516 03:35:50.795512 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:35:56.023474 containerd[1476]: time="2025-05-16T03:35:56.023327040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 
16 03:35:56.404524 containerd[1476]: time="2025-05-16T03:35:56.404381897Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 03:35:56.409950 containerd[1476]: time="2025-05-16T03:35:56.409608050Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 03:35:56.409950 containerd[1476]: time="2025-05-16T03:35:56.409833813Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 03:35:56.411932 kubelet[2715]: E0516 03:35:56.410920 2715 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:35:56.411932 kubelet[2715]: E0516 03:35:56.411355 2715 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 03:35:56.416093 kubelet[2715]: E0516 03:35:56.414785 2715 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-4kgrm_calico-system(c0064a9e-4be1-4ce0-a21f-9e78adbef175): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 03:35:56.418324 kubelet[2715]: E0516 03:35:56.418226 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:36:03.641441 containerd[1476]: time="2025-05-16T03:36:03.641390680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"6bf0681c16a13fffffc6f304373b8b2fc4800e97c14d014d06d45becc734bf57\" pid:6083 exited_at:{seconds:1747366563 nanos:640674725}" May 16 03:36:05.025772 kubelet[2715]: E0516 03:36:05.025221 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:36:08.024448 kubelet[2715]: E0516 03:36:08.024381 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:36:15.538336 containerd[1476]: time="2025-05-16T03:36:15.538202605Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"0eb117bdf447aa586d104ccfc7b2a25b88de3fdd599875a6bda1a8797c8fda7a\" pid:6108 exited_at:{seconds:1747366575 nanos:536521339}" May 16 03:36:17.027854 kubelet[2715]: E0516 03:36:17.027375 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:36:20.027846 kubelet[2715]: E0516 03:36:20.024941 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:36:30.028209 kubelet[2715]: E0516 03:36:30.026681 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:36:33.737973 containerd[1476]: time="2025-05-16T03:36:33.737797098Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"4b84439bf8aa5a72cf210316af66a617b68a7e23b83b3b60b08d30817da96157\" pid:6151 exited_at:{seconds:1747366593 nanos:736686412}" May 16 03:36:35.024475 kubelet[2715]: E0516 03:36:35.023577 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:36:43.024710 kubelet[2715]: E0516 03:36:43.024521 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:36:45.561617 containerd[1476]: time="2025-05-16T03:36:45.561503139Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"20d56a81c8380efe6798f458eca25c1e340c49ef97b44be789e7aacdb3ea0fce\" pid:6176 exited_at:{seconds:1747366605 nanos:560238014}" May 16 03:36:47.284041 containerd[1476]: time="2025-05-16T03:36:47.283919200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"399b46fb61d7a8442563d358d5c7ea0b4912de03935f2ec736e4125ae6c4f782\" pid:6197 exited_at:{seconds:1747366607 nanos:283646819}" May 16 03:36:48.023867 kubelet[2715]: E0516 03:36:48.022972 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:36:54.026112 kubelet[2715]: E0516 03:36:54.025898 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:36:59.023765 kubelet[2715]: E0516 03:36:59.023620 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:37:03.725369 containerd[1476]: time="2025-05-16T03:37:03.725125215Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"d5745b4a39bdc145f03d90e43386b56dc63313cb6f8d8b0c8b7f28d8187a9dfd\" pid:6221 exited_at:{seconds:1747366623 nanos:724443153}" May 16 03:37:08.023177 kubelet[2715]: E0516 03:37:08.022607 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:37:11.024265 kubelet[2715]: E0516 03:37:11.024169 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:37:15.558596 containerd[1476]: time="2025-05-16T03:37:15.558546433Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a8d897ef03a0dba8006317e0dffeaa658479644e84548230f60530cff0ffc02\" id:\"f8bbcd216d54113b97a3ea8a5e9659c9c188df753ffdf7a7a6fbf01eba010ff0\" pid:6248 exited_at:{seconds:1747366635 nanos:557933030}" May 16 03:37:22.026067 kubelet[2715]: E0516 03:37:22.025797 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:37:24.027144 kubelet[2715]: E0516 03:37:24.027029 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175" May 16 03:37:33.700108 containerd[1476]: time="2025-05-16T03:37:33.700025316Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb6824489427df1bf4a3cfb10861a309edef7182c904350a7c6c7ed4136baf21\" id:\"951e45d330c07e84a3f8dedbe7ed27a2180bb3f8b324d296945007622a63b4e0\" pid:6271 exited_at:{seconds:1747366653 nanos:699362871}" May 16 03:37:34.027494 kubelet[2715]: E0516 03:37:34.026115 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79cdbd85dd-m22sm" podUID="32870edc-65ff-47a6-9110-9ba1fe628ed6" May 16 03:37:37.022593 kubelet[2715]: E0516 03:37:37.021801 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-4kgrm" podUID="c0064a9e-4be1-4ce0-a21f-9e78adbef175"