Mar 13 00:36:58.883862 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 12 22:08:29 -00 2026
Mar 13 00:36:58.883881 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:36:58.883888 kernel: BIOS-provided physical RAM map:
Mar 13 00:36:58.883893 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 13 00:36:58.883901 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Mar 13 00:36:58.883905 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 13 00:36:58.883911 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 13 00:36:58.883915 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 13 00:36:58.883920 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 13 00:36:58.883924 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 13 00:36:58.883929 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 13 00:36:58.883934 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 13 00:36:58.883939 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 13 00:36:58.883946 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 13 00:36:58.883952 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 13 00:36:58.883957 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 13 00:36:58.883961 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 13 00:36:58.883966 kernel: NX (Execute Disable) protection: active
Mar 13 00:36:58.883973 kernel: APIC: Static calls initialized
Mar 13 00:36:58.883978 kernel: e820: update [mem 0x7dfac018-0x7dfb5a57] usable ==> usable
Mar 13 00:36:58.883983 kernel: e820: update [mem 0x7df70018-0x7dfab657] usable ==> usable
Mar 13 00:36:58.883988 kernel: e820: update [mem 0x7df34018-0x7df6f657] usable ==> usable
Mar 13 00:36:58.883993 kernel: extended physical RAM map:
Mar 13 00:36:58.883998 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 13 00:36:58.884003 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007df34017] usable
Mar 13 00:36:58.884007 kernel: reserve setup_data: [mem 0x000000007df34018-0x000000007df6f657] usable
Mar 13 00:36:58.884012 kernel: reserve setup_data: [mem 0x000000007df6f658-0x000000007df70017] usable
Mar 13 00:36:58.884017 kernel: reserve setup_data: [mem 0x000000007df70018-0x000000007dfab657] usable
Mar 13 00:36:58.884022 kernel: reserve setup_data: [mem 0x000000007dfab658-0x000000007dfac017] usable
Mar 13 00:36:58.884029 kernel: reserve setup_data: [mem 0x000000007dfac018-0x000000007dfb5a57] usable
Mar 13 00:36:58.884034 kernel: reserve setup_data: [mem 0x000000007dfb5a58-0x000000007ed3efff] usable
Mar 13 00:36:58.884038 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 13 00:36:58.884043 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 13 00:36:58.884048 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 13 00:36:58.884053 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 13 00:36:58.884058 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 13 00:36:58.884063 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 13 00:36:58.884068 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 13 00:36:58.884072 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 13 00:36:58.884078 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 13 00:36:58.884087 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 13 00:36:58.884093 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 13 00:36:58.884098 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 13 00:36:58.884103 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 13 00:36:58.884108 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e845198 RNG=0x7fb73018
Mar 13 00:36:58.884115 kernel: random: crng init done
Mar 13 00:36:58.884120 kernel: efi: Remove mem136: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 13 00:36:58.884126 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 13 00:36:58.884131 kernel: secureboot: Secure boot disabled
Mar 13 00:36:58.884136 kernel: SMBIOS 3.0.0 present.
Mar 13 00:36:58.884141 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 13 00:36:58.884146 kernel: DMI: Memory slots populated: 1/1
Mar 13 00:36:58.884151 kernel: Hypervisor detected: KVM
Mar 13 00:36:58.884156 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 13 00:36:58.884161 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 13 00:36:58.884166 kernel: kvm-clock: using sched offset of 13665761612 cycles
Mar 13 00:36:58.884173 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 13 00:36:58.884179 kernel: tsc: Detected 2399.998 MHz processor
Mar 13 00:36:58.884184 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 13 00:36:58.884189 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 13 00:36:58.884195 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Mar 13 00:36:58.884200 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 13 00:36:58.884213 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 13 00:36:58.884219 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 13 00:36:58.884224 kernel: Using GB pages for direct mapping
Mar 13 00:36:58.884231 kernel: ACPI: Early table checksum verification disabled
Mar 13 00:36:58.884236 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Mar 13 00:36:58.884242 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Mar 13 00:36:58.884247 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:36:58.884252 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:36:58.884257 kernel: ACPI: FACS 0x000000007FBDD000 000040
Mar 13 00:36:58.884262 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:36:58.884268 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:36:58.884273 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:36:58.884280 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:36:58.884285 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 13 00:36:58.884290 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Mar 13 00:36:58.884296 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Mar 13 00:36:58.884303 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Mar 13 00:36:58.884311 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Mar 13 00:36:58.884320 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Mar 13 00:36:58.884328 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Mar 13 00:36:58.884335 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Mar 13 00:36:58.884345 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Mar 13 00:36:58.884352 kernel: No NUMA configuration found
Mar 13 00:36:58.884359 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Mar 13 00:36:58.884366 kernel: NODE_DATA(0) allocated [mem 0x179ff8dc0-0x179ffffff]
Mar 13 00:36:58.884374 kernel: Zone ranges:
Mar 13 00:36:58.884381 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 13 00:36:58.884387 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 13 00:36:58.884392 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff]
Mar 13 00:36:58.884397 kernel: Device empty
Mar 13 00:36:58.884405 kernel: Movable zone start for each node
Mar 13 00:36:58.884410 kernel: Early memory node ranges
Mar 13 00:36:58.884415 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 13 00:36:58.884420 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Mar 13 00:36:58.884425 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Mar 13 00:36:58.884431 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Mar 13 00:36:58.884436 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Mar 13 00:36:58.884441 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Mar 13 00:36:58.884446 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 13 00:36:58.884451 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 13 00:36:58.884459 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 13 00:36:58.884464 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 13 00:36:58.884469 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Mar 13 00:36:58.884474 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 13 00:36:58.884480 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 13 00:36:58.884485 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 13 00:36:58.884490 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 13 00:36:58.884495 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 13 00:36:58.884500 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 13 00:36:58.884508 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 13 00:36:58.884513 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 13 00:36:58.884518 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 13 00:36:58.884523 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 13 00:36:58.884528 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 13 00:36:58.884534 kernel: CPU topo: Max. logical packages: 1
Mar 13 00:36:58.884539 kernel: CPU topo: Max. logical dies: 1
Mar 13 00:36:58.884552 kernel: CPU topo: Max. dies per package: 1
Mar 13 00:36:58.884558 kernel: CPU topo: Max. threads per core: 1
Mar 13 00:36:58.884563 kernel: CPU topo: Num. cores per package: 2
Mar 13 00:36:58.884568 kernel: CPU topo: Num. threads per package: 2
Mar 13 00:36:58.884574 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Mar 13 00:36:58.884581 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 13 00:36:58.884587 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Mar 13 00:36:58.884592 kernel: Booting paravirtualized kernel on KVM
Mar 13 00:36:58.884598 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 13 00:36:58.884603 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 13 00:36:58.884611 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Mar 13 00:36:58.884616 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Mar 13 00:36:58.884622 kernel: pcpu-alloc: [0] 0 1
Mar 13 00:36:58.884627 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 13 00:36:58.884633 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:36:58.884639 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 13 00:36:58.884644 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 13 00:36:58.884650 kernel: Fallback order for Node 0: 0
Mar 13 00:36:58.884657 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792
Mar 13 00:36:58.884663 kernel: Policy zone: Normal
Mar 13 00:36:58.884668 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 13 00:36:58.884673 kernel: software IO TLB: area num 2.
Mar 13 00:36:58.884679 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 13 00:36:58.884684 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 13 00:36:58.884690 kernel: ftrace: allocated 157 pages with 5 groups
Mar 13 00:36:58.884695 kernel: Dynamic Preempt: voluntary
Mar 13 00:36:58.884701 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 13 00:36:58.884709 kernel: rcu: RCU event tracing is enabled.
Mar 13 00:36:58.884715 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 13 00:36:58.884720 kernel: Trampoline variant of Tasks RCU enabled.
Mar 13 00:36:58.884726 kernel: Rude variant of Tasks RCU enabled.
Mar 13 00:36:58.884731 kernel: Tracing variant of Tasks RCU enabled.
Mar 13 00:36:58.884737 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 13 00:36:58.884742 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 13 00:36:58.884747 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:36:58.884753 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:36:58.884761 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:36:58.884766 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 13 00:36:58.884771 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 13 00:36:58.884777 kernel: Console: colour dummy device 80x25
Mar 13 00:36:58.884782 kernel: printk: legacy console [tty0] enabled
Mar 13 00:36:58.884788 kernel: printk: legacy console [ttyS0] enabled
Mar 13 00:36:58.884794 kernel: ACPI: Core revision 20240827
Mar 13 00:36:58.884800 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 13 00:36:58.884805 kernel: APIC: Switch to symmetric I/O mode setup
Mar 13 00:36:58.884812 kernel: x2apic enabled
Mar 13 00:36:58.884818 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 13 00:36:58.884823 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 13 00:36:58.884829 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns
Mar 13 00:36:58.884835 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998)
Mar 13 00:36:58.884840 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 13 00:36:58.886870 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 13 00:36:58.886886 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 13 00:36:58.886893 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 13 00:36:58.886902 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 13 00:36:58.886908 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 13 00:36:58.886914 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 13 00:36:58.886920 kernel: active return thunk: srso_alias_return_thunk
Mar 13 00:36:58.886925 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Mar 13 00:36:58.886931 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 13 00:36:58.886936 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 13 00:36:58.886942 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 13 00:36:58.886947 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 13 00:36:58.886955 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 13 00:36:58.886960 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 13 00:36:58.886966 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 13 00:36:58.886971 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 13 00:36:58.886984 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 13 00:36:58.886990 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 13 00:36:58.887000 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 13 00:36:58.887006 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 13 00:36:58.887021 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 13 00:36:58.887030 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Mar 13 00:36:58.887035 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 13 00:36:58.887041 kernel: Freeing SMP alternatives memory: 32K
Mar 13 00:36:58.887056 kernel: pid_max: default: 32768 minimum: 301
Mar 13 00:36:58.887066 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 13 00:36:58.887072 kernel: landlock: Up and running.
Mar 13 00:36:58.887077 kernel: SELinux: Initializing.
Mar 13 00:36:58.887083 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 13 00:36:58.887089 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 13 00:36:58.887097 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Mar 13 00:36:58.887102 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 13 00:36:58.887108 kernel: ... version: 0
Mar 13 00:36:58.887113 kernel: ... bit width: 48
Mar 13 00:36:58.887119 kernel: ... generic registers: 6
Mar 13 00:36:58.887124 kernel: ... value mask: 0000ffffffffffff
Mar 13 00:36:58.887130 kernel: ... max period: 00007fffffffffff
Mar 13 00:36:58.887135 kernel: ... fixed-purpose events: 0
Mar 13 00:36:58.887140 kernel: ... event mask: 000000000000003f
Mar 13 00:36:58.887148 kernel: signal: max sigframe size: 3376
Mar 13 00:36:58.887154 kernel: rcu: Hierarchical SRCU implementation.
Mar 13 00:36:58.887160 kernel: rcu: Max phase no-delay instances is 400.
Mar 13 00:36:58.887166 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 13 00:36:58.887171 kernel: smp: Bringing up secondary CPUs ...
Mar 13 00:36:58.887177 kernel: smpboot: x86: Booting SMP configuration:
Mar 13 00:36:58.887182 kernel: .... node #0, CPUs: #1
Mar 13 00:36:58.887188 kernel: smp: Brought up 1 node, 2 CPUs
Mar 13 00:36:58.887193 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS)
Mar 13 00:36:58.887202 kernel: Memory: 3848516K/4091168K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46200K init, 2560K bss, 237016K reserved, 0K cma-reserved)
Mar 13 00:36:58.887218 kernel: devtmpfs: initialized
Mar 13 00:36:58.887223 kernel: x86/mm: Memory block size: 128MB
Mar 13 00:36:58.887229 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Mar 13 00:36:58.887235 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 13 00:36:58.887240 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 13 00:36:58.887246 kernel: pinctrl core: initialized pinctrl subsystem
Mar 13 00:36:58.887251 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 13 00:36:58.887256 kernel: audit: initializing netlink subsys (disabled)
Mar 13 00:36:58.887264 kernel: audit: type=2000 audit(1773362216.390:1): state=initialized audit_enabled=0 res=1
Mar 13 00:36:58.887270 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 13 00:36:58.887275 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 13 00:36:58.887281 kernel: cpuidle: using governor menu
Mar 13 00:36:58.887286 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 13 00:36:58.887291 kernel: dca service started, version 1.12.1
Mar 13 00:36:58.887297 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Mar 13 00:36:58.887302 kernel: PCI: Using configuration type 1 for base access
Mar 13 00:36:58.887308 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 13 00:36:58.887316 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 13 00:36:58.887321 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 13 00:36:58.887327 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 13 00:36:58.887332 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 13 00:36:58.887338 kernel: ACPI: Added _OSI(Module Device)
Mar 13 00:36:58.887343 kernel: ACPI: Added _OSI(Processor Device)
Mar 13 00:36:58.887349 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 13 00:36:58.887355 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 13 00:36:58.887360 kernel: ACPI: Interpreter enabled
Mar 13 00:36:58.887368 kernel: ACPI: PM: (supports S0 S5)
Mar 13 00:36:58.887373 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 13 00:36:58.887379 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 13 00:36:58.887384 kernel: PCI: Using E820 reservations for host bridge windows
Mar 13 00:36:58.887390 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 13 00:36:58.887395 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 13 00:36:58.887554 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 13 00:36:58.887694 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 13 00:36:58.887818 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 13 00:36:58.887826 kernel: PCI host bridge to bus 0000:00
Mar 13 00:36:58.887967 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 13 00:36:58.888059 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 13 00:36:58.888148 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 13 00:36:58.888249 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Mar 13 00:36:58.888340 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 13 00:36:58.888433 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Mar 13 00:36:58.888523 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 13 00:36:58.888637 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Mar 13 00:36:58.888748 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Mar 13 00:36:58.888925 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Mar 13 00:36:58.890922 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref]
Mar 13 00:36:58.891039 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff]
Mar 13 00:36:58.891140 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Mar 13 00:36:58.891249 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 13 00:36:58.891356 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.891454 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff]
Mar 13 00:36:58.891552 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 13 00:36:58.891648 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 13 00:36:58.891748 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 13 00:36:58.891871 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.891973 kernel: pci 0000:00:02.1: BAR 0 [mem 0x81388000-0x81388fff]
Mar 13 00:36:58.892070 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 13 00:36:58.892167 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 13 00:36:58.892281 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.892384 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff]
Mar 13 00:36:58.892483 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 13 00:36:58.892580 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 13 00:36:58.892676 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 13 00:36:58.892783 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.892897 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff]
Mar 13 00:36:58.892995 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 13 00:36:58.893095 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 13 00:36:58.893198 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.893305 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff]
Mar 13 00:36:58.893402 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 13 00:36:58.893500 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 13 00:36:58.893603 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 13 00:36:58.893711 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.893813 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff]
Mar 13 00:36:58.895542 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 13 00:36:58.895652 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 13 00:36:58.895752 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 13 00:36:58.896951 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.897069 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff]
Mar 13 00:36:58.897169 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 13 00:36:58.897277 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 13 00:36:58.897379 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 13 00:36:58.897486 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.897590 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff]
Mar 13 00:36:58.897690 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 13 00:36:58.897787 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 13 00:36:58.897906 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 13 00:36:58.898014 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:36:58.898111 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff]
Mar 13 00:36:58.898216 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 13 00:36:58.898313 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 13 00:36:58.898410 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 13 00:36:58.898514 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Mar 13 00:36:58.898611 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 13 00:36:58.898716 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Mar 13 00:36:58.898814 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f]
Mar 13 00:36:58.900966 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff]
Mar 13 00:36:58.901086 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Mar 13 00:36:58.901187 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f]
Mar 13 00:36:58.901310 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 13 00:36:58.901418 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff]
Mar 13 00:36:58.901520 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref]
Mar 13 00:36:58.901622 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 13 00:36:58.901721 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 13 00:36:58.901830 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 13 00:36:58.901949 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit]
Mar 13 00:36:58.902049 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 13 00:36:58.902166 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Mar 13 00:36:58.902285 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff]
Mar 13 00:36:58.902389 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref]
Mar 13 00:36:58.902487 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 13 00:36:58.902598 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 13 00:36:58.902700 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref]
Mar 13 00:36:58.902799 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 13 00:36:58.904966 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 13 00:36:58.905082 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff]
Mar 13 00:36:58.905186 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref]
Mar 13 00:36:58.905296 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 13 00:36:58.905407 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Mar 13 00:36:58.905510 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff]
Mar 13 00:36:58.905615 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref]
Mar 13 00:36:58.906915 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 13 00:36:58.906927 kernel: acpiphp: Slot [0] registered
Mar 13 00:36:58.907043 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 13 00:36:58.907148 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff]
Mar 13 00:36:58.907261 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 13 00:36:58.907364 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 13 00:36:58.907462 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 13 00:36:58.907475 kernel: acpiphp: Slot [0-2] registered
Mar 13 00:36:58.907572 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 13 00:36:58.907580 kernel: acpiphp: Slot [0-3] registered
Mar 13 00:36:58.907678 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 13 00:36:58.907687 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 13 00:36:58.907709 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 13 00:36:58.907717 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 13 00:36:58.907724 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 13 00:36:58.907736 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 13 00:36:58.907745 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 13 00:36:58.907751 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 13 00:36:58.907756 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 13 00:36:58.907765 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 13 00:36:58.907770 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 13 00:36:58.907776 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 13 00:36:58.907782 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 13 00:36:58.907788 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 13 00:36:58.907796 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 13 00:36:58.907802 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 13 00:36:58.907810 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 13 00:36:58.907816 kernel: iommu: Default domain type: Translated
Mar 13 00:36:58.907822 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 13 00:36:58.907828 kernel: efivars: Registered efivars operations
Mar 13 00:36:58.907835 kernel: PCI: Using ACPI for IRQ routing
Mar 13 00:36:58.907841 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 13 00:36:58.907871 kernel: e820: reserve RAM buffer [mem 0x7df34018-0x7fffffff]
Mar 13 00:36:58.907877 kernel: e820: reserve RAM buffer [mem 0x7df70018-0x7fffffff]
Mar 13 00:36:58.907883 kernel: e820: reserve RAM buffer [mem 0x7dfac018-0x7fffffff]
Mar 13 00:36:58.907889 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Mar 13 00:36:58.907894 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Mar 13 00:36:58.907900 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Mar 13 00:36:58.907909 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Mar 13 00:36:58.908015 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 13 00:36:58.908147 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 13 00:36:58.908258 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 13 00:36:58.908265 kernel: vgaarb: loaded
Mar 13 00:36:58.908271 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 13 00:36:58.908277 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 13 00:36:58.908283 kernel: clocksource: Switched to clocksource kvm-clock
Mar 13 00:36:58.908289 kernel: VFS: Disk quotas dquot_6.6.0
Mar 13 00:36:58.908298 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 13 00:36:58.908304 kernel: pnp: PnP ACPI init
Mar 13 00:36:58.908429 kernel:
system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Mar 13 00:36:58.908439 kernel: pnp: PnP ACPI: found 5 devices Mar 13 00:36:58.908445 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 13 00:36:58.908451 kernel: NET: Registered PF_INET protocol family Mar 13 00:36:58.908457 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 13 00:36:58.908463 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 13 00:36:58.908473 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 13 00:36:58.908479 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 13 00:36:58.908485 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 13 00:36:58.908491 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 13 00:36:58.908497 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 13 00:36:58.908503 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 13 00:36:58.908509 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 13 00:36:58.908514 kernel: NET: Registered PF_XDP protocol family Mar 13 00:36:58.908621 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Mar 13 00:36:58.908729 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Mar 13 00:36:58.908827 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Mar 13 00:36:58.908953 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Mar 13 00:36:58.909051 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Mar 13 00:36:58.910033 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Mar 13 00:36:58.910140 kernel: pci 0000:00:02.7: bridge 
window [io 0x2000-0x2fff]: assigned Mar 13 00:36:58.910251 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Mar 13 00:36:58.910356 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned Mar 13 00:36:58.910459 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Mar 13 00:36:58.910556 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Mar 13 00:36:58.910653 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 13 00:36:58.910751 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Mar 13 00:36:58.911353 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Mar 13 00:36:58.913170 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Mar 13 00:36:58.913306 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Mar 13 00:36:58.913408 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 13 00:36:58.913511 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Mar 13 00:36:58.913614 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 13 00:36:58.913713 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Mar 13 00:36:58.913810 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Mar 13 00:36:58.913946 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 13 00:36:58.914047 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Mar 13 00:36:58.914147 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Mar 13 00:36:58.914254 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Mar 13 00:36:58.914358 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Mar 13 00:36:58.914459 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Mar 13 00:36:58.914557 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Mar 13 00:36:58.914653 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Mar 13 
00:36:58.914750 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 13 00:36:58.915740 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Mar 13 00:36:58.915877 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Mar 13 00:36:58.915984 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Mar 13 00:36:58.916082 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 13 00:36:58.916181 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Mar 13 00:36:58.916288 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Mar 13 00:36:58.916386 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Mar 13 00:36:58.916484 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 13 00:36:58.916580 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 13 00:36:58.916672 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 13 00:36:58.916765 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 13 00:36:58.916871 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Mar 13 00:36:58.916964 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Mar 13 00:36:58.917074 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Mar 13 00:36:58.917216 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Mar 13 00:36:58.917349 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 13 00:36:58.917494 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Mar 13 00:36:58.917639 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Mar 13 00:36:58.917773 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 13 00:36:58.919307 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 13 00:36:58.919435 kernel: pci_bus 0000:05: resource 1 [mem 
0x80f00000-0x80ffffff] Mar 13 00:36:58.919535 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 13 00:36:58.919637 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Mar 13 00:36:58.919766 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref] Mar 13 00:36:58.920297 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Mar 13 00:36:58.920948 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Mar 13 00:36:58.921057 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 13 00:36:58.921168 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Mar 13 00:36:58.921279 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Mar 13 00:36:58.921381 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 13 00:36:58.921493 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Mar 13 00:36:58.921594 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Mar 13 00:36:58.921695 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 13 00:36:58.921703 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 13 00:36:58.921710 kernel: PCI: CLS 0 bytes, default 64 Mar 13 00:36:58.921716 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 13 00:36:58.921722 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Mar 13 00:36:58.921728 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Mar 13 00:36:58.921736 kernel: Initialise system trusted keyrings Mar 13 00:36:58.921742 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 13 00:36:58.921748 kernel: Key type asymmetric registered Mar 13 00:36:58.921754 kernel: Asymmetric key parser 'x509' registered Mar 13 00:36:58.921760 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 13 00:36:58.921766 kernel: io scheduler 
mq-deadline registered Mar 13 00:36:58.921772 kernel: io scheduler kyber registered Mar 13 00:36:58.921777 kernel: io scheduler bfq registered Mar 13 00:36:58.924896 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 13 00:36:58.925089 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 13 00:36:58.925241 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 13 00:36:58.925345 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 13 00:36:58.925447 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 13 00:36:58.925546 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 13 00:36:58.925647 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 13 00:36:58.925757 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 13 00:36:58.925872 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 13 00:36:58.925977 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 13 00:36:58.926077 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 13 00:36:58.926183 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 13 00:36:58.926296 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 13 00:36:58.926398 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 13 00:36:58.926499 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 13 00:36:58.926597 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 13 00:36:58.926609 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 13 00:36:58.926709 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Mar 13 00:36:58.926808 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Mar 13 00:36:58.926839 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 13 00:36:58.928941 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Mar 13 00:36:58.928952 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 13 00:36:58.928959 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 13 
00:36:58.928970 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 13 00:36:58.928976 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 13 00:36:58.928983 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 13 00:36:58.929109 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 13 00:36:58.929118 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 13 00:36:58.929237 kernel: rtc_cmos 00:03: registered as rtc0 Mar 13 00:36:58.929338 kernel: rtc_cmos 00:03: setting system clock to 2026-03-13T00:36:58 UTC (1773362218) Mar 13 00:36:58.929450 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 13 00:36:58.929464 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. Mar 13 00:36:58.929471 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 13 00:36:58.929477 kernel: efifb: probing for efifb Mar 13 00:36:58.929483 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Mar 13 00:36:58.929490 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 13 00:36:58.929495 kernel: efifb: scrolling: redraw Mar 13 00:36:58.929501 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 13 00:36:58.929508 kernel: Console: switching to colour frame buffer device 160x50 Mar 13 00:36:58.929517 kernel: fb0: EFI VGA frame buffer device Mar 13 00:36:58.929523 kernel: pstore: Using crash dump compression: deflate Mar 13 00:36:58.929529 kernel: pstore: Registered efi_pstore as persistent store backend Mar 13 00:36:58.929535 kernel: NET: Registered PF_INET6 protocol family Mar 13 00:36:58.929541 kernel: Segment Routing with IPv6 Mar 13 00:36:58.929547 kernel: In-situ OAM (IOAM) with IPv6 Mar 13 00:36:58.929552 kernel: NET: Registered PF_PACKET protocol family Mar 13 00:36:58.929558 kernel: Key type dns_resolver registered Mar 13 00:36:58.929564 
kernel: IPI shorthand broadcast: enabled Mar 13 00:36:58.929570 kernel: sched_clock: Marking stable (2876006847, 269656858)->(3186943545, -41279840) Mar 13 00:36:58.929578 kernel: registered taskstats version 1 Mar 13 00:36:58.929584 kernel: Loading compiled-in X.509 certificates Mar 13 00:36:58.929590 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 5aff49df330f42445474818d085d5033fee752d8' Mar 13 00:36:58.929596 kernel: Demotion targets for Node 0: null Mar 13 00:36:58.929602 kernel: Key type .fscrypt registered Mar 13 00:36:58.929611 kernel: Key type fscrypt-provisioning registered Mar 13 00:36:58.929620 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 13 00:36:58.929628 kernel: ima: Allocated hash algorithm: sha1 Mar 13 00:36:58.929640 kernel: ima: No architecture policies found Mar 13 00:36:58.929649 kernel: clk: Disabling unused clocks Mar 13 00:36:58.929658 kernel: Warning: unable to open an initial console. Mar 13 00:36:58.929664 kernel: Freeing unused kernel image (initmem) memory: 46200K Mar 13 00:36:58.929670 kernel: Write protecting the kernel read-only data: 40960k Mar 13 00:36:58.929676 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Mar 13 00:36:58.929682 kernel: Run /init as init process Mar 13 00:36:58.929688 kernel: with arguments: Mar 13 00:36:58.929694 kernel: /init Mar 13 00:36:58.929703 kernel: with environment: Mar 13 00:36:58.929709 kernel: HOME=/ Mar 13 00:36:58.929714 kernel: TERM=linux Mar 13 00:36:58.929721 systemd[1]: Successfully made /usr/ read-only. 
Mar 13 00:36:58.929730 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 13 00:36:58.929738 systemd[1]: Detected virtualization kvm. Mar 13 00:36:58.929744 systemd[1]: Detected architecture x86-64. Mar 13 00:36:58.929750 systemd[1]: Running in initrd. Mar 13 00:36:58.929758 systemd[1]: No hostname configured, using default hostname. Mar 13 00:36:58.929765 systemd[1]: Hostname set to . Mar 13 00:36:58.929771 systemd[1]: Initializing machine ID from VM UUID. Mar 13 00:36:58.929777 systemd[1]: Queued start job for default target initrd.target. Mar 13 00:36:58.929783 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 13 00:36:58.929790 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 13 00:36:58.929797 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 13 00:36:58.929803 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 13 00:36:58.929811 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 13 00:36:58.929818 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 13 00:36:58.929825 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 13 00:36:58.929831 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 13 00:36:58.929838 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Mar 13 00:36:58.929872 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 13 00:36:58.929878 systemd[1]: Reached target paths.target - Path Units. Mar 13 00:36:58.929887 systemd[1]: Reached target slices.target - Slice Units. Mar 13 00:36:58.929894 systemd[1]: Reached target swap.target - Swaps. Mar 13 00:36:58.929903 systemd[1]: Reached target timers.target - Timer Units. Mar 13 00:36:58.929912 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 13 00:36:58.929920 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 13 00:36:58.929930 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 13 00:36:58.929939 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 13 00:36:58.929946 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 13 00:36:58.929958 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 13 00:36:58.929967 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 13 00:36:58.929976 systemd[1]: Reached target sockets.target - Socket Units. Mar 13 00:36:58.929984 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 13 00:36:58.929994 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 13 00:36:58.930003 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 13 00:36:58.930012 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 13 00:36:58.930022 systemd[1]: Starting systemd-fsck-usr.service... Mar 13 00:36:58.930031 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 13 00:36:58.930045 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Mar 13 00:36:58.930053 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:36:58.930062 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 13 00:36:58.930072 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 13 00:36:58.930111 systemd-journald[198]: Collecting audit messages is disabled. Mar 13 00:36:58.930128 systemd[1]: Finished systemd-fsck-usr.service. Mar 13 00:36:58.930135 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 13 00:36:58.930142 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 13 00:36:58.930153 kernel: Bridge firewalling registered Mar 13 00:36:58.930163 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:36:58.930172 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 13 00:36:58.930182 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 13 00:36:58.930192 systemd-journald[198]: Journal started Mar 13 00:36:58.930220 systemd-journald[198]: Runtime Journal (/run/log/journal/164bf7cb66cb4ba0b60c12d9f39c55be) is 8M, max 76.1M, 68.1M free. Mar 13 00:36:58.889076 systemd-modules-load[199]: Inserted module 'overlay' Mar 13 00:36:58.923942 systemd-modules-load[199]: Inserted module 'br_netfilter' Mar 13 00:36:58.936532 systemd[1]: Started systemd-journald.service - Journal Service. Mar 13 00:36:58.939897 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 13 00:36:58.941949 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 13 00:36:58.948946 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Mar 13 00:36:58.951988 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 13 00:36:58.957884 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 13 00:36:58.964975 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 13 00:36:58.965635 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 13 00:36:58.967958 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 13 00:36:58.971378 systemd-tmpfiles[220]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Mar 13 00:36:58.978239 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 13 00:36:58.981289 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 13 00:36:58.992670 dracut-cmdline[235]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d Mar 13 00:36:59.019755 systemd-resolved[237]: Positive Trust Anchors: Mar 13 00:36:59.019767 systemd-resolved[237]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 13 00:36:59.019788 systemd-resolved[237]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 13 00:36:59.023578 systemd-resolved[237]: Defaulting to hostname 'linux'. Mar 13 00:36:59.026277 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 13 00:36:59.026736 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 13 00:36:59.071890 kernel: SCSI subsystem initialized Mar 13 00:36:59.079897 kernel: Loading iSCSI transport class v2.0-870. Mar 13 00:36:59.089892 kernel: iscsi: registered transport (tcp) Mar 13 00:36:59.107004 kernel: iscsi: registered transport (qla4xxx) Mar 13 00:36:59.107044 kernel: QLogic iSCSI HBA Driver Mar 13 00:36:59.127345 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 13 00:36:59.160941 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 13 00:36:59.162818 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 13 00:36:59.209001 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 13 00:36:59.211048 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Mar 13 00:36:59.254877 kernel: raid6: avx512x4 gen() 44893 MB/s Mar 13 00:36:59.272870 kernel: raid6: avx512x2 gen() 46463 MB/s Mar 13 00:36:59.290869 kernel: raid6: avx512x1 gen() 43303 MB/s Mar 13 00:36:59.308868 kernel: raid6: avx2x4 gen() 46435 MB/s Mar 13 00:36:59.326871 kernel: raid6: avx2x2 gen() 48858 MB/s Mar 13 00:36:59.346066 kernel: raid6: avx2x1 gen() 37324 MB/s Mar 13 00:36:59.346128 kernel: raid6: using algorithm avx2x2 gen() 48858 MB/s Mar 13 00:36:59.366102 kernel: raid6: .... xor() 35418 MB/s, rmw enabled Mar 13 00:36:59.366173 kernel: raid6: using avx512x2 recovery algorithm Mar 13 00:36:59.382915 kernel: xor: automatically using best checksumming function avx Mar 13 00:36:59.536916 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 13 00:36:59.549012 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 13 00:36:59.551463 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 13 00:36:59.592068 systemd-udevd[446]: Using default interface naming scheme 'v255'. Mar 13 00:36:59.597330 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 13 00:36:59.599925 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 13 00:36:59.622005 dracut-pre-trigger[452]: rd.md=0: removing MD RAID activation Mar 13 00:36:59.648982 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 13 00:36:59.650548 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 13 00:36:59.729068 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 13 00:36:59.735063 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Mar 13 00:36:59.845890 kernel: cryptd: max_cpu_qlen set to 1000 Mar 13 00:36:59.855880 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Mar 13 00:36:59.865882 kernel: scsi host0: Virtio SCSI HBA Mar 13 00:36:59.870593 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 13 00:36:59.870631 kernel: ACPI: bus type USB registered Mar 13 00:36:59.874238 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 13 00:36:59.875535 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:36:59.878180 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:36:59.882170 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:36:59.886872 kernel: usbcore: registered new interface driver usbfs Mar 13 00:36:59.889657 kernel: usbcore: registered new interface driver hub Mar 13 00:36:59.889679 kernel: usbcore: registered new device driver usb Mar 13 00:36:59.900399 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 13 00:36:59.904906 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Mar 13 00:36:59.916379 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 13 00:36:59.917036 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:36:59.918509 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 13 00:36:59.920229 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:36:59.923875 kernel: AES CTR mode by8 optimization enabled Mar 13 00:36:59.929902 kernel: libata version 3.00 loaded. 
Mar 13 00:36:59.957888 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 13 00:36:59.963449 kernel: sd 0:0:0:0: Power-on or device reset occurred Mar 13 00:36:59.963648 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 13 00:36:59.966351 kernel: sd 0:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Mar 13 00:36:59.970745 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 13 00:36:59.970937 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Mar 13 00:36:59.967534 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:36:59.978663 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 13 00:36:59.978883 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 13 00:36:59.985935 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 13 00:36:59.986119 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 13 00:36:59.986271 kernel: ahci 0000:00:1f.2: version 3.0 Mar 13 00:36:59.986971 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 13 00:36:59.987121 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 13 00:36:59.991867 kernel: hub 1-0:1.0: USB hub found Mar 13 00:36:59.992048 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Mar 13 00:36:59.992882 kernel: hub 1-0:1.0: 4 ports detected Mar 13 00:36:59.994991 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 13 00:36:59.995047 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Mar 13 00:36:59.999831 kernel: hub 2-0:1.0: USB hub found Mar 13 00:37:00.000028 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 13 00:37:00.000147 kernel: hub 2-0:1.0: 4 ports detected Mar 13 00:37:00.000289 kernel: scsi host1: ahci Mar 13 00:37:00.007911 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. 
Mar 13 00:37:00.010017 kernel: GPT:17805311 != 160006143 Mar 13 00:37:00.012782 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 13 00:37:00.012813 kernel: GPT:17805311 != 160006143 Mar 13 00:37:00.013418 kernel: scsi host2: ahci Mar 13 00:37:00.013621 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 13 00:37:00.016888 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 13 00:37:00.019417 kernel: scsi host3: ahci Mar 13 00:37:00.019465 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 13 00:37:00.025777 kernel: scsi host4: ahci Mar 13 00:37:00.026821 kernel: scsi host5: ahci Mar 13 00:37:00.031920 kernel: scsi host6: ahci Mar 13 00:37:00.038678 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 51 lpm-pol 1 Mar 13 00:37:00.038703 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 51 lpm-pol 1 Mar 13 00:37:00.038713 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 51 lpm-pol 1 Mar 13 00:37:00.044869 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 51 lpm-pol 1 Mar 13 00:37:00.044893 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 51 lpm-pol 1 Mar 13 00:37:00.047064 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 51 lpm-pol 1 Mar 13 00:37:00.075358 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Mar 13 00:37:00.082485 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Mar 13 00:37:00.093501 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 13 00:37:00.099099 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Mar 13 00:37:00.099734 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. 
Mar 13 00:37:00.101740 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 13 00:37:00.116216 disk-uuid[649]: Primary Header is updated.
Mar 13 00:37:00.116216 disk-uuid[649]: Secondary Entries is updated.
Mar 13 00:37:00.116216 disk-uuid[649]: Secondary Header is updated.
Mar 13 00:37:00.126877 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:37:00.141880 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:37:00.234969 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 13 00:37:00.365225 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 13 00:37:00.365300 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 13 00:37:00.365314 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 13 00:37:00.367805 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 13 00:37:00.372199 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 13 00:37:00.372242 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 13 00:37:00.372251 kernel: ata1.00: LPM support broken, forcing max_power
Mar 13 00:37:00.375615 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 13 00:37:00.375643 kernel: ata1.00: applying bridge limits
Mar 13 00:37:00.380111 kernel: ata1.00: LPM support broken, forcing max_power
Mar 13 00:37:00.380163 kernel: ata1.00: configured for UDMA/100
Mar 13 00:37:00.382909 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 13 00:37:00.383863 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 13 00:37:00.411122 kernel: usbcore: registered new interface driver usbhid
Mar 13 00:37:00.411164 kernel: usbhid: USB HID core driver
Mar 13 00:37:00.420293 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4
Mar 13 00:37:00.420365 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 13 00:37:00.427888 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 13 00:37:00.428132 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 13 00:37:00.444880 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Mar 13 00:37:00.772316 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 13 00:37:00.773020 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 13 00:37:00.773456 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:37:00.774033 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 13 00:37:00.775280 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 13 00:37:00.791320 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 13 00:37:01.147921 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:37:01.150954 disk-uuid[650]: The operation has completed successfully.
Mar 13 00:37:01.219323 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 13 00:37:01.219422 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 13 00:37:01.232067 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 13 00:37:01.249377 sh[685]: Success
Mar 13 00:37:01.264928 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 13 00:37:01.264988 kernel: device-mapper: uevent: version 1.0.3
Mar 13 00:37:01.268754 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 13 00:37:01.277897 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Mar 13 00:37:01.327335 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 13 00:37:01.328577 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 13 00:37:01.350331 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 13 00:37:01.359874 kernel: BTRFS: device fsid 503642f8-c59c-4168-97a8-9c3603183fa3 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (697)
Mar 13 00:37:01.359898 kernel: BTRFS info (device dm-0): first mount of filesystem 503642f8-c59c-4168-97a8-9c3603183fa3
Mar 13 00:37:01.368376 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:37:01.380001 kernel: BTRFS info (device dm-0 state E): enabling ssd optimizations
Mar 13 00:37:01.380039 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 13 00:37:01.380058 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 13 00:37:01.383766 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 13 00:37:01.384455 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 13 00:37:01.385396 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 13 00:37:01.386944 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 13 00:37:01.390798 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 13 00:37:01.418139 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (724)
Mar 13 00:37:01.418199 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:37:01.421075 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:37:01.433567 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:37:01.433596 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:37:01.433605 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:37:01.439864 kernel: BTRFS info (device sda6): last unmount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:37:01.441110 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 13 00:37:01.442979 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 13 00:37:01.522408 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 13 00:37:01.526972 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 13 00:37:01.533570 ignition[775]: Ignition 2.22.0
Mar 13 00:37:01.533581 ignition[775]: Stage: fetch-offline
Mar 13 00:37:01.533610 ignition[775]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:37:01.533618 ignition[775]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:37:01.533678 ignition[775]: parsed url from cmdline: ""
Mar 13 00:37:01.533681 ignition[775]: no config URL provided
Mar 13 00:37:01.533685 ignition[775]: reading system config file "/usr/lib/ignition/user.ign"
Mar 13 00:37:01.533692 ignition[775]: no config at "/usr/lib/ignition/user.ign"
Mar 13 00:37:01.533696 ignition[775]: failed to fetch config: resource requires networking
Mar 13 00:37:01.533950 ignition[775]: Ignition finished successfully
Mar 13 00:37:01.537233 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 13 00:37:01.558337 systemd-networkd[870]: lo: Link UP
Mar 13 00:37:01.558346 systemd-networkd[870]: lo: Gained carrier
Mar 13 00:37:01.560670 systemd-networkd[870]: Enumeration completed
Mar 13 00:37:01.560735 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 13 00:37:01.561160 systemd[1]: Reached target network.target - Network.
Mar 13 00:37:01.562511 systemd-networkd[870]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:37:01.562517 systemd-networkd[870]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:37:01.562868 systemd-networkd[870]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:37:01.562872 systemd-networkd[870]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:37:01.563280 systemd-networkd[870]: eth0: Link UP
Mar 13 00:37:01.563383 systemd-networkd[870]: eth1: Link UP
Mar 13 00:37:01.563873 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 13 00:37:01.565131 systemd-networkd[870]: eth0: Gained carrier
Mar 13 00:37:01.565140 systemd-networkd[870]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:37:01.570161 systemd-networkd[870]: eth1: Gained carrier
Mar 13 00:37:01.570172 systemd-networkd[870]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:37:01.589341 ignition[875]: Ignition 2.22.0
Mar 13 00:37:01.589351 ignition[875]: Stage: fetch
Mar 13 00:37:01.589435 ignition[875]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:37:01.589443 ignition[875]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:37:01.589498 ignition[875]: parsed url from cmdline: ""
Mar 13 00:37:01.589504 ignition[875]: no config URL provided
Mar 13 00:37:01.589508 ignition[875]: reading system config file "/usr/lib/ignition/user.ign"
Mar 13 00:37:01.589515 ignition[875]: no config at "/usr/lib/ignition/user.ign"
Mar 13 00:37:01.589532 ignition[875]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 13 00:37:01.589641 ignition[875]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 13 00:37:01.610911 systemd-networkd[870]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 13 00:37:01.624895 systemd-networkd[870]: eth0: DHCPv4 address 89.167.87.208/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 13 00:37:01.790142 ignition[875]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 13 00:37:01.797483 ignition[875]: GET result: OK
Mar 13 00:37:01.797586 ignition[875]: parsing config with SHA512: 15aa47e5211fc04d848d2b5d6a85aaa44838655e7312ddc79f2f851a11f580ebeea6057b35f4b8950c8fef6c8b1d37b7bba9b7a16bb033fa2a57d4ddbeca8d14
Mar 13 00:37:01.804068 unknown[875]: fetched base config from "system"
Mar 13 00:37:01.804088 unknown[875]: fetched base config from "system"
Mar 13 00:37:01.804667 ignition[875]: fetch: fetch complete
Mar 13 00:37:01.804106 unknown[875]: fetched user config from "hetzner"
Mar 13 00:37:01.804683 ignition[875]: fetch: fetch passed
Mar 13 00:37:01.804755 ignition[875]: Ignition finished successfully
Mar 13 00:37:01.811094 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 13 00:37:01.814033 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 13 00:37:01.868305 ignition[882]: Ignition 2.22.0
Mar 13 00:37:01.868323 ignition[882]: Stage: kargs
Mar 13 00:37:01.868479 ignition[882]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:37:01.868493 ignition[882]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:37:01.869499 ignition[882]: kargs: kargs passed
Mar 13 00:37:01.869556 ignition[882]: Ignition finished successfully
Mar 13 00:37:01.874188 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 13 00:37:01.877449 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 13 00:37:01.910297 ignition[888]: Ignition 2.22.0
Mar 13 00:37:01.910306 ignition[888]: Stage: disks
Mar 13 00:37:01.910397 ignition[888]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:37:01.910405 ignition[888]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:37:01.911025 ignition[888]: disks: disks passed
Mar 13 00:37:01.911059 ignition[888]: Ignition finished successfully
Mar 13 00:37:01.913940 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 13 00:37:01.915158 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 13 00:37:01.915806 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 13 00:37:01.916199 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 13 00:37:01.917326 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 13 00:37:01.918320 systemd[1]: Reached target basic.target - Basic System.
Mar 13 00:37:01.920177 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 13 00:37:01.963527 systemd-fsck[897]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 13 00:37:01.967750 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 13 00:37:01.971287 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 13 00:37:02.094878 kernel: EXT4-fs (sda9): mounted filesystem 26348f72-0225-4c06-aedc-823e61beebc6 r/w with ordered data mode. Quota mode: none.
Mar 13 00:37:02.094944 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 13 00:37:02.095697 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 13 00:37:02.097206 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 13 00:37:02.098438 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 13 00:37:02.101000 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 13 00:37:02.101908 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 13 00:37:02.102957 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 13 00:37:02.121756 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 13 00:37:02.123996 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 13 00:37:02.133878 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (905)
Mar 13 00:37:02.140757 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:37:02.140787 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:37:02.152409 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:37:02.152439 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:37:02.152448 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:37:02.157883 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 13 00:37:02.167197 coreos-metadata[907]: Mar 13 00:37:02.166 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 13 00:37:02.168182 coreos-metadata[907]: Mar 13 00:37:02.167 INFO Fetch successful
Mar 13 00:37:02.169312 coreos-metadata[907]: Mar 13 00:37:02.169 INFO wrote hostname ci-4459-2-4-n-86976195a3 to /sysroot/etc/hostname
Mar 13 00:37:02.170792 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 13 00:37:02.177504 initrd-setup-root[933]: cut: /sysroot/etc/passwd: No such file or directory
Mar 13 00:37:02.181279 initrd-setup-root[940]: cut: /sysroot/etc/group: No such file or directory
Mar 13 00:37:02.185358 initrd-setup-root[947]: cut: /sysroot/etc/shadow: No such file or directory
Mar 13 00:37:02.188430 initrd-setup-root[954]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 13 00:37:02.264480 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 13 00:37:02.265717 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 13 00:37:02.267115 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 13 00:37:02.281068 kernel: BTRFS info (device sda6): last unmount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:37:02.291424 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 13 00:37:02.304898 ignition[1023]: INFO : Ignition 2.22.0
Mar 13 00:37:02.304898 ignition[1023]: INFO : Stage: mount
Mar 13 00:37:02.307097 ignition[1023]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:37:02.307097 ignition[1023]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:37:02.307097 ignition[1023]: INFO : mount: mount passed
Mar 13 00:37:02.307097 ignition[1023]: INFO : Ignition finished successfully
Mar 13 00:37:02.307496 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 13 00:37:02.308911 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 13 00:37:02.356764 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 13 00:37:02.359058 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 13 00:37:02.386992 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1033)
Mar 13 00:37:02.387042 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:37:02.398601 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:37:02.411376 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:37:02.411413 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:37:02.411441 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:37:02.419022 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 13 00:37:02.456507 ignition[1050]: INFO : Ignition 2.22.0
Mar 13 00:37:02.456507 ignition[1050]: INFO : Stage: files
Mar 13 00:37:02.457478 ignition[1050]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:37:02.457478 ignition[1050]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:37:02.458272 ignition[1050]: DEBUG : files: compiled without relabeling support, skipping
Mar 13 00:37:02.458770 ignition[1050]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 13 00:37:02.458770 ignition[1050]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 13 00:37:02.461777 ignition[1050]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 13 00:37:02.463072 ignition[1050]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 13 00:37:02.463916 ignition[1050]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 13 00:37:02.463183 unknown[1050]: wrote ssh authorized keys file for user: core
Mar 13 00:37:02.465917 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 13 00:37:02.465917 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 13 00:37:02.695793 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 13 00:37:03.006317 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 13 00:37:03.006317 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 13 00:37:03.009495 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 13 00:37:03.009495 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 13 00:37:03.009495 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 13 00:37:03.009495 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 13 00:37:03.009495 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 13 00:37:03.009495 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 13 00:37:03.009495 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 13 00:37:03.015488 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 13 00:37:03.015488 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 13 00:37:03.015488 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 13 00:37:03.015488 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 13 00:37:03.015488 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 13 00:37:03.015488 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-x86-64.raw: attempt #1
Mar 13 00:37:03.402052 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 13 00:37:03.473125 systemd-networkd[870]: eth1: Gained IPv6LL
Mar 13 00:37:03.537055 systemd-networkd[870]: eth0: Gained IPv6LL
Mar 13 00:37:03.734997 ignition[1050]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 13 00:37:03.734997 ignition[1050]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 13 00:37:03.738167 ignition[1050]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 13 00:37:03.741988 ignition[1050]: INFO : files: files passed
Mar 13 00:37:03.741988 ignition[1050]: INFO : Ignition finished successfully
Mar 13 00:37:03.746508 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 13 00:37:03.748063 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 13 00:37:03.764075 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 13 00:37:03.767055 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 13 00:37:03.768049 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 13 00:37:03.787918 initrd-setup-root-after-ignition[1079]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:37:03.787918 initrd-setup-root-after-ignition[1079]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:37:03.791779 initrd-setup-root-after-ignition[1083]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:37:03.794048 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 13 00:37:03.795334 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 13 00:37:03.796952 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 13 00:37:03.886433 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 13 00:37:03.886547 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 13 00:37:03.887178 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 13 00:37:03.888171 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 13 00:37:03.889352 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 13 00:37:03.890941 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 13 00:37:03.917782 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 13 00:37:03.920034 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 13 00:37:03.946661 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 13 00:37:03.947252 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:37:03.948427 systemd[1]: Stopped target timers.target - Timer Units.
Mar 13 00:37:03.949638 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 13 00:37:03.949759 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 13 00:37:03.951972 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 13 00:37:03.952480 systemd[1]: Stopped target basic.target - Basic System.
Mar 13 00:37:03.953803 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 13 00:37:03.955071 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 13 00:37:03.956248 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 13 00:37:03.957517 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 13 00:37:03.958753 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 13 00:37:03.960071 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 13 00:37:03.961342 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 13 00:37:03.962523 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 13 00:37:03.963625 systemd[1]: Stopped target swap.target - Swaps.
Mar 13 00:37:03.964753 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 13 00:37:03.964882 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 13 00:37:03.966556 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 13 00:37:03.967724 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 13 00:37:03.968740 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 13 00:37:03.970833 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 13 00:37:03.971319 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 13 00:37:03.971420 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 13 00:37:03.973071 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 13 00:37:03.973183 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 13 00:37:03.974222 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 13 00:37:03.974320 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 13 00:37:03.975278 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 13 00:37:03.975369 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 13 00:37:03.976967 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 13 00:37:03.978882 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 13 00:37:03.979000 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 13 00:37:03.993070 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 13 00:37:03.994554 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 13 00:37:03.995431 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:37:03.998069 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 13 00:37:03.998249 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 13 00:37:04.006312 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 13 00:37:04.007813 ignition[1103]: INFO : Ignition 2.22.0
Mar 13 00:37:04.007813 ignition[1103]: INFO : Stage: umount
Mar 13 00:37:04.008661 ignition[1103]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:37:04.008661 ignition[1103]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:37:04.008661 ignition[1103]: INFO : umount: umount passed
Mar 13 00:37:04.008661 ignition[1103]: INFO : Ignition finished successfully
Mar 13 00:37:04.010027 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 13 00:37:04.010999 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 13 00:37:04.011083 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 13 00:37:04.013193 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 13 00:37:04.013735 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 13 00:37:04.014473 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 13 00:37:04.014829 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 13 00:37:04.015598 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 13 00:37:04.015950 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 13 00:37:04.016687 systemd[1]: Stopped target network.target - Network.
Mar 13 00:37:04.017393 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 13 00:37:04.017434 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 13 00:37:04.018095 systemd[1]: Stopped target paths.target - Path Units.
Mar 13 00:37:04.018734 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 13 00:37:04.021892 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 13 00:37:04.022236 systemd[1]: Stopped target slices.target - Slice Units.
Mar 13 00:37:04.022551 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 13 00:37:04.022899 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 13 00:37:04.022934 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 13 00:37:04.023265 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 13 00:37:04.023293 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 13 00:37:04.023608 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 13 00:37:04.023647 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 13 00:37:04.025915 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 13 00:37:04.025953 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 13 00:37:04.026409 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 13 00:37:04.026753 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 13 00:37:04.028171 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 13 00:37:04.028692 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 13 00:37:04.028781 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 13 00:37:04.029745 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 13 00:37:04.029821 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 13 00:37:04.032631 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 13 00:37:04.032741 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 13 00:37:04.035600 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 13 00:37:04.035836 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 13 00:37:04.035923 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 13 00:37:04.037503 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 13 00:37:04.039372 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 13 00:37:04.039481 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 13 00:37:04.040948 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 13 00:37:04.041132 systemd[1]: Stopped target network-pre.target - Preparation for Network. Mar 13 00:37:04.041669 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 13 00:37:04.041701 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 13 00:37:04.042974 systemd[1]: Stopping network-cleanup.service - Network Cleanup... 
Mar 13 00:37:04.044885 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 13 00:37:04.044932 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 13 00:37:04.045618 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 13 00:37:04.045658 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 13 00:37:04.047718 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 13 00:37:04.047777 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 13 00:37:04.048511 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 13 00:37:04.049687 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 13 00:37:04.070169 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 13 00:37:04.070733 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 13 00:37:04.071364 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 13 00:37:04.071449 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 13 00:37:04.073666 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 13 00:37:04.073733 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 13 00:37:04.074681 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 13 00:37:04.074712 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 13 00:37:04.075363 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 13 00:37:04.075402 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 13 00:37:04.076520 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 13 00:37:04.076559 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 13 00:37:04.077599 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. 
Mar 13 00:37:04.077640 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 13 00:37:04.079418 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 13 00:37:04.080231 systemd[1]: systemd-network-generator.service: Deactivated successfully. Mar 13 00:37:04.080276 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Mar 13 00:37:04.083661 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 13 00:37:04.083702 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 13 00:37:04.084267 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 13 00:37:04.084305 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:37:04.097451 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 13 00:37:04.097560 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 13 00:37:04.098179 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 13 00:37:04.099999 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 13 00:37:04.110559 systemd[1]: Switching root. Mar 13 00:37:04.141960 systemd-journald[198]: Journal stopped Mar 13 00:37:05.302117 systemd-journald[198]: Received SIGTERM from PID 1 (systemd). 
Mar 13 00:37:05.302189 kernel: SELinux: policy capability network_peer_controls=1 Mar 13 00:37:05.302201 kernel: SELinux: policy capability open_perms=1 Mar 13 00:37:05.302209 kernel: SELinux: policy capability extended_socket_class=1 Mar 13 00:37:05.302224 kernel: SELinux: policy capability always_check_network=0 Mar 13 00:37:05.302233 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 13 00:37:05.302241 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 13 00:37:05.302253 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 13 00:37:05.302266 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 13 00:37:05.302277 kernel: SELinux: policy capability userspace_initial_context=0 Mar 13 00:37:05.302285 kernel: audit: type=1403 audit(1773362224.311:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 13 00:37:05.302294 systemd[1]: Successfully loaded SELinux policy in 62.521ms. Mar 13 00:37:05.302315 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.505ms. Mar 13 00:37:05.302324 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 13 00:37:05.302334 systemd[1]: Detected virtualization kvm. Mar 13 00:37:05.302343 systemd[1]: Detected architecture x86-64. Mar 13 00:37:05.302356 systemd[1]: Detected first boot. Mar 13 00:37:05.302367 systemd[1]: Hostname set to . Mar 13 00:37:05.302375 systemd[1]: Initializing machine ID from VM UUID. Mar 13 00:37:05.302385 zram_generator::config[1147]: No configuration found. 
Mar 13 00:37:05.302395 kernel: Guest personality initialized and is inactive Mar 13 00:37:05.302404 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Mar 13 00:37:05.302412 kernel: Initialized host personality Mar 13 00:37:05.302420 kernel: NET: Registered PF_VSOCK protocol family Mar 13 00:37:05.302429 systemd[1]: Populated /etc with preset unit settings. Mar 13 00:37:05.302438 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 13 00:37:05.302449 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 13 00:37:05.302458 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 13 00:37:05.302467 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 13 00:37:05.302480 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 13 00:37:05.302491 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 13 00:37:05.302500 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 13 00:37:05.302530 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 13 00:37:05.302541 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 13 00:37:05.302550 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 13 00:37:05.302560 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 13 00:37:05.302569 systemd[1]: Created slice user.slice - User and Session Slice. Mar 13 00:37:05.302578 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 13 00:37:05.302587 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 13 00:37:05.302596 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Mar 13 00:37:05.302608 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 13 00:37:05.304551 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 13 00:37:05.304579 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 13 00:37:05.304590 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 13 00:37:05.304600 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 13 00:37:05.304610 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 13 00:37:05.304619 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 13 00:37:05.304628 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 13 00:37:05.304641 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 13 00:37:05.304650 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 13 00:37:05.304659 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 13 00:37:05.304668 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 13 00:37:05.304677 systemd[1]: Reached target slices.target - Slice Units. Mar 13 00:37:05.304685 systemd[1]: Reached target swap.target - Swaps. Mar 13 00:37:05.304694 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 13 00:37:05.304703 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 13 00:37:05.304712 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 13 00:37:05.304723 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 13 00:37:05.304732 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 13 00:37:05.304741 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 13 00:37:05.304750 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 13 00:37:05.304759 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 13 00:37:05.304768 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 13 00:37:05.304777 systemd[1]: Mounting media.mount - External Media Directory... Mar 13 00:37:05.304786 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:05.304795 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 13 00:37:05.304806 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 13 00:37:05.304815 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 13 00:37:05.304824 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 13 00:37:05.304832 systemd[1]: Reached target machines.target - Containers. Mar 13 00:37:05.304841 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 13 00:37:05.304866 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 13 00:37:05.304875 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 13 00:37:05.304884 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 13 00:37:05.304895 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 13 00:37:05.304904 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 13 00:37:05.304913 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 13 00:37:05.304922 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... 
Mar 13 00:37:05.304931 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 13 00:37:05.304939 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 13 00:37:05.304948 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 13 00:37:05.304962 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 13 00:37:05.304974 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 13 00:37:05.304985 systemd[1]: Stopped systemd-fsck-usr.service. Mar 13 00:37:05.304994 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 13 00:37:05.305003 kernel: fuse: init (API version 7.41) Mar 13 00:37:05.305012 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 13 00:37:05.305021 kernel: loop: module loaded Mar 13 00:37:05.305029 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 13 00:37:05.305041 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 13 00:37:05.305052 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 13 00:37:05.305061 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 13 00:37:05.305070 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 13 00:37:05.305082 systemd[1]: verity-setup.service: Deactivated successfully. Mar 13 00:37:05.305091 systemd[1]: Stopped verity-setup.service. Mar 13 00:37:05.305100 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Mar 13 00:37:05.305109 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 13 00:37:05.305118 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 13 00:37:05.305127 systemd[1]: Mounted media.mount - External Media Directory. Mar 13 00:37:05.305135 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 13 00:37:05.305144 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 13 00:37:05.305153 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 13 00:37:05.305184 systemd-journald[1228]: Collecting audit messages is disabled. Mar 13 00:37:05.305201 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 13 00:37:05.305210 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 13 00:37:05.305226 systemd-journald[1228]: Journal started Mar 13 00:37:05.305244 systemd-journald[1228]: Runtime Journal (/run/log/journal/164bf7cb66cb4ba0b60c12d9f39c55be) is 8M, max 76.1M, 68.1M free. Mar 13 00:37:04.946633 systemd[1]: Queued start job for default target multi-user.target. Mar 13 00:37:04.973728 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Mar 13 00:37:04.974379 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 13 00:37:05.310729 systemd[1]: Started systemd-journald.service - Journal Service. Mar 13 00:37:05.310171 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 13 00:37:05.310366 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 13 00:37:05.311152 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 13 00:37:05.311887 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 13 00:37:05.312512 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 13 00:37:05.312893 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Mar 13 00:37:05.314141 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 13 00:37:05.314323 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 13 00:37:05.315042 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 13 00:37:05.315889 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 13 00:37:05.316545 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 13 00:37:05.317396 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 13 00:37:05.318499 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 13 00:37:05.323872 kernel: ACPI: bus type drm_connector registered Mar 13 00:37:05.327034 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 13 00:37:05.327290 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 13 00:37:05.328009 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 13 00:37:05.335027 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 13 00:37:05.337928 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 13 00:37:05.340976 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 13 00:37:05.341386 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 13 00:37:05.341800 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 13 00:37:05.342937 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 13 00:37:05.346943 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 13 00:37:05.349009 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 13 00:37:05.351042 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 13 00:37:05.353819 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 13 00:37:05.354212 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 13 00:37:05.357070 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 13 00:37:05.357960 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 13 00:37:05.361025 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 13 00:37:05.366987 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 13 00:37:05.372484 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 13 00:37:05.374418 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 13 00:37:05.375361 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 13 00:37:05.380304 systemd-journald[1228]: Time spent on flushing to /var/log/journal/164bf7cb66cb4ba0b60c12d9f39c55be is 24.758ms for 1243 entries. Mar 13 00:37:05.380304 systemd-journald[1228]: System Journal (/var/log/journal/164bf7cb66cb4ba0b60c12d9f39c55be) is 8M, max 584.8M, 576.8M free. Mar 13 00:37:05.425919 systemd-journald[1228]: Received client request to flush runtime journal. Mar 13 00:37:05.425949 kernel: loop0: detected capacity change from 0 to 228704 Mar 13 00:37:05.388087 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 13 00:37:05.390089 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 13 00:37:05.395090 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... 
Mar 13 00:37:05.427476 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 13 00:37:05.452689 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 13 00:37:05.455380 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 13 00:37:05.467291 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 13 00:37:05.475809 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 13 00:37:05.490097 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 13 00:37:05.492178 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 13 00:37:05.497887 kernel: loop1: detected capacity change from 0 to 128560 Mar 13 00:37:05.529279 kernel: loop2: detected capacity change from 0 to 8 Mar 13 00:37:05.537169 systemd-tmpfiles[1290]: ACLs are not supported, ignoring. Mar 13 00:37:05.537568 systemd-tmpfiles[1290]: ACLs are not supported, ignoring. Mar 13 00:37:05.548906 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 13 00:37:05.557880 kernel: loop3: detected capacity change from 0 to 110984 Mar 13 00:37:05.596873 kernel: loop4: detected capacity change from 0 to 228704 Mar 13 00:37:05.622869 kernel: loop5: detected capacity change from 0 to 128560 Mar 13 00:37:05.640876 kernel: loop6: detected capacity change from 0 to 8 Mar 13 00:37:05.645869 kernel: loop7: detected capacity change from 0 to 110984 Mar 13 00:37:05.667941 (sd-merge)[1299]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Mar 13 00:37:05.669021 (sd-merge)[1299]: Merged extensions into '/usr'. Mar 13 00:37:05.674722 systemd[1]: Reload requested from client PID 1272 ('systemd-sysext') (unit systemd-sysext.service)... Mar 13 00:37:05.674838 systemd[1]: Reloading... Mar 13 00:37:05.767874 zram_generator::config[1325]: No configuration found. 
Mar 13 00:37:05.850885 ldconfig[1267]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 13 00:37:05.941050 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 13 00:37:05.941558 systemd[1]: Reloading finished in 265 ms. Mar 13 00:37:05.972447 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 13 00:37:05.973307 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 13 00:37:05.977574 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 13 00:37:05.987128 systemd[1]: Starting ensure-sysext.service... Mar 13 00:37:05.988412 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 13 00:37:05.991190 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 13 00:37:06.012511 systemd-tmpfiles[1371]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Mar 13 00:37:06.012885 systemd-tmpfiles[1371]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Mar 13 00:37:06.013271 systemd-tmpfiles[1371]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 13 00:37:06.013631 systemd-tmpfiles[1371]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 13 00:37:06.014677 systemd-tmpfiles[1371]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 13 00:37:06.015059 systemd-tmpfiles[1371]: ACLs are not supported, ignoring. Mar 13 00:37:06.015187 systemd-tmpfiles[1371]: ACLs are not supported, ignoring. Mar 13 00:37:06.019502 systemd-tmpfiles[1371]: Detected autofs mount point /boot during canonicalization of boot. 
Mar 13 00:37:06.019573 systemd-tmpfiles[1371]: Skipping /boot Mar 13 00:37:06.022629 systemd[1]: Reload requested from client PID 1370 ('systemctl') (unit ensure-sysext.service)... Mar 13 00:37:06.022643 systemd[1]: Reloading... Mar 13 00:37:06.028396 systemd-tmpfiles[1371]: Detected autofs mount point /boot during canonicalization of boot. Mar 13 00:37:06.028463 systemd-tmpfiles[1371]: Skipping /boot Mar 13 00:37:06.061440 systemd-udevd[1372]: Using default interface naming scheme 'v255'. Mar 13 00:37:06.100932 zram_generator::config[1404]: No configuration found. Mar 13 00:37:06.304387 systemd[1]: Reloading finished in 281 ms. Mar 13 00:37:06.317887 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Mar 13 00:37:06.320364 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 13 00:37:06.329815 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 13 00:37:06.338738 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 13 00:37:06.339898 kernel: mousedev: PS/2 mouse device common for all mice Mar 13 00:37:06.344300 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 13 00:37:06.347982 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 13 00:37:06.349296 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 13 00:37:06.358029 kernel: ACPI: button: Power Button [PWRF] Mar 13 00:37:06.356103 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 13 00:37:06.362506 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 13 00:37:06.364999 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
Mar 13 00:37:06.367601 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Mar 13 00:37:06.369269 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:06.369419 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 13 00:37:06.372870 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 13 00:37:06.378072 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 13 00:37:06.382527 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 13 00:37:06.384512 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 13 00:37:06.384601 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 13 00:37:06.386025 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 13 00:37:06.386348 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:06.387950 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:06.388091 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 13 00:37:06.388219 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 13 00:37:06.388283 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 13 00:37:06.388333 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:06.392545 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:06.392677 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 13 00:37:06.392790 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 13 00:37:06.392841 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 13 00:37:06.393999 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:06.396378 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:06.396553 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 13 00:37:06.401280 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 13 00:37:06.401988 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 13 00:37:06.402063 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 13 00:37:06.402152 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 13 00:37:06.408749 systemd[1]: Finished ensure-sysext.service. Mar 13 00:37:06.416034 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 13 00:37:06.416759 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 13 00:37:06.421948 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 13 00:37:06.425009 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 13 00:37:06.430189 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 13 00:37:06.433569 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 13 00:37:06.451974 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 13 00:37:06.452185 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 13 00:37:06.452943 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 13 00:37:06.453107 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 13 00:37:06.454527 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 13 00:37:06.458103 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 13 00:37:06.458166 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 13 00:37:06.461616 systemd[1]: modprobe@drm.service: Deactivated successfully. 
Mar 13 00:37:06.461789 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 13 00:37:06.501913 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 13 00:37:06.506779 augenrules[1545]: No rules Mar 13 00:37:06.510413 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 13 00:37:06.511055 systemd[1]: audit-rules.service: Deactivated successfully. Mar 13 00:37:06.511938 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 13 00:37:06.513189 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 13 00:37:06.537929 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Mar 13 00:37:06.538196 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 13 00:37:06.540609 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 13 00:37:06.610445 kernel: EDAC MC: Ver: 3.0.0 Mar 13 00:37:06.610507 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Mar 13 00:37:06.629862 kernel: Console: switching to colour dummy device 80x25 Mar 13 00:37:06.633287 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:37:06.633888 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Mar 13 00:37:06.640862 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 13 00:37:06.640896 kernel: [drm] features: -context_init Mar 13 00:37:06.665877 kernel: [drm] number of scanouts: 1 Mar 13 00:37:06.668880 kernel: [drm] number of cap sets: 0 Mar 13 00:37:06.670411 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. 
Mar 13 00:37:06.672919 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Mar 13 00:37:06.675784 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 13 00:37:06.678876 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Mar 13 00:37:06.685250 kernel: Console: switching to colour frame buffer device 160x50 Mar 13 00:37:06.688525 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 13 00:37:06.688772 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:37:06.690886 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 13 00:37:06.705085 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:37:06.716034 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 13 00:37:06.716244 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:37:06.717945 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:37:06.734440 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 13 00:37:06.773360 systemd-networkd[1488]: lo: Link UP Mar 13 00:37:06.773674 systemd-networkd[1488]: lo: Gained carrier Mar 13 00:37:06.776124 systemd-networkd[1488]: Enumeration completed Mar 13 00:37:06.776272 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 13 00:37:06.776626 systemd-networkd[1488]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 13 00:37:06.776675 systemd-networkd[1488]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 13 00:37:06.777265 systemd-networkd[1488]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Mar 13 00:37:06.777309 systemd-networkd[1488]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 13 00:37:06.777724 systemd-networkd[1488]: eth0: Link UP Mar 13 00:37:06.777929 systemd-networkd[1488]: eth0: Gained carrier Mar 13 00:37:06.777974 systemd-networkd[1488]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 13 00:37:06.781118 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 13 00:37:06.783024 systemd-networkd[1488]: eth1: Link UP Mar 13 00:37:06.783520 systemd-networkd[1488]: eth1: Gained carrier Mar 13 00:37:06.783533 systemd-networkd[1488]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 13 00:37:06.786953 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 13 00:37:06.818929 systemd-networkd[1488]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Mar 13 00:37:06.824789 systemd-resolved[1489]: Positive Trust Anchors: Mar 13 00:37:06.825056 systemd-resolved[1489]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 13 00:37:06.825117 systemd-resolved[1489]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 13 00:37:06.828899 systemd-resolved[1489]: Using system hostname 'ci-4459-2-4-n-86976195a3'. 
Mar 13 00:37:06.831476 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 13 00:37:06.832749 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 13 00:37:06.832958 systemd[1]: Reached target network.target - Network. Mar 13 00:37:06.833018 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 13 00:37:06.837572 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 13 00:37:06.838308 systemd[1]: Reached target time-set.target - System Time Set. Mar 13 00:37:06.841927 systemd-networkd[1488]: eth0: DHCPv4 address 89.167.87.208/32, gateway 172.31.1.1 acquired from 172.31.1.1 Mar 13 00:37:06.858501 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:37:06.858836 systemd[1]: Reached target sysinit.target - System Initialization. Mar 13 00:37:06.859063 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 13 00:37:06.859163 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 13 00:37:06.859289 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Mar 13 00:37:06.859529 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 13 00:37:06.859670 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 13 00:37:06.859745 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 13 00:37:06.859812 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 13 00:37:06.859839 systemd[1]: Reached target paths.target - Path Units. Mar 13 00:37:06.862083 systemd[1]: Reached target timers.target - Timer Units. 
Mar 13 00:37:06.863658 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 13 00:37:06.866243 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 13 00:37:06.869449 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 13 00:37:06.872002 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 13 00:37:06.872469 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 13 00:37:06.880512 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 13 00:37:06.881755 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 13 00:37:06.885055 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 13 00:37:06.886513 systemd[1]: Reached target sockets.target - Socket Units. Mar 13 00:37:06.889824 systemd[1]: Reached target basic.target - Basic System. Mar 13 00:37:06.891710 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 13 00:37:06.891742 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 13 00:37:06.892716 systemd[1]: Starting containerd.service - containerd container runtime... Mar 13 00:37:06.896235 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 13 00:37:06.900945 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 13 00:37:06.906075 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 13 00:37:06.912918 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... 
Mar 13 00:37:06.916172 coreos-metadata[1584]: Mar 13 00:37:06.915 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Mar 13 00:37:06.917087 coreos-metadata[1584]: Mar 13 00:37:06.916 INFO Fetch successful Mar 13 00:37:06.917174 coreos-metadata[1584]: Mar 13 00:37:06.917 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Mar 13 00:37:06.917651 coreos-metadata[1584]: Mar 13 00:37:06.917 INFO Fetch successful Mar 13 00:37:06.919186 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 13 00:37:06.921729 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 13 00:37:06.924952 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Mar 13 00:37:06.926772 jq[1589]: false Mar 13 00:37:06.927986 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 13 00:37:06.931923 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 13 00:37:06.934507 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Mar 13 00:37:06.941902 extend-filesystems[1590]: Found /dev/sda6 Mar 13 00:37:06.942008 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 13 00:37:06.945915 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 13 00:37:06.951378 extend-filesystems[1590]: Found /dev/sda9 Mar 13 00:37:06.962399 extend-filesystems[1590]: Checking size of /dev/sda9 Mar 13 00:37:06.968280 google_oslogin_nss_cache[1591]: oslogin_cache_refresh[1591]: Refreshing passwd entry cache Mar 13 00:37:06.968280 google_oslogin_nss_cache[1591]: oslogin_cache_refresh[1591]: Failure getting users, quitting Mar 13 00:37:06.968280 google_oslogin_nss_cache[1591]: oslogin_cache_refresh[1591]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Mar 13 00:37:06.968280 google_oslogin_nss_cache[1591]: oslogin_cache_refresh[1591]: Refreshing group entry cache Mar 13 00:37:06.968280 google_oslogin_nss_cache[1591]: oslogin_cache_refresh[1591]: Failure getting groups, quitting Mar 13 00:37:06.968280 google_oslogin_nss_cache[1591]: oslogin_cache_refresh[1591]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Mar 13 00:37:06.955043 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 13 00:37:06.952758 oslogin_cache_refresh[1591]: Refreshing passwd entry cache Mar 13 00:37:06.968648 extend-filesystems[1590]: Resized partition /dev/sda9 Mar 13 00:37:06.961575 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 13 00:37:06.958521 oslogin_cache_refresh[1591]: Failure getting users, quitting Mar 13 00:37:06.962006 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 13 00:37:06.958535 oslogin_cache_refresh[1591]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Mar 13 00:37:06.962812 systemd[1]: Starting update-engine.service - Update Engine... Mar 13 00:37:06.958574 oslogin_cache_refresh[1591]: Refreshing group entry cache Mar 13 00:37:06.967303 oslogin_cache_refresh[1591]: Failure getting groups, quitting Mar 13 00:37:06.967312 oslogin_cache_refresh[1591]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Mar 13 00:37:06.983007 extend-filesystems[1608]: resize2fs 1.47.3 (8-Jul-2025) Mar 13 00:37:07.010056 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 19393531 blocks Mar 13 00:37:06.977052 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 13 00:37:06.992950 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. 
Mar 13 00:37:07.009548 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 13 00:37:07.009737 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 13 00:37:07.010699 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Mar 13 00:37:07.011296 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Mar 13 00:37:07.015410 systemd[1]: motdgen.service: Deactivated successfully. Mar 13 00:37:07.819368 systemd-resolved[1489]: Clock change detected. Flushing caches. Mar 13 00:37:07.865994 jq[1610]: true Mar 13 00:37:07.819464 systemd-timesyncd[1504]: Contacted time server 141.95.53.20:123 (1.flatcar.pool.ntp.org). Mar 13 00:37:07.868207 update_engine[1607]: I20260313 00:37:07.865077 1607 main.cc:92] Flatcar Update Engine starting Mar 13 00:37:07.819779 systemd-timesyncd[1504]: Initial clock synchronization to Fri 2026-03-13 00:37:07.819332 UTC. Mar 13 00:37:07.819864 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 13 00:37:07.823782 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 13 00:37:07.823991 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 13 00:37:07.845337 systemd-logind[1602]: New seat seat0. Mar 13 00:37:07.863871 (ntainerd)[1633]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 13 00:37:07.867981 systemd-logind[1602]: Watching system buttons on /dev/input/event3 (Power Button) Mar 13 00:37:07.868004 systemd-logind[1602]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 13 00:37:07.874799 systemd[1]: Started systemd-logind.service - User Login Management. 
Mar 13 00:37:07.890503 jq[1635]: true Mar 13 00:37:07.901791 tar[1623]: linux-amd64/LICENSE Mar 13 00:37:07.901791 tar[1623]: linux-amd64/helm Mar 13 00:37:07.926386 dbus-daemon[1585]: [system] SELinux support is enabled Mar 13 00:37:07.926599 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 13 00:37:07.931942 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 13 00:37:07.931967 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 13 00:37:07.934319 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 13 00:37:07.934338 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 13 00:37:07.946569 dbus-daemon[1585]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 13 00:37:07.958822 update_engine[1607]: I20260313 00:37:07.958699 1607 update_check_scheduler.cc:74] Next update check in 6m23s Mar 13 00:37:07.961257 systemd[1]: Started update-engine.service - Update Engine. Mar 13 00:37:07.990933 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 13 00:37:08.009952 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 13 00:37:08.012085 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 13 00:37:08.070190 bash[1667]: Updated "/home/core/.ssh/authorized_keys" Mar 13 00:37:08.070548 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 13 00:37:08.081086 systemd[1]: Starting sshkeys.service... 
Mar 13 00:37:08.111301 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 13 00:37:08.117435 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 13 00:37:08.126089 sshd_keygen[1622]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 13 00:37:08.140875 coreos-metadata[1672]: Mar 13 00:37:08.140 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Mar 13 00:37:08.142026 coreos-metadata[1672]: Mar 13 00:37:08.141 INFO Fetch successful Mar 13 00:37:08.144359 locksmithd[1649]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 13 00:37:08.150557 unknown[1672]: wrote ssh authorized keys file for user: core Mar 13 00:37:08.165328 containerd[1633]: time="2026-03-13T00:37:08Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 13 00:37:08.186599 kernel: EXT4-fs (sda9): resized filesystem to 19393531 Mar 13 00:37:08.166198 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 13 00:37:08.179904 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 13 00:37:08.187901 containerd[1633]: time="2026-03-13T00:37:08.187859041Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 13 00:37:08.188376 extend-filesystems[1608]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Mar 13 00:37:08.188376 extend-filesystems[1608]: old_desc_blocks = 1, new_desc_blocks = 10 Mar 13 00:37:08.188376 extend-filesystems[1608]: The filesystem on /dev/sda9 is now 19393531 (4k) blocks long. Mar 13 00:37:08.201667 extend-filesystems[1590]: Resized filesystem in /dev/sda9 Mar 13 00:37:08.194042 systemd[1]: extend-filesystems.service: Deactivated successfully. 
Mar 13 00:37:08.205922 update-ssh-keys[1687]: Updated "/home/core/.ssh/authorized_keys" Mar 13 00:37:08.194254 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 13 00:37:08.206208 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 13 00:37:08.206415 containerd[1633]: time="2026-03-13T00:37:08.206390446Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.2µs" Mar 13 00:37:08.206461 containerd[1633]: time="2026-03-13T00:37:08.206450666Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 13 00:37:08.206581 containerd[1633]: time="2026-03-13T00:37:08.206505876Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 13 00:37:08.206721 containerd[1633]: time="2026-03-13T00:37:08.206709806Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 13 00:37:08.206761 containerd[1633]: time="2026-03-13T00:37:08.206750996Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 13 00:37:08.206821 containerd[1633]: time="2026-03-13T00:37:08.206813017Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 13 00:37:08.206948 containerd[1633]: time="2026-03-13T00:37:08.206914677Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 13 00:37:08.206948 containerd[1633]: time="2026-03-13T00:37:08.206928347Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 13 00:37:08.207256 containerd[1633]: time="2026-03-13T00:37:08.207232877Z" level=info msg="skip loading plugin" error="path 
/var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 13 00:37:08.207376 containerd[1633]: time="2026-03-13T00:37:08.207245757Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 13 00:37:08.207376 containerd[1633]: time="2026-03-13T00:37:08.207336017Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 13 00:37:08.207376 containerd[1633]: time="2026-03-13T00:37:08.207344277Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 13 00:37:08.207952 containerd[1633]: time="2026-03-13T00:37:08.207743947Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 13 00:37:08.208006 containerd[1633]: time="2026-03-13T00:37:08.207995108Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 13 00:37:08.208051 containerd[1633]: time="2026-03-13T00:37:08.208042838Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 13 00:37:08.208088 containerd[1633]: time="2026-03-13T00:37:08.208080058Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 13 00:37:08.208122 containerd[1633]: time="2026-03-13T00:37:08.208115498Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 13 00:37:08.208259 containerd[1633]: time="2026-03-13T00:37:08.208251018Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt 
type=io.containerd.metadata.v1 Mar 13 00:37:08.208325 containerd[1633]: time="2026-03-13T00:37:08.208317398Z" level=info msg="metadata content store policy set" policy=shared Mar 13 00:37:08.210049 systemd[1]: issuegen.service: Deactivated successfully. Mar 13 00:37:08.210256 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 13 00:37:08.212090 containerd[1633]: time="2026-03-13T00:37:08.212024831Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 13 00:37:08.212090 containerd[1633]: time="2026-03-13T00:37:08.212060691Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212075871Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212329371Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212339381Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212347061Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212358891Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212367051Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212377171Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 13 00:37:08.212488 
containerd[1633]: time="2026-03-13T00:37:08.212399441Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212405561Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 13 00:37:08.212488 containerd[1633]: time="2026-03-13T00:37:08.212413721Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.212699241Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.212977162Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.212988072Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.212994822Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213001562Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213008442Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213016852Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213024002Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213031452Z" level=info msg="loading 
plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213058212Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213065022Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213096822Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213106042Z" level=info msg="Start snapshots syncer" Mar 13 00:37:08.213211 containerd[1633]: time="2026-03-13T00:37:08.213159772Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 13 00:37:08.213141 systemd[1]: Finished sshkeys.service. Mar 13 00:37:08.213809 containerd[1633]: time="2026-03-13T00:37:08.213784342Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 13 00:37:08.213988 containerd[1633]: time="2026-03-13T00:37:08.213864422Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 13 00:37:08.214275 containerd[1633]: time="2026-03-13T00:37:08.214026323Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 13 00:37:08.214326 containerd[1633]: time="2026-03-13T00:37:08.214315583Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 13 00:37:08.214421 containerd[1633]: time="2026-03-13T00:37:08.214411493Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 13 00:37:08.214702 containerd[1633]: time="2026-03-13T00:37:08.214449813Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 13 00:37:08.214702 containerd[1633]: time="2026-03-13T00:37:08.214630373Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 13 00:37:08.214702 containerd[1633]: time="2026-03-13T00:37:08.214644873Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 13 00:37:08.214702 containerd[1633]: time="2026-03-13T00:37:08.214652293Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 13 00:37:08.214702 containerd[1633]: time="2026-03-13T00:37:08.214659233Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 13 00:37:08.214702 containerd[1633]: time="2026-03-13T00:37:08.214677803Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 13 00:37:08.214702 containerd[1633]: time="2026-03-13T00:37:08.214684843Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 13 00:37:08.214897 containerd[1633]: time="2026-03-13T00:37:08.214825553Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 13 00:37:08.214897 containerd[1633]: time="2026-03-13T00:37:08.214862063Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 13 00:37:08.214897 containerd[1633]: time="2026-03-13T00:37:08.214870683Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 13 00:37:08.214897 containerd[1633]: time="2026-03-13T00:37:08.214877053Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 13 00:37:08.215178 containerd[1633]: time="2026-03-13T00:37:08.214883523Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 13 00:37:08.215178 containerd[1633]: time="2026-03-13T00:37:08.215083263Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 13 00:37:08.215178 containerd[1633]: time="2026-03-13T00:37:08.215106123Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 13 00:37:08.215178 containerd[1633]: time="2026-03-13T00:37:08.215119083Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 13 00:37:08.215178 containerd[1633]: time="2026-03-13T00:37:08.215131513Z" level=info msg="runtime interface created" Mar 13 00:37:08.215178 containerd[1633]: time="2026-03-13T00:37:08.215135753Z" level=info msg="created NRI interface" Mar 13 00:37:08.215178 containerd[1633]: time="2026-03-13T00:37:08.215141103Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 13 00:37:08.215178 containerd[1633]: time="2026-03-13T00:37:08.215148823Z" level=info msg="Connect containerd service" Mar 13 00:37:08.215359 containerd[1633]: time="2026-03-13T00:37:08.215304534Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 13 00:37:08.216959 
containerd[1633]: time="2026-03-13T00:37:08.216617245Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 13 00:37:08.224712 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 13 00:37:08.253289 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 13 00:37:08.260700 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 13 00:37:08.264675 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 13 00:37:08.265754 systemd[1]: Reached target getty.target - Login Prompts. Mar 13 00:37:08.289786 containerd[1633]: time="2026-03-13T00:37:08.289757416Z" level=info msg="Start subscribing containerd event" Mar 13 00:37:08.289919 containerd[1633]: time="2026-03-13T00:37:08.289874966Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 13 00:37:08.289963 containerd[1633]: time="2026-03-13T00:37:08.289948076Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 13 00:37:08.290057 containerd[1633]: time="2026-03-13T00:37:08.289900336Z" level=info msg="Start recovering state" Mar 13 00:37:08.290482 containerd[1633]: time="2026-03-13T00:37:08.290453836Z" level=info msg="Start event monitor" Mar 13 00:37:08.290526 containerd[1633]: time="2026-03-13T00:37:08.290518926Z" level=info msg="Start cni network conf syncer for default" Mar 13 00:37:08.290614 containerd[1633]: time="2026-03-13T00:37:08.290602176Z" level=info msg="Start streaming server" Mar 13 00:37:08.290646 containerd[1633]: time="2026-03-13T00:37:08.290639836Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 13 00:37:08.290680 containerd[1633]: time="2026-03-13T00:37:08.290673436Z" level=info msg="runtime interface starting up..." 
Mar 13 00:37:08.290752 containerd[1633]: time="2026-03-13T00:37:08.290745396Z" level=info msg="starting plugins..." Mar 13 00:37:08.290787 containerd[1633]: time="2026-03-13T00:37:08.290780616Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 13 00:37:08.291187 containerd[1633]: time="2026-03-13T00:37:08.291146317Z" level=info msg="containerd successfully booted in 0.128074s" Mar 13 00:37:08.291287 systemd[1]: Started containerd.service - containerd container runtime. Mar 13 00:37:08.382899 tar[1623]: linux-amd64/README.md Mar 13 00:37:08.394157 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 13 00:37:08.689727 systemd-networkd[1488]: eth1: Gained IPv6LL Mar 13 00:37:08.696908 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 13 00:37:08.699096 systemd[1]: Reached target network-online.target - Network is Online. Mar 13 00:37:08.706383 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:37:08.711798 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 13 00:37:08.766883 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 13 00:37:08.831981 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 13 00:37:08.835976 systemd[1]: Started sshd@0-89.167.87.208:22-4.153.228.146:45700.service - OpenSSH per-connection server daemon (4.153.228.146:45700). Mar 13 00:37:09.201969 systemd-networkd[1488]: eth0: Gained IPv6LL Mar 13 00:37:09.484286 sshd[1730]: Accepted publickey for core from 4.153.228.146 port 45700 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:37:09.489673 sshd-session[1730]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:37:09.509796 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 13 00:37:09.514634 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Mar 13 00:37:09.519875 systemd-logind[1602]: New session 1 of user core. Mar 13 00:37:09.537743 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 13 00:37:09.544193 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 13 00:37:09.557412 (systemd)[1735]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 13 00:37:09.561223 systemd-logind[1602]: New session c1 of user core. Mar 13 00:37:09.669578 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:37:09.672270 systemd[1735]: Queued start job for default target default.target. Mar 13 00:37:09.672723 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 13 00:37:09.673365 systemd[1735]: Created slice app.slice - User Application Slice. Mar 13 00:37:09.673385 systemd[1735]: Reached target paths.target - Paths. Mar 13 00:37:09.673425 systemd[1735]: Reached target timers.target - Timers. Mar 13 00:37:09.674778 systemd[1735]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 13 00:37:09.677718 (kubelet)[1746]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:37:09.684596 systemd[1735]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 13 00:37:09.684711 systemd[1735]: Reached target sockets.target - Sockets. Mar 13 00:37:09.684788 systemd[1735]: Reached target basic.target - Basic System. Mar 13 00:37:09.684870 systemd[1735]: Reached target default.target - Main User Target. Mar 13 00:37:09.684936 systemd[1735]: Startup finished in 117ms. Mar 13 00:37:09.685085 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 13 00:37:09.691585 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 13 00:37:09.693573 systemd[1]: Startup finished in 2.951s (kernel) + 5.637s (initrd) + 4.643s (userspace) = 13.233s. 
Mar 13 00:37:10.067641 systemd[1]: Started sshd@1-89.167.87.208:22-4.153.228.146:36930.service - OpenSSH per-connection server daemon (4.153.228.146:36930). Mar 13 00:37:10.255049 kubelet[1746]: E0313 00:37:10.254973 1746 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:37:10.260310 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:37:10.260498 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:37:10.260996 systemd[1]: kubelet.service: Consumed 981ms CPU time, 269.3M memory peak. Mar 13 00:37:10.706360 sshd[1760]: Accepted publickey for core from 4.153.228.146 port 36930 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:37:10.708994 sshd-session[1760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:37:10.719848 systemd-logind[1602]: New session 2 of user core. Mar 13 00:37:10.742753 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 13 00:37:11.078036 sshd[1764]: Connection closed by 4.153.228.146 port 36930 Mar 13 00:37:11.079784 sshd-session[1760]: pam_unix(sshd:session): session closed for user core Mar 13 00:37:11.087240 systemd-logind[1602]: Session 2 logged out. Waiting for processes to exit. Mar 13 00:37:11.089042 systemd[1]: sshd@1-89.167.87.208:22-4.153.228.146:36930.service: Deactivated successfully. Mar 13 00:37:11.093300 systemd[1]: session-2.scope: Deactivated successfully. Mar 13 00:37:11.096799 systemd-logind[1602]: Removed session 2. Mar 13 00:37:11.210736 systemd[1]: Started sshd@2-89.167.87.208:22-4.153.228.146:36942.service - OpenSSH per-connection server daemon (4.153.228.146:36942). 
Mar 13 00:37:11.876702 sshd[1770]: Accepted publickey for core from 4.153.228.146 port 36942 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:37:11.879198 sshd-session[1770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:37:11.887265 systemd-logind[1602]: New session 3 of user core. Mar 13 00:37:11.891686 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 13 00:37:12.233312 sshd[1773]: Connection closed by 4.153.228.146 port 36942 Mar 13 00:37:12.234196 sshd-session[1770]: pam_unix(sshd:session): session closed for user core Mar 13 00:37:12.240918 systemd-logind[1602]: Session 3 logged out. Waiting for processes to exit. Mar 13 00:37:12.242360 systemd[1]: sshd@2-89.167.87.208:22-4.153.228.146:36942.service: Deactivated successfully. Mar 13 00:37:12.246196 systemd[1]: session-3.scope: Deactivated successfully. Mar 13 00:37:12.248683 systemd-logind[1602]: Removed session 3. Mar 13 00:37:12.365226 systemd[1]: Started sshd@3-89.167.87.208:22-4.153.228.146:36944.service - OpenSSH per-connection server daemon (4.153.228.146:36944). Mar 13 00:37:13.034580 sshd[1779]: Accepted publickey for core from 4.153.228.146 port 36944 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:37:13.036295 sshd-session[1779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:37:13.043656 systemd-logind[1602]: New session 4 of user core. Mar 13 00:37:13.050612 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 13 00:37:13.398760 sshd[1782]: Connection closed by 4.153.228.146 port 36944 Mar 13 00:37:13.400805 sshd-session[1779]: pam_unix(sshd:session): session closed for user core Mar 13 00:37:13.408193 systemd-logind[1602]: Session 4 logged out. Waiting for processes to exit. Mar 13 00:37:13.408754 systemd[1]: sshd@3-89.167.87.208:22-4.153.228.146:36944.service: Deactivated successfully. 
Mar 13 00:37:13.412789 systemd[1]: session-4.scope: Deactivated successfully. Mar 13 00:37:13.416402 systemd-logind[1602]: Removed session 4. Mar 13 00:37:13.532099 systemd[1]: Started sshd@4-89.167.87.208:22-4.153.228.146:36950.service - OpenSSH per-connection server daemon (4.153.228.146:36950). Mar 13 00:37:14.205763 sshd[1788]: Accepted publickey for core from 4.153.228.146 port 36950 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:37:14.208579 sshd-session[1788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:37:14.219033 systemd-logind[1602]: New session 5 of user core. Mar 13 00:37:14.229716 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 13 00:37:14.467727 sudo[1792]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 13 00:37:14.468420 sudo[1792]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:37:14.493729 sudo[1792]: pam_unix(sudo:session): session closed for user root Mar 13 00:37:14.614720 sshd[1791]: Connection closed by 4.153.228.146 port 36950 Mar 13 00:37:14.616860 sshd-session[1788]: pam_unix(sshd:session): session closed for user core Mar 13 00:37:14.623546 systemd[1]: sshd@4-89.167.87.208:22-4.153.228.146:36950.service: Deactivated successfully. Mar 13 00:37:14.627415 systemd[1]: session-5.scope: Deactivated successfully. Mar 13 00:37:14.630657 systemd-logind[1602]: Session 5 logged out. Waiting for processes to exit. Mar 13 00:37:14.633283 systemd-logind[1602]: Removed session 5. Mar 13 00:37:14.755003 systemd[1]: Started sshd@5-89.167.87.208:22-4.153.228.146:36962.service - OpenSSH per-connection server daemon (4.153.228.146:36962). 
Mar 13 00:37:15.424560 sshd[1798]: Accepted publickey for core from 4.153.228.146 port 36962 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:37:15.425954 sshd-session[1798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:37:15.432162 systemd-logind[1602]: New session 6 of user core. Mar 13 00:37:15.437807 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 13 00:37:15.672630 sudo[1803]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 13 00:37:15.673252 sudo[1803]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:37:15.681803 sudo[1803]: pam_unix(sudo:session): session closed for user root Mar 13 00:37:15.692666 sudo[1802]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 13 00:37:15.693274 sudo[1802]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:37:15.711854 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 13 00:37:15.779405 augenrules[1825]: No rules Mar 13 00:37:15.782051 systemd[1]: audit-rules.service: Deactivated successfully. Mar 13 00:37:15.782418 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 13 00:37:15.784383 sudo[1802]: pam_unix(sudo:session): session closed for user root Mar 13 00:37:15.904119 sshd[1801]: Connection closed by 4.153.228.146 port 36962 Mar 13 00:37:15.905767 sshd-session[1798]: pam_unix(sshd:session): session closed for user core Mar 13 00:37:15.911327 systemd[1]: sshd@5-89.167.87.208:22-4.153.228.146:36962.service: Deactivated successfully. Mar 13 00:37:15.915016 systemd[1]: session-6.scope: Deactivated successfully. Mar 13 00:37:15.917209 systemd-logind[1602]: Session 6 logged out. Waiting for processes to exit. Mar 13 00:37:15.920082 systemd-logind[1602]: Removed session 6. 
Mar 13 00:37:16.037188 systemd[1]: Started sshd@6-89.167.87.208:22-4.153.228.146:36976.service - OpenSSH per-connection server daemon (4.153.228.146:36976). Mar 13 00:37:16.697244 sshd[1834]: Accepted publickey for core from 4.153.228.146 port 36976 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:37:16.699897 sshd-session[1834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:37:16.708600 systemd-logind[1602]: New session 7 of user core. Mar 13 00:37:16.720732 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 13 00:37:16.948681 sudo[1838]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 13 00:37:16.949303 sudo[1838]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:37:17.325012 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 13 00:37:17.358227 (dockerd)[1856]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 13 00:37:17.545814 dockerd[1856]: time="2026-03-13T00:37:17.545749877Z" level=info msg="Starting up" Mar 13 00:37:17.546808 dockerd[1856]: time="2026-03-13T00:37:17.546736817Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 13 00:37:17.562688 dockerd[1856]: time="2026-03-13T00:37:17.562647811Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 13 00:37:17.587125 systemd[1]: var-lib-docker-metacopy\x2dcheck1676853283-merged.mount: Deactivated successfully. Mar 13 00:37:17.605234 dockerd[1856]: time="2026-03-13T00:37:17.605178266Z" level=info msg="Loading containers: start." 
Mar 13 00:37:17.616547 kernel: Initializing XFRM netlink socket Mar 13 00:37:17.891557 systemd-networkd[1488]: docker0: Link UP Mar 13 00:37:17.896595 dockerd[1856]: time="2026-03-13T00:37:17.896554889Z" level=info msg="Loading containers: done." Mar 13 00:37:17.907393 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2138661877-merged.mount: Deactivated successfully. Mar 13 00:37:17.914963 dockerd[1856]: time="2026-03-13T00:37:17.914929924Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 13 00:37:17.915061 dockerd[1856]: time="2026-03-13T00:37:17.914989294Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 13 00:37:17.915061 dockerd[1856]: time="2026-03-13T00:37:17.915047924Z" level=info msg="Initializing buildkit" Mar 13 00:37:17.934403 dockerd[1856]: time="2026-03-13T00:37:17.934377750Z" level=info msg="Completed buildkit initialization" Mar 13 00:37:17.938854 dockerd[1856]: time="2026-03-13T00:37:17.938531674Z" level=info msg="Daemon has completed initialization" Mar 13 00:37:17.938854 dockerd[1856]: time="2026-03-13T00:37:17.938564324Z" level=info msg="API listen on /run/docker.sock" Mar 13 00:37:17.938653 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 13 00:37:18.501521 containerd[1633]: time="2026-03-13T00:37:18.501445173Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\"" Mar 13 00:37:19.025032 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2341352319.mount: Deactivated successfully. 
Mar 13 00:37:20.180370 containerd[1633]: time="2026-03-13T00:37:20.180316251Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:20.183755 containerd[1633]: time="2026-03-13T00:37:20.183578944Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=30116286" Mar 13 00:37:20.184794 containerd[1633]: time="2026-03-13T00:37:20.184773305Z" level=info msg="ImageCreate event name:\"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:20.187031 containerd[1633]: time="2026-03-13T00:37:20.187014147Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:20.187673 containerd[1633]: time="2026-03-13T00:37:20.187640637Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"30112785\" in 1.686097424s" Mar 13 00:37:20.187717 containerd[1633]: time="2026-03-13T00:37:20.187678977Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\"" Mar 13 00:37:20.188289 containerd[1633]: time="2026-03-13T00:37:20.188270918Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\"" Mar 13 00:37:20.475723 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Mar 13 00:37:20.479569 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:37:20.638100 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:37:20.646741 (kubelet)[2134]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:37:20.680143 kubelet[2134]: E0313 00:37:20.680071 2134 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:37:20.687297 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:37:20.687457 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:37:20.687929 systemd[1]: kubelet.service: Consumed 173ms CPU time, 108.8M memory peak. 
Mar 13 00:37:21.433989 containerd[1633]: time="2026-03-13T00:37:21.433927556Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:21.435812 containerd[1633]: time="2026-03-13T00:37:21.435603917Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=26021832" Mar 13 00:37:21.440370 containerd[1633]: time="2026-03-13T00:37:21.440342711Z" level=info msg="ImageCreate event name:\"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:21.443219 containerd[1633]: time="2026-03-13T00:37:21.443187053Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:21.443938 containerd[1633]: time="2026-03-13T00:37:21.443920094Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"27678758\" in 1.255627016s" Mar 13 00:37:21.444004 containerd[1633]: time="2026-03-13T00:37:21.443994114Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\"" Mar 13 00:37:21.444777 containerd[1633]: time="2026-03-13T00:37:21.444751645Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\"" Mar 13 00:37:22.491438 containerd[1633]: time="2026-03-13T00:37:22.491393787Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:22.492395 containerd[1633]: time="2026-03-13T00:37:22.492370857Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=20162768" Mar 13 00:37:22.493533 containerd[1633]: time="2026-03-13T00:37:22.493503788Z" level=info msg="ImageCreate event name:\"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:22.495228 containerd[1633]: time="2026-03-13T00:37:22.495198540Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:22.495971 containerd[1633]: time="2026-03-13T00:37:22.495691000Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"21819712\" in 1.050850405s" Mar 13 00:37:22.495971 containerd[1633]: time="2026-03-13T00:37:22.495713150Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\"" Mar 13 00:37:22.496052 containerd[1633]: time="2026-03-13T00:37:22.496035300Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\"" Mar 13 00:37:23.477842 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2731098964.mount: Deactivated successfully. 
Mar 13 00:37:23.816301 containerd[1633]: time="2026-03-13T00:37:23.816160780Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:23.817408 containerd[1633]: time="2026-03-13T00:37:23.817269831Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=31828675" Mar 13 00:37:23.818362 containerd[1633]: time="2026-03-13T00:37:23.818334862Z" level=info msg="ImageCreate event name:\"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:23.819967 containerd[1633]: time="2026-03-13T00:37:23.819941123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:23.820294 containerd[1633]: time="2026-03-13T00:37:23.820276184Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"31827666\" in 1.324221174s" Mar 13 00:37:23.820346 containerd[1633]: time="2026-03-13T00:37:23.820336754Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\"" Mar 13 00:37:23.820945 containerd[1633]: time="2026-03-13T00:37:23.820905924Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Mar 13 00:37:24.338542 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1984371929.mount: Deactivated successfully. 
Mar 13 00:37:25.108701 containerd[1633]: time="2026-03-13T00:37:25.108654747Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:25.109650 containerd[1633]: time="2026-03-13T00:37:25.109377508Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942332" Mar 13 00:37:25.110328 containerd[1633]: time="2026-03-13T00:37:25.110307038Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:25.112098 containerd[1633]: time="2026-03-13T00:37:25.112072760Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:25.113077 containerd[1633]: time="2026-03-13T00:37:25.113051891Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.292124057s" Mar 13 00:37:25.113166 containerd[1633]: time="2026-03-13T00:37:25.113151281Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Mar 13 00:37:25.113713 containerd[1633]: time="2026-03-13T00:37:25.113675631Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 13 00:37:25.562334 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount20825804.mount: Deactivated successfully. 
Mar 13 00:37:25.569969 containerd[1633]: time="2026-03-13T00:37:25.569747421Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:37:25.571057 containerd[1633]: time="2026-03-13T00:37:25.570981262Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Mar 13 00:37:25.572673 containerd[1633]: time="2026-03-13T00:37:25.572551893Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:37:25.575859 containerd[1633]: time="2026-03-13T00:37:25.575791336Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:37:25.577624 containerd[1633]: time="2026-03-13T00:37:25.576749667Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 463.010736ms" Mar 13 00:37:25.577624 containerd[1633]: time="2026-03-13T00:37:25.576827137Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Mar 13 00:37:25.577994 containerd[1633]: time="2026-03-13T00:37:25.577951678Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\"" Mar 13 00:37:26.089295 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount697390717.mount: Deactivated 
successfully. Mar 13 00:37:26.803677 containerd[1633]: time="2026-03-13T00:37:26.803618289Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:26.804685 containerd[1633]: time="2026-03-13T00:37:26.804487550Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=23718940" Mar 13 00:37:26.805439 containerd[1633]: time="2026-03-13T00:37:26.805418930Z" level=info msg="ImageCreate event name:\"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:26.808305 containerd[1633]: time="2026-03-13T00:37:26.808280643Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:26.808964 containerd[1633]: time="2026-03-13T00:37:26.808945633Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"23716032\" in 1.230953785s" Mar 13 00:37:26.809030 containerd[1633]: time="2026-03-13T00:37:26.809019713Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\"" Mar 13 00:37:28.994001 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:37:28.994156 systemd[1]: kubelet.service: Consumed 173ms CPU time, 108.8M memory peak. Mar 13 00:37:28.996844 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 13 00:37:29.021394 systemd[1]: Reload requested from client PID 2306 ('systemctl') (unit session-7.scope)... Mar 13 00:37:29.021409 systemd[1]: Reloading... Mar 13 00:37:29.137507 zram_generator::config[2362]: No configuration found. Mar 13 00:37:29.297389 systemd[1]: Reloading finished in 275 ms. Mar 13 00:37:29.349944 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 13 00:37:29.350029 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 13 00:37:29.350252 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:37:29.350289 systemd[1]: kubelet.service: Consumed 118ms CPU time, 98.3M memory peak. Mar 13 00:37:29.353629 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:37:29.508185 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:37:29.514798 (kubelet)[2404]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 13 00:37:29.543127 kubelet[2404]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 00:37:29.543127 kubelet[2404]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 13 00:37:29.543127 kubelet[2404]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 00:37:29.543127 kubelet[2404]: I0313 00:37:29.543059 2404 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 00:37:30.236734 kubelet[2404]: I0313 00:37:30.236680 2404 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 13 00:37:30.236734 kubelet[2404]: I0313 00:37:30.236701 2404 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 00:37:30.237046 kubelet[2404]: I0313 00:37:30.236931 2404 server.go:956] "Client rotation is on, will bootstrap in background" Mar 13 00:37:30.261759 kubelet[2404]: E0313 00:37:30.261729 2404 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://89.167.87.208:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 89.167.87.208:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 13 00:37:30.263555 kubelet[2404]: I0313 00:37:30.263267 2404 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 13 00:37:30.268116 kubelet[2404]: I0313 00:37:30.268098 2404 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 00:37:30.271045 kubelet[2404]: I0313 00:37:30.271028 2404 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 13 00:37:30.271221 kubelet[2404]: I0313 00:37:30.271194 2404 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 00:37:30.271326 kubelet[2404]: I0313 00:37:30.271214 2404 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-86976195a3","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 00:37:30.271326 kubelet[2404]: I0313 00:37:30.271322 2404 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 
00:37:30.271326 kubelet[2404]: I0313 00:37:30.271328 2404 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 00:37:30.271463 kubelet[2404]: I0313 00:37:30.271422 2404 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:37:30.275815 kubelet[2404]: I0313 00:37:30.275751 2404 kubelet.go:480] "Attempting to sync node with API server" Mar 13 00:37:30.275815 kubelet[2404]: I0313 00:37:30.275776 2404 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 00:37:30.275815 kubelet[2404]: I0313 00:37:30.275797 2404 kubelet.go:386] "Adding apiserver pod source" Mar 13 00:37:30.277424 kubelet[2404]: I0313 00:37:30.277148 2404 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 00:37:30.278179 kubelet[2404]: E0313 00:37:30.278158 2404 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://89.167.87.208:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-86976195a3&limit=500&resourceVersion=0\": dial tcp 89.167.87.208:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 13 00:37:30.279301 kubelet[2404]: I0313 00:37:30.279286 2404 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 13 00:37:30.279632 kubelet[2404]: I0313 00:37:30.279620 2404 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 13 00:37:30.280637 kubelet[2404]: W0313 00:37:30.280621 2404 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 13 00:37:30.285225 kubelet[2404]: I0313 00:37:30.284830 2404 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 13 00:37:30.285225 kubelet[2404]: I0313 00:37:30.284858 2404 server.go:1289] "Started kubelet" Mar 13 00:37:30.285225 kubelet[2404]: E0313 00:37:30.284957 2404 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://89.167.87.208:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 89.167.87.208:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 13 00:37:30.286521 kubelet[2404]: I0313 00:37:30.285802 2404 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 00:37:30.287503 kubelet[2404]: I0313 00:37:30.286780 2404 server.go:317] "Adding debug handlers to kubelet server" Mar 13 00:37:30.287503 kubelet[2404]: I0313 00:37:30.286857 2404 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 00:37:30.287503 kubelet[2404]: I0313 00:37:30.287096 2404 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 00:37:30.293489 kubelet[2404]: E0313 00:37:30.291072 2404 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://89.167.87.208:6443/api/v1/namespaces/default/events\": dial tcp 89.167.87.208:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-n-86976195a3.189c3f922a50bc55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-n-86976195a3,UID:ci-4459-2-4-n-86976195a3,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-86976195a3,},FirstTimestamp:2026-03-13 00:37:30.284842069 +0000 UTC m=+0.765893109,LastTimestamp:2026-03-13 00:37:30.284842069 
+0000 UTC m=+0.765893109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-86976195a3,}" Mar 13 00:37:30.293489 kubelet[2404]: I0313 00:37:30.292554 2404 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 00:37:30.293489 kubelet[2404]: I0313 00:37:30.292974 2404 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 13 00:37:30.296063 kubelet[2404]: I0313 00:37:30.296051 2404 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 13 00:37:30.296272 kubelet[2404]: E0313 00:37:30.296260 2404 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-86976195a3\" not found" Mar 13 00:37:30.296395 kubelet[2404]: I0313 00:37:30.296387 2404 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 13 00:37:30.296545 kubelet[2404]: I0313 00:37:30.296535 2404 reconciler.go:26] "Reconciler: start to sync state" Mar 13 00:37:30.297132 kubelet[2404]: E0313 00:37:30.297117 2404 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://89.167.87.208:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 89.167.87.208:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 13 00:37:30.297244 kubelet[2404]: E0313 00:37:30.297224 2404 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.87.208:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-86976195a3?timeout=10s\": dial tcp 89.167.87.208:6443: connect: connection refused" interval="200ms" Mar 13 00:37:30.299030 kubelet[2404]: I0313 00:37:30.299001 2404 factory.go:223] Registration of the systemd container factory successfully Mar 13 
00:37:30.299210 kubelet[2404]: I0313 00:37:30.299190 2404 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 13 00:37:30.300618 kubelet[2404]: I0313 00:37:30.300249 2404 factory.go:223] Registration of the containerd container factory successfully Mar 13 00:37:30.305384 kubelet[2404]: E0313 00:37:30.305363 2404 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 13 00:37:30.316405 kubelet[2404]: I0313 00:37:30.316365 2404 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 13 00:37:30.317389 kubelet[2404]: I0313 00:37:30.317365 2404 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 13 00:37:30.317389 kubelet[2404]: I0313 00:37:30.317391 2404 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 13 00:37:30.317469 kubelet[2404]: I0313 00:37:30.317408 2404 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 13 00:37:30.317469 kubelet[2404]: I0313 00:37:30.317415 2404 kubelet.go:2436] "Starting kubelet main sync loop" Mar 13 00:37:30.317469 kubelet[2404]: E0313 00:37:30.317445 2404 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 00:37:30.323844 kubelet[2404]: E0313 00:37:30.323681 2404 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://89.167.87.208:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 89.167.87.208:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 13 00:37:30.328405 kubelet[2404]: I0313 00:37:30.328394 2404 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 13 00:37:30.328524 kubelet[2404]: I0313 00:37:30.328512 2404 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 13 00:37:30.328614 kubelet[2404]: I0313 00:37:30.328606 2404 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:37:30.330687 kubelet[2404]: I0313 00:37:30.330534 2404 policy_none.go:49] "None policy: Start" Mar 13 00:37:30.330687 kubelet[2404]: I0313 00:37:30.330547 2404 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 13 00:37:30.330687 kubelet[2404]: I0313 00:37:30.330557 2404 state_mem.go:35] "Initializing new in-memory state store" Mar 13 00:37:30.336160 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 13 00:37:30.356964 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 13 00:37:30.360041 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 13 00:37:30.371402 kubelet[2404]: E0313 00:37:30.371334 2404 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 13 00:37:30.371514 kubelet[2404]: I0313 00:37:30.371503 2404 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 00:37:30.371551 kubelet[2404]: I0313 00:37:30.371515 2404 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 00:37:30.372579 kubelet[2404]: I0313 00:37:30.372362 2404 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 00:37:30.373537 kubelet[2404]: E0313 00:37:30.373526 2404 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 13 00:37:30.373615 kubelet[2404]: E0313 00:37:30.373607 2404 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-n-86976195a3\" not found" Mar 13 00:37:30.439052 systemd[1]: Created slice kubepods-burstable-podb3296c0246d369a29c26aaca40ad972b.slice - libcontainer container kubepods-burstable-podb3296c0246d369a29c26aaca40ad972b.slice. Mar 13 00:37:30.456516 kubelet[2404]: E0313 00:37:30.456416 2404 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.461095 systemd[1]: Created slice kubepods-burstable-podfdaa2810c038f7fa0ac8293411307a64.slice - libcontainer container kubepods-burstable-podfdaa2810c038f7fa0ac8293411307a64.slice. 
Mar 13 00:37:30.475703 kubelet[2404]: E0313 00:37:30.475677 2404 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.477127 kubelet[2404]: I0313 00:37:30.477105 2404 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.477948 kubelet[2404]: E0313 00:37:30.477890 2404 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://89.167.87.208:6443/api/v1/nodes\": dial tcp 89.167.87.208:6443: connect: connection refused" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.481260 systemd[1]: Created slice kubepods-burstable-podff3219d6ce92562d13b62107d343d221.slice - libcontainer container kubepods-burstable-podff3219d6ce92562d13b62107d343d221.slice. Mar 13 00:37:30.484307 kubelet[2404]: E0313 00:37:30.484276 2404 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.498519 kubelet[2404]: I0313 00:37:30.497603 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.498519 kubelet[2404]: I0313 00:37:30.497679 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 
00:37:30.498519 kubelet[2404]: E0313 00:37:30.497700 2404 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.87.208:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-86976195a3?timeout=10s\": dial tcp 89.167.87.208:6443: connect: connection refused" interval="400ms" Mar 13 00:37:30.498519 kubelet[2404]: I0313 00:37:30.498316 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.498519 kubelet[2404]: I0313 00:37:30.498401 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.498764 kubelet[2404]: I0313 00:37:30.498427 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.498764 kubelet[2404]: I0313 00:37:30.498530 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ff3219d6ce92562d13b62107d343d221-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-86976195a3\" (UID: 
\"ff3219d6ce92562d13b62107d343d221\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.498764 kubelet[2404]: I0313 00:37:30.498596 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b3296c0246d369a29c26aaca40ad972b-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-86976195a3\" (UID: \"b3296c0246d369a29c26aaca40ad972b\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.498764 kubelet[2404]: I0313 00:37:30.498617 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b3296c0246d369a29c26aaca40ad972b-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-86976195a3\" (UID: \"b3296c0246d369a29c26aaca40ad972b\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.498764 kubelet[2404]: I0313 00:37:30.498681 2404 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b3296c0246d369a29c26aaca40ad972b-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-86976195a3\" (UID: \"b3296c0246d369a29c26aaca40ad972b\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.680586 kubelet[2404]: I0313 00:37:30.680431 2404 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.681302 kubelet[2404]: E0313 00:37:30.680772 2404 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://89.167.87.208:6443/api/v1/nodes\": dial tcp 89.167.87.208:6443: connect: connection refused" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:30.758392 containerd[1633]: time="2026-03-13T00:37:30.758277073Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-86976195a3,Uid:b3296c0246d369a29c26aaca40ad972b,Namespace:kube-system,Attempt:0,}" Mar 13 00:37:30.773735 containerd[1633]: time="2026-03-13T00:37:30.773657876Z" level=info msg="connecting to shim d7c28fd48b3d109295be021f61212ec0016a4c85c38b34cf5f7ab6305ca9a26d" address="unix:///run/containerd/s/65cdfe13fdb7fd8924e02332945cad18fe5db6393092f6329495bbf3421bf82e" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:30.782002 containerd[1633]: time="2026-03-13T00:37:30.781973553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-86976195a3,Uid:fdaa2810c038f7fa0ac8293411307a64,Namespace:kube-system,Attempt:0,}" Mar 13 00:37:30.786579 containerd[1633]: time="2026-03-13T00:37:30.786350877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-86976195a3,Uid:ff3219d6ce92562d13b62107d343d221,Namespace:kube-system,Attempt:0,}" Mar 13 00:37:30.798824 systemd[1]: Started cri-containerd-d7c28fd48b3d109295be021f61212ec0016a4c85c38b34cf5f7ab6305ca9a26d.scope - libcontainer container d7c28fd48b3d109295be021f61212ec0016a4c85c38b34cf5f7ab6305ca9a26d. 
Mar 13 00:37:30.807338 containerd[1633]: time="2026-03-13T00:37:30.807287704Z" level=info msg="connecting to shim 9733e54e71f5bcb193c00e8fd100e318f12ab864e6ae2f902d76723f082d2677" address="unix:///run/containerd/s/fbba04a9df17e65e9fffb84e6f6c2eb2fa4c54999b54dab12a6224b9b8d6ba10" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:30.815601 containerd[1633]: time="2026-03-13T00:37:30.815274061Z" level=info msg="connecting to shim 5b75cf094cc77c77d668712426fa51a6afce040b8f5590ebd8f9e5606cb83ebe" address="unix:///run/containerd/s/504e720ce657650fa69c8108c526b54491ab2322dab91a41246a94533dc4fc43" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:30.843584 systemd[1]: Started cri-containerd-5b75cf094cc77c77d668712426fa51a6afce040b8f5590ebd8f9e5606cb83ebe.scope - libcontainer container 5b75cf094cc77c77d668712426fa51a6afce040b8f5590ebd8f9e5606cb83ebe. Mar 13 00:37:30.847221 systemd[1]: Started cri-containerd-9733e54e71f5bcb193c00e8fd100e318f12ab864e6ae2f902d76723f082d2677.scope - libcontainer container 9733e54e71f5bcb193c00e8fd100e318f12ab864e6ae2f902d76723f082d2677. 
Mar 13 00:37:30.880423 containerd[1633]: time="2026-03-13T00:37:30.880297465Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-86976195a3,Uid:b3296c0246d369a29c26aaca40ad972b,Namespace:kube-system,Attempt:0,} returns sandbox id \"d7c28fd48b3d109295be021f61212ec0016a4c85c38b34cf5f7ab6305ca9a26d\"" Mar 13 00:37:30.887390 containerd[1633]: time="2026-03-13T00:37:30.887251541Z" level=info msg="CreateContainer within sandbox \"d7c28fd48b3d109295be021f61212ec0016a4c85c38b34cf5f7ab6305ca9a26d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 13 00:37:30.898285 kubelet[2404]: E0313 00:37:30.898250 2404 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.87.208:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-86976195a3?timeout=10s\": dial tcp 89.167.87.208:6443: connect: connection refused" interval="800ms" Mar 13 00:37:30.899200 containerd[1633]: time="2026-03-13T00:37:30.899161311Z" level=info msg="Container 9cc353eca98e4ac9077bc5f3fd84395c1760597d92a30e5aa2e961e1a8c3f0b0: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:30.902003 containerd[1633]: time="2026-03-13T00:37:30.901968433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-86976195a3,Uid:fdaa2810c038f7fa0ac8293411307a64,Namespace:kube-system,Attempt:0,} returns sandbox id \"9733e54e71f5bcb193c00e8fd100e318f12ab864e6ae2f902d76723f082d2677\"" Mar 13 00:37:30.911571 containerd[1633]: time="2026-03-13T00:37:30.911516371Z" level=info msg="CreateContainer within sandbox \"9733e54e71f5bcb193c00e8fd100e318f12ab864e6ae2f902d76723f082d2677\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 13 00:37:30.912971 containerd[1633]: time="2026-03-13T00:37:30.912923522Z" level=info msg="CreateContainer within sandbox \"d7c28fd48b3d109295be021f61212ec0016a4c85c38b34cf5f7ab6305ca9a26d\" for 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9cc353eca98e4ac9077bc5f3fd84395c1760597d92a30e5aa2e961e1a8c3f0b0\"" Mar 13 00:37:30.914492 containerd[1633]: time="2026-03-13T00:37:30.914217383Z" level=info msg="StartContainer for \"9cc353eca98e4ac9077bc5f3fd84395c1760597d92a30e5aa2e961e1a8c3f0b0\"" Mar 13 00:37:30.915879 containerd[1633]: time="2026-03-13T00:37:30.915865365Z" level=info msg="connecting to shim 9cc353eca98e4ac9077bc5f3fd84395c1760597d92a30e5aa2e961e1a8c3f0b0" address="unix:///run/containerd/s/65cdfe13fdb7fd8924e02332945cad18fe5db6393092f6329495bbf3421bf82e" protocol=ttrpc version=3 Mar 13 00:37:30.921392 containerd[1633]: time="2026-03-13T00:37:30.921376479Z" level=info msg="Container 267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:30.929979 containerd[1633]: time="2026-03-13T00:37:30.929909266Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-86976195a3,Uid:ff3219d6ce92562d13b62107d343d221,Namespace:kube-system,Attempt:0,} returns sandbox id \"5b75cf094cc77c77d668712426fa51a6afce040b8f5590ebd8f9e5606cb83ebe\"" Mar 13 00:37:30.931350 containerd[1633]: time="2026-03-13T00:37:30.931309438Z" level=info msg="CreateContainer within sandbox \"9733e54e71f5bcb193c00e8fd100e318f12ab864e6ae2f902d76723f082d2677\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975\"" Mar 13 00:37:30.931843 containerd[1633]: time="2026-03-13T00:37:30.931768268Z" level=info msg="StartContainer for \"267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975\"" Mar 13 00:37:30.932644 systemd[1]: Started cri-containerd-9cc353eca98e4ac9077bc5f3fd84395c1760597d92a30e5aa2e961e1a8c3f0b0.scope - libcontainer container 9cc353eca98e4ac9077bc5f3fd84395c1760597d92a30e5aa2e961e1a8c3f0b0. 
Mar 13 00:37:30.933053 containerd[1633]: time="2026-03-13T00:37:30.933035999Z" level=info msg="connecting to shim 267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975" address="unix:///run/containerd/s/fbba04a9df17e65e9fffb84e6f6c2eb2fa4c54999b54dab12a6224b9b8d6ba10" protocol=ttrpc version=3 Mar 13 00:37:30.935681 containerd[1633]: time="2026-03-13T00:37:30.935665731Z" level=info msg="CreateContainer within sandbox \"5b75cf094cc77c77d668712426fa51a6afce040b8f5590ebd8f9e5606cb83ebe\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 13 00:37:30.943850 containerd[1633]: time="2026-03-13T00:37:30.943833728Z" level=info msg="Container 9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:30.952003 containerd[1633]: time="2026-03-13T00:37:30.951982845Z" level=info msg="CreateContainer within sandbox \"5b75cf094cc77c77d668712426fa51a6afce040b8f5590ebd8f9e5606cb83ebe\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead\"" Mar 13 00:37:30.957876 containerd[1633]: time="2026-03-13T00:37:30.956461969Z" level=info msg="StartContainer for \"9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead\"" Mar 13 00:37:30.959556 containerd[1633]: time="2026-03-13T00:37:30.959508441Z" level=info msg="connecting to shim 9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead" address="unix:///run/containerd/s/504e720ce657650fa69c8108c526b54491ab2322dab91a41246a94533dc4fc43" protocol=ttrpc version=3 Mar 13 00:37:30.960597 systemd[1]: Started cri-containerd-267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975.scope - libcontainer container 267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975. 
Mar 13 00:37:30.979612 systemd[1]: Started cri-containerd-9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead.scope - libcontainer container 9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead. Mar 13 00:37:31.010522 containerd[1633]: time="2026-03-13T00:37:31.010417723Z" level=info msg="StartContainer for \"9cc353eca98e4ac9077bc5f3fd84395c1760597d92a30e5aa2e961e1a8c3f0b0\" returns successfully" Mar 13 00:37:31.050816 containerd[1633]: time="2026-03-13T00:37:31.050787987Z" level=info msg="StartContainer for \"9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead\" returns successfully" Mar 13 00:37:31.069015 containerd[1633]: time="2026-03-13T00:37:31.068567652Z" level=info msg="StartContainer for \"267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975\" returns successfully" Mar 13 00:37:31.083672 kubelet[2404]: I0313 00:37:31.083644 2404 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:31.335035 kubelet[2404]: E0313 00:37:31.334831 2404 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:31.338298 kubelet[2404]: E0313 00:37:31.338166 2404 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:31.339137 kubelet[2404]: E0313 00:37:31.339119 2404 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.345269 kubelet[2404]: E0313 00:37:32.345109 2404 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.345852 
kubelet[2404]: E0313 00:37:32.345784 2404 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.353460 kubelet[2404]: E0313 00:37:32.353433 2404 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-n-86976195a3\" not found" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.424778 kubelet[2404]: I0313 00:37:32.424728 2404 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.424778 kubelet[2404]: E0313 00:37:32.424770 2404 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4459-2-4-n-86976195a3\": node \"ci-4459-2-4-n-86976195a3\" not found" Mar 13 00:37:32.497290 kubelet[2404]: I0313 00:37:32.497242 2404 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.504009 kubelet[2404]: E0313 00:37:32.503829 2404 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.504009 kubelet[2404]: I0313 00:37:32.503851 2404 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.505020 kubelet[2404]: E0313 00:37:32.504811 2404 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-86976195a3\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.505117 kubelet[2404]: I0313 00:37:32.505103 2404 kubelet.go:3309] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:32.507115 kubelet[2404]: E0313 00:37:32.507078 2404 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-86976195a3\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:33.286677 kubelet[2404]: I0313 00:37:33.286546 2404 apiserver.go:52] "Watching apiserver" Mar 13 00:37:33.296721 kubelet[2404]: I0313 00:37:33.296655 2404 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 13 00:37:33.341617 kubelet[2404]: I0313 00:37:33.341583 2404 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:34.489071 systemd[1]: Reload requested from client PID 2683 ('systemctl') (unit session-7.scope)... Mar 13 00:37:34.489411 systemd[1]: Reloading... Mar 13 00:37:34.594540 zram_generator::config[2729]: No configuration found. Mar 13 00:37:34.774802 systemd[1]: Reloading finished in 284 ms. Mar 13 00:37:34.798534 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:37:34.810256 systemd[1]: kubelet.service: Deactivated successfully. Mar 13 00:37:34.810460 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:37:34.810519 systemd[1]: kubelet.service: Consumed 1.097s CPU time, 130.7M memory peak. Mar 13 00:37:34.812988 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:37:34.973853 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 13 00:37:34.983019 (kubelet)[2778]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 13 00:37:35.016465 kubelet[2778]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 00:37:35.016465 kubelet[2778]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 13 00:37:35.016465 kubelet[2778]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 00:37:35.017489 kubelet[2778]: I0313 00:37:35.016862 2778 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 00:37:35.021421 kubelet[2778]: I0313 00:37:35.021407 2778 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 13 00:37:35.021511 kubelet[2778]: I0313 00:37:35.021503 2778 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 00:37:35.021669 kubelet[2778]: I0313 00:37:35.021660 2778 server.go:956] "Client rotation is on, will bootstrap in background" Mar 13 00:37:35.022525 kubelet[2778]: I0313 00:37:35.022509 2778 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 13 00:37:35.027540 kubelet[2778]: I0313 00:37:35.027447 2778 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 13 00:37:35.031562 kubelet[2778]: I0313 00:37:35.031551 2778 server.go:1446] "Using cgroup driver setting received from the CRI runtime" 
cgroupDriver="systemd" Mar 13 00:37:35.034603 kubelet[2778]: I0313 00:37:35.034589 2778 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 13 00:37:35.034872 kubelet[2778]: I0313 00:37:35.034854 2778 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 00:37:35.035008 kubelet[2778]: I0313 00:37:35.034916 2778 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-86976195a3","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":n
ull,"CgroupVersion":2} Mar 13 00:37:35.035102 kubelet[2778]: I0313 00:37:35.035094 2778 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 00:37:35.035130 kubelet[2778]: I0313 00:37:35.035125 2778 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 00:37:35.035197 kubelet[2778]: I0313 00:37:35.035192 2778 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:37:35.035407 kubelet[2778]: I0313 00:37:35.035399 2778 kubelet.go:480] "Attempting to sync node with API server" Mar 13 00:37:35.035757 kubelet[2778]: I0313 00:37:35.035748 2778 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 00:37:35.035830 kubelet[2778]: I0313 00:37:35.035823 2778 kubelet.go:386] "Adding apiserver pod source" Mar 13 00:37:35.035871 kubelet[2778]: I0313 00:37:35.035866 2778 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 00:37:35.037352 kubelet[2778]: I0313 00:37:35.036886 2778 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 13 00:37:35.037352 kubelet[2778]: I0313 00:37:35.037196 2778 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 13 00:37:35.041638 kubelet[2778]: I0313 00:37:35.041626 2778 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 13 00:37:35.041698 kubelet[2778]: I0313 00:37:35.041692 2778 server.go:1289] "Started kubelet" Mar 13 00:37:35.042941 kubelet[2778]: I0313 00:37:35.042928 2778 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 00:37:35.051873 kubelet[2778]: I0313 00:37:35.051848 2778 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 00:37:35.054425 kubelet[2778]: I0313 00:37:35.054414 2778 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 13 00:37:35.056056 kubelet[2778]: I0313 00:37:35.056045 2778 
server.go:317] "Adding debug handlers to kubelet server" Mar 13 00:37:35.057309 kubelet[2778]: E0313 00:37:35.057291 2778 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-86976195a3\" not found" Mar 13 00:37:35.057521 kubelet[2778]: I0313 00:37:35.057511 2778 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 13 00:37:35.057641 kubelet[2778]: I0313 00:37:35.057635 2778 reconciler.go:26] "Reconciler: start to sync state" Mar 13 00:37:35.063304 kubelet[2778]: I0313 00:37:35.063270 2778 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 00:37:35.063534 kubelet[2778]: I0313 00:37:35.063523 2778 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 00:37:35.064641 kubelet[2778]: I0313 00:37:35.064629 2778 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 13 00:37:35.066492 kubelet[2778]: I0313 00:37:35.065658 2778 factory.go:223] Registration of the systemd container factory successfully Mar 13 00:37:35.066492 kubelet[2778]: I0313 00:37:35.065723 2778 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 13 00:37:35.067101 kubelet[2778]: I0313 00:37:35.067088 2778 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 13 00:37:35.068118 kubelet[2778]: I0313 00:37:35.068107 2778 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Mar 13 00:37:35.068171 kubelet[2778]: I0313 00:37:35.068165 2778 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 13 00:37:35.068248 kubelet[2778]: I0313 00:37:35.068240 2778 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 13 00:37:35.068275 kubelet[2778]: I0313 00:37:35.068270 2778 kubelet.go:2436] "Starting kubelet main sync loop" Mar 13 00:37:35.068336 kubelet[2778]: E0313 00:37:35.068322 2778 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 00:37:35.075614 kubelet[2778]: I0313 00:37:35.075601 2778 factory.go:223] Registration of the containerd container factory successfully Mar 13 00:37:35.118505 kubelet[2778]: I0313 00:37:35.118468 2778 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 13 00:37:35.118626 kubelet[2778]: I0313 00:37:35.118619 2778 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 13 00:37:35.118666 kubelet[2778]: I0313 00:37:35.118660 2778 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:37:35.118797 kubelet[2778]: I0313 00:37:35.118788 2778 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 13 00:37:35.118840 kubelet[2778]: I0313 00:37:35.118826 2778 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 13 00:37:35.118951 kubelet[2778]: I0313 00:37:35.118860 2778 policy_none.go:49] "None policy: Start" Mar 13 00:37:35.118951 kubelet[2778]: I0313 00:37:35.118919 2778 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 13 00:37:35.118951 kubelet[2778]: I0313 00:37:35.118929 2778 state_mem.go:35] "Initializing new in-memory state store" Mar 13 00:37:35.119082 kubelet[2778]: I0313 00:37:35.119075 2778 state_mem.go:75] "Updated machine memory state" Mar 13 00:37:35.122715 kubelet[2778]: E0313 00:37:35.122686 2778 manager.go:517] "Failed to 
read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 13 00:37:35.123190 kubelet[2778]: I0313 00:37:35.122991 2778 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 00:37:35.123190 kubelet[2778]: I0313 00:37:35.123012 2778 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 00:37:35.123464 kubelet[2778]: I0313 00:37:35.123334 2778 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 00:37:35.127550 kubelet[2778]: E0313 00:37:35.127326 2778 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 13 00:37:35.169750 kubelet[2778]: I0313 00:37:35.169702 2778 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.169991 kubelet[2778]: I0313 00:37:35.169963 2778 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.170152 kubelet[2778]: I0313 00:37:35.169802 2778 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.177579 kubelet[2778]: E0313 00:37:35.177560 2778 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-86976195a3\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.227087 kubelet[2778]: I0313 00:37:35.227054 2778 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.236366 kubelet[2778]: I0313 00:37:35.236291 2778 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.236601 kubelet[2778]: I0313 00:37:35.236552 2778 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-86976195a3" 
Mar 13 00:37:35.359324 kubelet[2778]: I0313 00:37:35.359095 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b3296c0246d369a29c26aaca40ad972b-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-86976195a3\" (UID: \"b3296c0246d369a29c26aaca40ad972b\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.359324 kubelet[2778]: I0313 00:37:35.359143 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b3296c0246d369a29c26aaca40ad972b-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-86976195a3\" (UID: \"b3296c0246d369a29c26aaca40ad972b\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.359324 kubelet[2778]: I0313 00:37:35.359172 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.359324 kubelet[2778]: I0313 00:37:35.359199 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.359324 kubelet[2778]: I0313 00:37:35.359227 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.359962 kubelet[2778]: I0313 00:37:35.359250 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ff3219d6ce92562d13b62107d343d221-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-86976195a3\" (UID: \"ff3219d6ce92562d13b62107d343d221\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.359962 kubelet[2778]: I0313 00:37:35.359273 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b3296c0246d369a29c26aaca40ad972b-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-86976195a3\" (UID: \"b3296c0246d369a29c26aaca40ad972b\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.359962 kubelet[2778]: I0313 00:37:35.359296 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:35.359962 kubelet[2778]: I0313 00:37:35.359319 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fdaa2810c038f7fa0ac8293411307a64-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-86976195a3\" (UID: \"fdaa2810c038f7fa0ac8293411307a64\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" Mar 13 00:37:36.041996 kubelet[2778]: I0313 
00:37:36.041958 2778 apiserver.go:52] "Watching apiserver" Mar 13 00:37:36.057773 kubelet[2778]: I0313 00:37:36.057719 2778 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 13 00:37:36.099916 kubelet[2778]: I0313 00:37:36.099878 2778 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:36.101182 kubelet[2778]: I0313 00:37:36.100975 2778 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:36.108743 kubelet[2778]: E0313 00:37:36.108690 2778 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-86976195a3\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" Mar 13 00:37:36.110133 kubelet[2778]: E0313 00:37:36.110107 2778 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-86976195a3\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" Mar 13 00:37:36.116395 kubelet[2778]: I0313 00:37:36.116342 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-n-86976195a3" podStartSLOduration=3.116329937 podStartE2EDuration="3.116329937s" podCreationTimestamp="2026-03-13 00:37:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:37:36.115962487 +0000 UTC m=+1.128917822" watchObservedRunningTime="2026-03-13 00:37:36.116329937 +0000 UTC m=+1.129285272" Mar 13 00:37:36.123456 kubelet[2778]: I0313 00:37:36.123261 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" podStartSLOduration=1.123249813 podStartE2EDuration="1.123249813s" podCreationTimestamp="2026-03-13 00:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:37:36.122717712 +0000 UTC m=+1.135673057" watchObservedRunningTime="2026-03-13 00:37:36.123249813 +0000 UTC m=+1.136205158" Mar 13 00:37:36.133444 kubelet[2778]: I0313 00:37:36.133384 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-86976195a3" podStartSLOduration=1.133368471 podStartE2EDuration="1.133368471s" podCreationTimestamp="2026-03-13 00:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:37:36.133123821 +0000 UTC m=+1.146079166" watchObservedRunningTime="2026-03-13 00:37:36.133368471 +0000 UTC m=+1.146323806" Mar 13 00:37:39.197181 kubelet[2778]: I0313 00:37:39.197152 2778 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 13 00:37:39.197772 containerd[1633]: time="2026-03-13T00:37:39.197356614Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 13 00:37:39.198165 kubelet[2778]: I0313 00:37:39.197802 2778 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 13 00:37:39.688874 systemd[1]: Started sshd@7-89.167.87.208:22-60.244.155.109:37982.service - OpenSSH per-connection server daemon (60.244.155.109:37982). Mar 13 00:37:40.283356 systemd[1]: Created slice kubepods-besteffort-podf075d69d_f494_4bef_a29e_6fffedccef00.slice - libcontainer container kubepods-besteffort-podf075d69d_f494_4bef_a29e_6fffedccef00.slice. 
Mar 13 00:37:40.295773 kubelet[2778]: I0313 00:37:40.295693 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr722\" (UniqueName: \"kubernetes.io/projected/f075d69d-f494-4bef-a29e-6fffedccef00-kube-api-access-mr722\") pod \"kube-proxy-cjd9p\" (UID: \"f075d69d-f494-4bef-a29e-6fffedccef00\") " pod="kube-system/kube-proxy-cjd9p" Mar 13 00:37:40.295773 kubelet[2778]: I0313 00:37:40.295737 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f075d69d-f494-4bef-a29e-6fffedccef00-kube-proxy\") pod \"kube-proxy-cjd9p\" (UID: \"f075d69d-f494-4bef-a29e-6fffedccef00\") " pod="kube-system/kube-proxy-cjd9p" Mar 13 00:37:40.295773 kubelet[2778]: I0313 00:37:40.295764 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f075d69d-f494-4bef-a29e-6fffedccef00-lib-modules\") pod \"kube-proxy-cjd9p\" (UID: \"f075d69d-f494-4bef-a29e-6fffedccef00\") " pod="kube-system/kube-proxy-cjd9p" Mar 13 00:37:40.295773 kubelet[2778]: I0313 00:37:40.295776 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f075d69d-f494-4bef-a29e-6fffedccef00-xtables-lock\") pod \"kube-proxy-cjd9p\" (UID: \"f075d69d-f494-4bef-a29e-6fffedccef00\") " pod="kube-system/kube-proxy-cjd9p" Mar 13 00:37:40.470242 systemd[1]: Created slice kubepods-besteffort-pod233a5fff_dac5_4b37_94e2_164ed589f546.slice - libcontainer container kubepods-besteffort-pod233a5fff_dac5_4b37_94e2_164ed589f546.slice. 
Mar 13 00:37:40.497212 kubelet[2778]: I0313 00:37:40.497156 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/233a5fff-dac5-4b37-94e2-164ed589f546-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-gpvgn\" (UID: \"233a5fff-dac5-4b37-94e2-164ed589f546\") " pod="tigera-operator/tigera-operator-6bf85f8dd-gpvgn" Mar 13 00:37:40.497212 kubelet[2778]: I0313 00:37:40.497212 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z8b7\" (UniqueName: \"kubernetes.io/projected/233a5fff-dac5-4b37-94e2-164ed589f546-kube-api-access-7z8b7\") pod \"tigera-operator-6bf85f8dd-gpvgn\" (UID: \"233a5fff-dac5-4b37-94e2-164ed589f546\") " pod="tigera-operator/tigera-operator-6bf85f8dd-gpvgn" Mar 13 00:37:40.591632 containerd[1633]: time="2026-03-13T00:37:40.591297060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cjd9p,Uid:f075d69d-f494-4bef-a29e-6fffedccef00,Namespace:kube-system,Attempt:0,}" Mar 13 00:37:40.628518 containerd[1633]: time="2026-03-13T00:37:40.627836549Z" level=info msg="connecting to shim 19421ed74ce7a5effa0688f67c682b2247c309810236d5f55ad81395bd68d429" address="unix:///run/containerd/s/56ee561e151dddc3b05e2a253c8b19b3519a42fe0bfffcd0d0b8edf71b3c63bd" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:40.671593 systemd[1]: Started cri-containerd-19421ed74ce7a5effa0688f67c682b2247c309810236d5f55ad81395bd68d429.scope - libcontainer container 19421ed74ce7a5effa0688f67c682b2247c309810236d5f55ad81395bd68d429. 
Mar 13 00:37:40.702885 containerd[1633]: time="2026-03-13T00:37:40.702857543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cjd9p,Uid:f075d69d-f494-4bef-a29e-6fffedccef00,Namespace:kube-system,Attempt:0,} returns sandbox id \"19421ed74ce7a5effa0688f67c682b2247c309810236d5f55ad81395bd68d429\"" Mar 13 00:37:40.707293 containerd[1633]: time="2026-03-13T00:37:40.707256192Z" level=info msg="CreateContainer within sandbox \"19421ed74ce7a5effa0688f67c682b2247c309810236d5f55ad81395bd68d429\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 13 00:37:40.717903 containerd[1633]: time="2026-03-13T00:37:40.716514469Z" level=info msg="Container af9712b63d8433613ccb43daafd08cccb9bc588d81049b2bf7c2a4afe1f354a3: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:40.722671 containerd[1633]: time="2026-03-13T00:37:40.722653117Z" level=info msg="CreateContainer within sandbox \"19421ed74ce7a5effa0688f67c682b2247c309810236d5f55ad81395bd68d429\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"af9712b63d8433613ccb43daafd08cccb9bc588d81049b2bf7c2a4afe1f354a3\"" Mar 13 00:37:40.723335 containerd[1633]: time="2026-03-13T00:37:40.723318942Z" level=info msg="StartContainer for \"af9712b63d8433613ccb43daafd08cccb9bc588d81049b2bf7c2a4afe1f354a3\"" Mar 13 00:37:40.724258 containerd[1633]: time="2026-03-13T00:37:40.724234943Z" level=info msg="connecting to shim af9712b63d8433613ccb43daafd08cccb9bc588d81049b2bf7c2a4afe1f354a3" address="unix:///run/containerd/s/56ee561e151dddc3b05e2a253c8b19b3519a42fe0bfffcd0d0b8edf71b3c63bd" protocol=ttrpc version=3 Mar 13 00:37:40.743843 systemd[1]: Started cri-containerd-af9712b63d8433613ccb43daafd08cccb9bc588d81049b2bf7c2a4afe1f354a3.scope - libcontainer container af9712b63d8433613ccb43daafd08cccb9bc588d81049b2bf7c2a4afe1f354a3. Mar 13 00:37:40.774660 systemd[1]: Started sshd@8-89.167.87.208:22-125.91.16.250:38874.service - OpenSSH per-connection server daemon (125.91.16.250:38874). 
Mar 13 00:37:40.777261 containerd[1633]: time="2026-03-13T00:37:40.776819853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-gpvgn,Uid:233a5fff-dac5-4b37-94e2-164ed589f546,Namespace:tigera-operator,Attempt:0,}" Mar 13 00:37:40.795635 containerd[1633]: time="2026-03-13T00:37:40.795591034Z" level=info msg="connecting to shim 653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312" address="unix:///run/containerd/s/8666144c9d49d8abfd125abd2c04920ac5b8771514a392bfa701b75e6f680650" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:40.818700 systemd[1]: Started cri-containerd-653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312.scope - libcontainer container 653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312. Mar 13 00:37:40.823452 containerd[1633]: time="2026-03-13T00:37:40.823425068Z" level=info msg="StartContainer for \"af9712b63d8433613ccb43daafd08cccb9bc588d81049b2bf7c2a4afe1f354a3\" returns successfully" Mar 13 00:37:40.866625 containerd[1633]: time="2026-03-13T00:37:40.866506325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-gpvgn,Uid:233a5fff-dac5-4b37-94e2-164ed589f546,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312\"" Mar 13 00:37:40.868726 containerd[1633]: time="2026-03-13T00:37:40.868622952Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 13 00:37:41.091137 sshd[2824]: Invalid user inversiones from 60.244.155.109 port 37982 Mar 13 00:37:41.130749 kubelet[2778]: I0313 00:37:41.130569 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-cjd9p" podStartSLOduration=1.130555365 podStartE2EDuration="1.130555365s" podCreationTimestamp="2026-03-13 00:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
00:37:41.129305528 +0000 UTC m=+6.142260883" watchObservedRunningTime="2026-03-13 00:37:41.130555365 +0000 UTC m=+6.143510710" Mar 13 00:37:41.354547 sshd[2824]: Received disconnect from 60.244.155.109 port 37982:11: Bye Bye [preauth] Mar 13 00:37:41.354547 sshd[2824]: Disconnected from invalid user inversiones 60.244.155.109 port 37982 [preauth] Mar 13 00:37:41.358158 systemd[1]: sshd@7-89.167.87.208:22-60.244.155.109:37982.service: Deactivated successfully. Mar 13 00:37:41.422328 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2587224332.mount: Deactivated successfully. Mar 13 00:37:42.859726 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1947941588.mount: Deactivated successfully. Mar 13 00:37:42.893119 sshd[2894]: Invalid user ts3 from 125.91.16.250 port 38874 Mar 13 00:37:43.103947 sshd[2894]: Received disconnect from 125.91.16.250 port 38874:11: Bye Bye [preauth] Mar 13 00:37:43.104176 sshd[2894]: Disconnected from invalid user ts3 125.91.16.250 port 38874 [preauth] Mar 13 00:37:43.106602 systemd[1]: sshd@8-89.167.87.208:22-125.91.16.250:38874.service: Deactivated successfully. 
Mar 13 00:37:43.514420 containerd[1633]: time="2026-03-13T00:37:43.514374491Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:43.515574 containerd[1633]: time="2026-03-13T00:37:43.515489801Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 13 00:37:43.516492 containerd[1633]: time="2026-03-13T00:37:43.516458930Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:43.518311 containerd[1633]: time="2026-03-13T00:37:43.518291243Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:43.518734 containerd[1633]: time="2026-03-13T00:37:43.518718012Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.650032288s" Mar 13 00:37:43.518845 containerd[1633]: time="2026-03-13T00:37:43.518781883Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 13 00:37:43.522526 containerd[1633]: time="2026-03-13T00:37:43.522500302Z" level=info msg="CreateContainer within sandbox \"653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 13 00:37:43.531151 containerd[1633]: time="2026-03-13T00:37:43.529880590Z" level=info msg="Container 
8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:43.533411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2847855075.mount: Deactivated successfully. Mar 13 00:37:43.547275 containerd[1633]: time="2026-03-13T00:37:43.547234193Z" level=info msg="CreateContainer within sandbox \"653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f\"" Mar 13 00:37:43.547755 containerd[1633]: time="2026-03-13T00:37:43.547730502Z" level=info msg="StartContainer for \"8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f\"" Mar 13 00:37:43.548734 containerd[1633]: time="2026-03-13T00:37:43.548633680Z" level=info msg="connecting to shim 8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f" address="unix:///run/containerd/s/8666144c9d49d8abfd125abd2c04920ac5b8771514a392bfa701b75e6f680650" protocol=ttrpc version=3 Mar 13 00:37:43.568607 systemd[1]: Started cri-containerd-8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f.scope - libcontainer container 8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f. 
Mar 13 00:37:43.594391 containerd[1633]: time="2026-03-13T00:37:43.594287850Z" level=info msg="StartContainer for \"8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f\" returns successfully"
Mar 13 00:37:47.326914 kubelet[2778]: I0313 00:37:47.326549 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-gpvgn" podStartSLOduration=4.675075786 podStartE2EDuration="7.326529113s" podCreationTimestamp="2026-03-13 00:37:40 +0000 UTC" firstStartedPulling="2026-03-13 00:37:40.867953727 +0000 UTC m=+5.880909072" lastFinishedPulling="2026-03-13 00:37:43.519407064 +0000 UTC m=+8.532362399" observedRunningTime="2026-03-13 00:37:44.136929042 +0000 UTC m=+9.149884387" watchObservedRunningTime="2026-03-13 00:37:47.326529113 +0000 UTC m=+12.339484488"
Mar 13 00:37:48.752714 sudo[1838]: pam_unix(sudo:session): session closed for user root
Mar 13 00:37:48.873207 sshd[1837]: Connection closed by 4.153.228.146 port 36976
Mar 13 00:37:48.874805 sshd-session[1834]: pam_unix(sshd:session): session closed for user core
Mar 13 00:37:48.878901 systemd[1]: sshd@6-89.167.87.208:22-4.153.228.146:36976.service: Deactivated successfully.
Mar 13 00:37:48.881754 systemd[1]: session-7.scope: Deactivated successfully.
Mar 13 00:37:48.882106 systemd[1]: session-7.scope: Consumed 4.148s CPU time, 234.5M memory peak.
Mar 13 00:37:48.884648 systemd-logind[1602]: Session 7 logged out. Waiting for processes to exit.
Mar 13 00:37:48.887166 systemd-logind[1602]: Removed session 7.
Mar 13 00:37:50.477517 systemd[1]: Created slice kubepods-besteffort-podcffbf933_c7cf_44d7_b429_cec7eaf8bb5c.slice - libcontainer container kubepods-besteffort-podcffbf933_c7cf_44d7_b429_cec7eaf8bb5c.slice.
Mar 13 00:37:50.551283 systemd[1]: Created slice kubepods-besteffort-podfd1ec457_df47_4303_a013_e8c88c06aec1.slice - libcontainer container kubepods-besteffort-podfd1ec457_df47_4303_a013_e8c88c06aec1.slice.
Mar 13 00:37:50.561055 kubelet[2778]: I0313 00:37:50.561020 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-flexvol-driver-host\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.561055 kubelet[2778]: I0313 00:37:50.561053 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-policysync\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.561055 kubelet[2778]: I0313 00:37:50.561067 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-cni-bin-dir\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.561440 kubelet[2778]: I0313 00:37:50.561079 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd1ec457-df47-4303-a013-e8c88c06aec1-tigera-ca-bundle\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.561440 kubelet[2778]: I0313 00:37:50.561091 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-xtables-lock\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.561440 kubelet[2778]: I0313 00:37:50.561104 2778 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-cni-net-dir\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.561440 kubelet[2778]: I0313 00:37:50.561128 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fd1ec457-df47-4303-a013-e8c88c06aec1-node-certs\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.561440 kubelet[2778]: I0313 00:37:50.561139 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-sys-fs\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.562198 kubelet[2778]: I0313 00:37:50.561148 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-var-run-calico\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.562198 kubelet[2778]: I0313 00:37:50.561159 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4q6l\" (UniqueName: \"kubernetes.io/projected/fd1ec457-df47-4303-a013-e8c88c06aec1-kube-api-access-s4q6l\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.562198 kubelet[2778]: I0313 00:37:50.561170 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-cni-log-dir\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.562198 kubelet[2778]: I0313 00:37:50.561182 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-nodeproc\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.562198 kubelet[2778]: I0313 00:37:50.561195 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-var-lib-calico\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.562321 kubelet[2778]: I0313 00:37:50.561206 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/cffbf933-c7cf-44d7-b429-cec7eaf8bb5c-typha-certs\") pod \"calico-typha-784cdff998-fk6dw\" (UID: \"cffbf933-c7cf-44d7-b429-cec7eaf8bb5c\") " pod="calico-system/calico-typha-784cdff998-fk6dw" Mar 13 00:37:50.562321 kubelet[2778]: I0313 00:37:50.561216 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-lib-modules\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.562321 kubelet[2778]: I0313 00:37:50.561234 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cffbf933-c7cf-44d7-b429-cec7eaf8bb5c-tigera-ca-bundle\") pod \"calico-typha-784cdff998-fk6dw\" (UID: \"cffbf933-c7cf-44d7-b429-cec7eaf8bb5c\") " pod="calico-system/calico-typha-784cdff998-fk6dw" Mar 13 00:37:50.562321 kubelet[2778]: I0313 00:37:50.561244 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cthx\" (UniqueName: \"kubernetes.io/projected/cffbf933-c7cf-44d7-b429-cec7eaf8bb5c-kube-api-access-5cthx\") pod \"calico-typha-784cdff998-fk6dw\" (UID: \"cffbf933-c7cf-44d7-b429-cec7eaf8bb5c\") " pod="calico-system/calico-typha-784cdff998-fk6dw" Mar 13 00:37:50.562321 kubelet[2778]: I0313 00:37:50.561255 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/fd1ec457-df47-4303-a013-e8c88c06aec1-bpffs\") pod \"calico-node-mxq97\" (UID: \"fd1ec457-df47-4303-a013-e8c88c06aec1\") " pod="calico-system/calico-node-mxq97" Mar 13 00:37:50.656438 kubelet[2778]: E0313 00:37:50.656386 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jxzz" podUID="61d0fe9a-7940-4074-923f-15bb8359aff2" Mar 13 00:37:50.669550 kubelet[2778]: E0313 00:37:50.668652 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.669550 kubelet[2778]: W0313 00:37:50.668679 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.669550 kubelet[2778]: E0313 00:37:50.668699 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin 
from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.669550 kubelet[2778]: E0313 00:37:50.668880 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.669550 kubelet[2778]: W0313 00:37:50.668889 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.669550 kubelet[2778]: E0313 00:37:50.668899 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.669550 kubelet[2778]: E0313 00:37:50.669053 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.669550 kubelet[2778]: W0313 00:37:50.669061 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.669550 kubelet[2778]: E0313 00:37:50.669069 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.670580 kubelet[2778]: E0313 00:37:50.670544 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.670580 kubelet[2778]: W0313 00:37:50.670560 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.670580 kubelet[2778]: E0313 00:37:50.670570 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.672566 kubelet[2778]: E0313 00:37:50.671763 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.672566 kubelet[2778]: W0313 00:37:50.671772 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.672566 kubelet[2778]: E0313 00:37:50.671782 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.679555 kubelet[2778]: E0313 00:37:50.679522 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.680141 kubelet[2778]: W0313 00:37:50.679620 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.680141 kubelet[2778]: E0313 00:37:50.679756 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.695095 kubelet[2778]: E0313 00:37:50.695058 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.695095 kubelet[2778]: W0313 00:37:50.695084 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.695095 kubelet[2778]: E0313 00:37:50.695102 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.709772 kubelet[2778]: E0313 00:37:50.709727 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.709772 kubelet[2778]: W0313 00:37:50.709752 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.709772 kubelet[2778]: E0313 00:37:50.709767 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.750952 kubelet[2778]: E0313 00:37:50.750826 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.750952 kubelet[2778]: W0313 00:37:50.750852 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.750952 kubelet[2778]: E0313 00:37:50.750871 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.752016 kubelet[2778]: E0313 00:37:50.751061 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.752016 kubelet[2778]: W0313 00:37:50.751071 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.752016 kubelet[2778]: E0313 00:37:50.751079 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.752016 kubelet[2778]: E0313 00:37:50.751230 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.752016 kubelet[2778]: W0313 00:37:50.751237 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.752016 kubelet[2778]: E0313 00:37:50.751244 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.752016 kubelet[2778]: E0313 00:37:50.751404 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.752016 kubelet[2778]: W0313 00:37:50.751408 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.752016 kubelet[2778]: E0313 00:37:50.751414 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.752016 kubelet[2778]: E0313 00:37:50.751578 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.752427 kubelet[2778]: W0313 00:37:50.751585 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.752427 kubelet[2778]: E0313 00:37:50.751593 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.752427 kubelet[2778]: E0313 00:37:50.751738 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.752427 kubelet[2778]: W0313 00:37:50.751743 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.752427 kubelet[2778]: E0313 00:37:50.751749 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.752427 kubelet[2778]: E0313 00:37:50.751880 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.752427 kubelet[2778]: W0313 00:37:50.751885 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.752427 kubelet[2778]: E0313 00:37:50.751892 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.752427 kubelet[2778]: E0313 00:37:50.752049 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.752427 kubelet[2778]: W0313 00:37:50.752056 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753008 kubelet[2778]: E0313 00:37:50.752062 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.753008 kubelet[2778]: E0313 00:37:50.752244 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.753008 kubelet[2778]: W0313 00:37:50.752250 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753008 kubelet[2778]: E0313 00:37:50.752257 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.753008 kubelet[2778]: E0313 00:37:50.752401 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.753008 kubelet[2778]: W0313 00:37:50.752407 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753008 kubelet[2778]: E0313 00:37:50.752412 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.753008 kubelet[2778]: E0313 00:37:50.752572 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.753008 kubelet[2778]: W0313 00:37:50.752578 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753008 kubelet[2778]: E0313 00:37:50.752583 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.753363 kubelet[2778]: E0313 00:37:50.752708 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.753363 kubelet[2778]: W0313 00:37:50.752713 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753363 kubelet[2778]: E0313 00:37:50.752718 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.753363 kubelet[2778]: E0313 00:37:50.752914 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.753363 kubelet[2778]: W0313 00:37:50.752920 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753363 kubelet[2778]: E0313 00:37:50.752926 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.753363 kubelet[2778]: E0313 00:37:50.753098 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.753363 kubelet[2778]: W0313 00:37:50.753104 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753363 kubelet[2778]: E0313 00:37:50.753110 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.753363 kubelet[2778]: E0313 00:37:50.753281 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.753871 kubelet[2778]: W0313 00:37:50.753287 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753871 kubelet[2778]: E0313 00:37:50.753293 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.753871 kubelet[2778]: E0313 00:37:50.753429 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.753871 kubelet[2778]: W0313 00:37:50.753435 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.753871 kubelet[2778]: E0313 00:37:50.753442 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.754104 kubelet[2778]: E0313 00:37:50.753904 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.754104 kubelet[2778]: W0313 00:37:50.753911 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.754104 kubelet[2778]: E0313 00:37:50.753920 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.754104 kubelet[2778]: E0313 00:37:50.754082 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.754104 kubelet[2778]: W0313 00:37:50.754087 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.754104 kubelet[2778]: E0313 00:37:50.754094 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.754504 kubelet[2778]: E0313 00:37:50.754463 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.754504 kubelet[2778]: W0313 00:37:50.754502 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.754572 kubelet[2778]: E0313 00:37:50.754509 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.754709 kubelet[2778]: E0313 00:37:50.754657 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.754709 kubelet[2778]: W0313 00:37:50.754665 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.754709 kubelet[2778]: E0313 00:37:50.754671 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.763407 kubelet[2778]: E0313 00:37:50.763290 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.763407 kubelet[2778]: W0313 00:37:50.763327 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.763407 kubelet[2778]: E0313 00:37:50.763347 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.763407 kubelet[2778]: I0313 00:37:50.763379 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/61d0fe9a-7940-4074-923f-15bb8359aff2-registration-dir\") pod \"csi-node-driver-8jxzz\" (UID: \"61d0fe9a-7940-4074-923f-15bb8359aff2\") " pod="calico-system/csi-node-driver-8jxzz" Mar 13 00:37:50.763680 kubelet[2778]: E0313 00:37:50.763660 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.763680 kubelet[2778]: W0313 00:37:50.763670 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.763837 kubelet[2778]: E0313 00:37:50.763680 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.763837 kubelet[2778]: I0313 00:37:50.763726 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jqwm\" (UniqueName: \"kubernetes.io/projected/61d0fe9a-7940-4074-923f-15bb8359aff2-kube-api-access-2jqwm\") pod \"csi-node-driver-8jxzz\" (UID: \"61d0fe9a-7940-4074-923f-15bb8359aff2\") " pod="calico-system/csi-node-driver-8jxzz" Mar 13 00:37:50.764004 kubelet[2778]: E0313 00:37:50.763962 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.764004 kubelet[2778]: W0313 00:37:50.763982 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.764004 kubelet[2778]: E0313 00:37:50.763995 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.764244 kubelet[2778]: E0313 00:37:50.764207 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.764244 kubelet[2778]: W0313 00:37:50.764219 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.764244 kubelet[2778]: E0313 00:37:50.764228 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.764503 kubelet[2778]: E0313 00:37:50.764461 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.764549 kubelet[2778]: W0313 00:37:50.764510 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.764549 kubelet[2778]: E0313 00:37:50.764518 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.764605 kubelet[2778]: I0313 00:37:50.764553 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/61d0fe9a-7940-4074-923f-15bb8359aff2-varrun\") pod \"csi-node-driver-8jxzz\" (UID: \"61d0fe9a-7940-4074-923f-15bb8359aff2\") " pod="calico-system/csi-node-driver-8jxzz" Mar 13 00:37:50.764863 kubelet[2778]: E0313 00:37:50.764838 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.764863 kubelet[2778]: W0313 00:37:50.764851 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.764863 kubelet[2778]: E0313 00:37:50.764860 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.764960 kubelet[2778]: I0313 00:37:50.764883 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61d0fe9a-7940-4074-923f-15bb8359aff2-kubelet-dir\") pod \"csi-node-driver-8jxzz\" (UID: \"61d0fe9a-7940-4074-923f-15bb8359aff2\") " pod="calico-system/csi-node-driver-8jxzz" Mar 13 00:37:50.765135 kubelet[2778]: E0313 00:37:50.765101 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.765135 kubelet[2778]: W0313 00:37:50.765123 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.765135 kubelet[2778]: E0313 00:37:50.765132 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.765240 kubelet[2778]: I0313 00:37:50.765219 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/61d0fe9a-7940-4074-923f-15bb8359aff2-socket-dir\") pod \"csi-node-driver-8jxzz\" (UID: \"61d0fe9a-7940-4074-923f-15bb8359aff2\") " pod="calico-system/csi-node-driver-8jxzz" Mar 13 00:37:50.765417 kubelet[2778]: E0313 00:37:50.765391 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.765417 kubelet[2778]: W0313 00:37:50.765405 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.765417 kubelet[2778]: E0313 00:37:50.765413 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.765624 kubelet[2778]: E0313 00:37:50.765608 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.765624 kubelet[2778]: W0313 00:37:50.765620 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.765681 kubelet[2778]: E0313 00:37:50.765629 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.765894 kubelet[2778]: E0313 00:37:50.765865 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.765894 kubelet[2778]: W0313 00:37:50.765879 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.765894 kubelet[2778]: E0313 00:37:50.765888 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.766122 kubelet[2778]: E0313 00:37:50.766087 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.766122 kubelet[2778]: W0313 00:37:50.766099 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.766122 kubelet[2778]: E0313 00:37:50.766106 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.766375 kubelet[2778]: E0313 00:37:50.766353 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.766375 kubelet[2778]: W0313 00:37:50.766365 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.766375 kubelet[2778]: E0313 00:37:50.766371 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.766640 kubelet[2778]: E0313 00:37:50.766624 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.766640 kubelet[2778]: W0313 00:37:50.766635 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.766704 kubelet[2778]: E0313 00:37:50.766643 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.766907 kubelet[2778]: E0313 00:37:50.766889 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.766907 kubelet[2778]: W0313 00:37:50.766901 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.766969 kubelet[2778]: E0313 00:37:50.766910 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.767198 kubelet[2778]: E0313 00:37:50.767179 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.767198 kubelet[2778]: W0313 00:37:50.767191 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.767198 kubelet[2778]: E0313 00:37:50.767198 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.792932 containerd[1633]: time="2026-03-13T00:37:50.792876110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-784cdff998-fk6dw,Uid:cffbf933-c7cf-44d7-b429-cec7eaf8bb5c,Namespace:calico-system,Attempt:0,}" Mar 13 00:37:50.814758 containerd[1633]: time="2026-03-13T00:37:50.814716297Z" level=info msg="connecting to shim 4beb96c4b30a7f0705e1abdc8a4c3d492589ff1b18646f02595657ab44263cb6" address="unix:///run/containerd/s/6d77f6f36a486990dab3406bd3463a1c323d5ffee740de7c7c4b242c9f7a80c3" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:50.840652 systemd[1]: Started cri-containerd-4beb96c4b30a7f0705e1abdc8a4c3d492589ff1b18646f02595657ab44263cb6.scope - libcontainer container 4beb96c4b30a7f0705e1abdc8a4c3d492589ff1b18646f02595657ab44263cb6. Mar 13 00:37:50.856904 containerd[1633]: time="2026-03-13T00:37:50.856850949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mxq97,Uid:fd1ec457-df47-4303-a013-e8c88c06aec1,Namespace:calico-system,Attempt:0,}" Mar 13 00:37:50.867495 kubelet[2778]: E0313 00:37:50.866195 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.867495 kubelet[2778]: W0313 00:37:50.866259 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.867495 kubelet[2778]: E0313 00:37:50.866276 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.867495 kubelet[2778]: E0313 00:37:50.866573 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.867495 kubelet[2778]: W0313 00:37:50.866665 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.867495 kubelet[2778]: E0313 00:37:50.866674 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.867495 kubelet[2778]: E0313 00:37:50.867044 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.867495 kubelet[2778]: W0313 00:37:50.867054 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.867495 kubelet[2778]: E0313 00:37:50.867064 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.867495 kubelet[2778]: E0313 00:37:50.867403 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.867880 kubelet[2778]: W0313 00:37:50.867411 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.867880 kubelet[2778]: E0313 00:37:50.867421 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.867880 kubelet[2778]: E0313 00:37:50.867856 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.867880 kubelet[2778]: W0313 00:37:50.867866 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.867880 kubelet[2778]: E0313 00:37:50.867876 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.868271 kubelet[2778]: E0313 00:37:50.868245 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.868361 kubelet[2778]: W0313 00:37:50.868345 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.868416 kubelet[2778]: E0313 00:37:50.868405 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.868781 kubelet[2778]: E0313 00:37:50.868744 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.868781 kubelet[2778]: W0313 00:37:50.868756 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.868781 kubelet[2778]: E0313 00:37:50.868765 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.869235 kubelet[2778]: E0313 00:37:50.869141 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.869235 kubelet[2778]: W0313 00:37:50.869152 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.869235 kubelet[2778]: E0313 00:37:50.869161 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.869694 kubelet[2778]: E0313 00:37:50.869659 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.869694 kubelet[2778]: W0313 00:37:50.869670 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.869694 kubelet[2778]: E0313 00:37:50.869679 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.870075 kubelet[2778]: E0313 00:37:50.870043 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.870075 kubelet[2778]: W0313 00:37:50.870054 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.870075 kubelet[2778]: E0313 00:37:50.870062 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.870405 kubelet[2778]: E0313 00:37:50.870374 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.870405 kubelet[2778]: W0313 00:37:50.870384 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.870405 kubelet[2778]: E0313 00:37:50.870393 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.870799 kubelet[2778]: E0313 00:37:50.870737 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.870799 kubelet[2778]: W0313 00:37:50.870747 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.870799 kubelet[2778]: E0313 00:37:50.870755 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.871123 kubelet[2778]: E0313 00:37:50.871043 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.871123 kubelet[2778]: W0313 00:37:50.871053 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.871123 kubelet[2778]: E0313 00:37:50.871062 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.874264 kubelet[2778]: E0313 00:37:50.874136 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.874264 kubelet[2778]: W0313 00:37:50.874148 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.874264 kubelet[2778]: E0313 00:37:50.874159 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.874548 kubelet[2778]: E0313 00:37:50.874416 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.874548 kubelet[2778]: W0313 00:37:50.874425 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.874548 kubelet[2778]: E0313 00:37:50.874433 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.874810 kubelet[2778]: E0313 00:37:50.874697 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.874810 kubelet[2778]: W0313 00:37:50.874706 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.874810 kubelet[2778]: E0313 00:37:50.874714 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.875308 kubelet[2778]: E0313 00:37:50.874945 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.875308 kubelet[2778]: W0313 00:37:50.874954 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.875308 kubelet[2778]: E0313 00:37:50.874962 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.875308 kubelet[2778]: E0313 00:37:50.875171 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.875308 kubelet[2778]: W0313 00:37:50.875180 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.875308 kubelet[2778]: E0313 00:37:50.875188 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.876852 kubelet[2778]: E0313 00:37:50.875546 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.876852 kubelet[2778]: W0313 00:37:50.875556 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.876852 kubelet[2778]: E0313 00:37:50.875565 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.876852 kubelet[2778]: E0313 00:37:50.875958 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.876852 kubelet[2778]: W0313 00:37:50.875967 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.876852 kubelet[2778]: E0313 00:37:50.875978 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.876852 kubelet[2778]: E0313 00:37:50.876263 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.876852 kubelet[2778]: W0313 00:37:50.876272 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.876852 kubelet[2778]: E0313 00:37:50.876282 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.876852 kubelet[2778]: E0313 00:37:50.876523 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.877147 kubelet[2778]: W0313 00:37:50.876530 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.877147 kubelet[2778]: E0313 00:37:50.876538 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.877147 kubelet[2778]: E0313 00:37:50.876823 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.877147 kubelet[2778]: W0313 00:37:50.876830 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.877147 kubelet[2778]: E0313 00:37:50.876838 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.879149 kubelet[2778]: E0313 00:37:50.878977 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.879149 kubelet[2778]: W0313 00:37:50.878988 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.879149 kubelet[2778]: E0313 00:37:50.878999 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.879425 kubelet[2778]: E0313 00:37:50.879387 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.879425 kubelet[2778]: W0313 00:37:50.879397 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.879425 kubelet[2778]: E0313 00:37:50.879406 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:50.888696 kubelet[2778]: E0313 00:37:50.888681 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:50.888813 kubelet[2778]: W0313 00:37:50.888803 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:50.888867 containerd[1633]: time="2026-03-13T00:37:50.888832988Z" level=info msg="connecting to shim 5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260" address="unix:///run/containerd/s/9d31bade4de5d081f36fa858e7b9702abbaa6cd01d83d37123ed3af2ae152355" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:50.888922 kubelet[2778]: E0313 00:37:50.888854 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:50.901936 containerd[1633]: time="2026-03-13T00:37:50.901889327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-784cdff998-fk6dw,Uid:cffbf933-c7cf-44d7-b429-cec7eaf8bb5c,Namespace:calico-system,Attempt:0,} returns sandbox id \"4beb96c4b30a7f0705e1abdc8a4c3d492589ff1b18646f02595657ab44263cb6\"" Mar 13 00:37:50.903622 containerd[1633]: time="2026-03-13T00:37:50.903602928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 13 00:37:50.915623 systemd[1]: Started cri-containerd-5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260.scope - libcontainer container 5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260. 
Mar 13 00:37:50.942891 containerd[1633]: time="2026-03-13T00:37:50.942804144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mxq97,Uid:fd1ec457-df47-4303-a013-e8c88c06aec1,Namespace:calico-system,Attempt:0,} returns sandbox id \"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\"" Mar 13 00:37:52.069603 kubelet[2778]: E0313 00:37:52.069501 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jxzz" podUID="61d0fe9a-7940-4074-923f-15bb8359aff2" Mar 13 00:37:52.806960 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount477063644.mount: Deactivated successfully. Mar 13 00:37:53.463168 update_engine[1607]: I20260313 00:37:53.463056 1607 update_attempter.cc:509] Updating boot flags... Mar 13 00:37:54.018023 containerd[1633]: time="2026-03-13T00:37:54.017970736Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:54.019142 containerd[1633]: time="2026-03-13T00:37:54.019022746Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 13 00:37:54.020113 containerd[1633]: time="2026-03-13T00:37:54.020088317Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:54.022530 containerd[1633]: time="2026-03-13T00:37:54.022020356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:54.022530 containerd[1633]: time="2026-03-13T00:37:54.022400089Z" level=info 
msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.118451317s" Mar 13 00:37:54.022530 containerd[1633]: time="2026-03-13T00:37:54.022427499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 13 00:37:54.024037 containerd[1633]: time="2026-03-13T00:37:54.023947633Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 13 00:37:54.039421 containerd[1633]: time="2026-03-13T00:37:54.039399132Z" level=info msg="CreateContainer within sandbox \"4beb96c4b30a7f0705e1abdc8a4c3d492589ff1b18646f02595657ab44263cb6\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 13 00:37:54.046676 containerd[1633]: time="2026-03-13T00:37:54.046579651Z" level=info msg="Container be9cf06943c015aea640d183124ef8d248d965ee8664725f59e299921d7289e3: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:54.056498 containerd[1633]: time="2026-03-13T00:37:54.056442025Z" level=info msg="CreateContainer within sandbox \"4beb96c4b30a7f0705e1abdc8a4c3d492589ff1b18646f02595657ab44263cb6\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"be9cf06943c015aea640d183124ef8d248d965ee8664725f59e299921d7289e3\"" Mar 13 00:37:54.057956 containerd[1633]: time="2026-03-13T00:37:54.056830819Z" level=info msg="StartContainer for \"be9cf06943c015aea640d183124ef8d248d965ee8664725f59e299921d7289e3\"" Mar 13 00:37:54.057956 containerd[1633]: time="2026-03-13T00:37:54.057657817Z" level=info msg="connecting to shim be9cf06943c015aea640d183124ef8d248d965ee8664725f59e299921d7289e3" 
address="unix:///run/containerd/s/6d77f6f36a486990dab3406bd3463a1c323d5ffee740de7c7c4b242c9f7a80c3" protocol=ttrpc version=3 Mar 13 00:37:54.069142 kubelet[2778]: E0313 00:37:54.069113 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jxzz" podUID="61d0fe9a-7940-4074-923f-15bb8359aff2" Mar 13 00:37:54.076735 systemd[1]: Started cri-containerd-be9cf06943c015aea640d183124ef8d248d965ee8664725f59e299921d7289e3.scope - libcontainer container be9cf06943c015aea640d183124ef8d248d965ee8664725f59e299921d7289e3. Mar 13 00:37:54.121684 containerd[1633]: time="2026-03-13T00:37:54.121648111Z" level=info msg="StartContainer for \"be9cf06943c015aea640d183124ef8d248d965ee8664725f59e299921d7289e3\" returns successfully" Mar 13 00:37:54.178985 kubelet[2778]: E0313 00:37:54.178950 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.178985 kubelet[2778]: W0313 00:37:54.178973 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.178985 kubelet[2778]: E0313 00:37:54.178991 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.179489 kubelet[2778]: E0313 00:37:54.179453 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.179588 kubelet[2778]: W0313 00:37:54.179554 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.179588 kubelet[2778]: E0313 00:37:54.179588 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.180223 kubelet[2778]: E0313 00:37:54.180199 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.180223 kubelet[2778]: W0313 00:37:54.180212 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.180223 kubelet[2778]: E0313 00:37:54.180221 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.182425 kubelet[2778]: E0313 00:37:54.181509 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.182425 kubelet[2778]: W0313 00:37:54.181530 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.182425 kubelet[2778]: E0313 00:37:54.181543 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.182425 kubelet[2778]: E0313 00:37:54.181767 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.182425 kubelet[2778]: W0313 00:37:54.181773 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.182425 kubelet[2778]: E0313 00:37:54.181779 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.182425 kubelet[2778]: E0313 00:37:54.181953 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.182425 kubelet[2778]: W0313 00:37:54.181959 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.182425 kubelet[2778]: E0313 00:37:54.181965 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.182425 kubelet[2778]: E0313 00:37:54.182117 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.182643 kubelet[2778]: W0313 00:37:54.182123 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.182643 kubelet[2778]: E0313 00:37:54.182128 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.182643 kubelet[2778]: E0313 00:37:54.182303 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.182643 kubelet[2778]: W0313 00:37:54.182308 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.182643 kubelet[2778]: E0313 00:37:54.182314 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.182643 kubelet[2778]: E0313 00:37:54.182498 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.182643 kubelet[2778]: W0313 00:37:54.182504 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.182643 kubelet[2778]: E0313 00:37:54.182510 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.183333 kubelet[2778]: E0313 00:37:54.183219 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.183333 kubelet[2778]: W0313 00:37:54.183238 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.183333 kubelet[2778]: E0313 00:37:54.183263 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.183897 kubelet[2778]: E0313 00:37:54.183879 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.184072 kubelet[2778]: W0313 00:37:54.183978 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.184072 kubelet[2778]: E0313 00:37:54.183989 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.184360 kubelet[2778]: E0313 00:37:54.184351 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.184405 kubelet[2778]: W0313 00:37:54.184397 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.184535 kubelet[2778]: E0313 00:37:54.184441 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.185055 kubelet[2778]: E0313 00:37:54.184916 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.185055 kubelet[2778]: W0313 00:37:54.184933 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.185055 kubelet[2778]: E0313 00:37:54.184942 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.185538 kubelet[2778]: E0313 00:37:54.185440 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.185538 kubelet[2778]: W0313 00:37:54.185448 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.185538 kubelet[2778]: E0313 00:37:54.185455 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.186025 kubelet[2778]: E0313 00:37:54.185906 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.186025 kubelet[2778]: W0313 00:37:54.185914 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.186137 kubelet[2778]: E0313 00:37:54.186084 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.196446 kubelet[2778]: E0313 00:37:54.196350 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.196446 kubelet[2778]: W0313 00:37:54.196361 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.196446 kubelet[2778]: E0313 00:37:54.196389 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.197355 kubelet[2778]: E0313 00:37:54.196725 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.197355 kubelet[2778]: W0313 00:37:54.196733 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.197355 kubelet[2778]: E0313 00:37:54.196741 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.197355 kubelet[2778]: E0313 00:37:54.196969 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.197355 kubelet[2778]: W0313 00:37:54.196982 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.197355 kubelet[2778]: E0313 00:37:54.196993 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.197355 kubelet[2778]: E0313 00:37:54.197198 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.197355 kubelet[2778]: W0313 00:37:54.197206 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.197355 kubelet[2778]: E0313 00:37:54.197212 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.197589 kubelet[2778]: E0313 00:37:54.197385 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.197589 kubelet[2778]: W0313 00:37:54.197392 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.197589 kubelet[2778]: E0313 00:37:54.197398 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.197788 kubelet[2778]: E0313 00:37:54.197769 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.197788 kubelet[2778]: W0313 00:37:54.197782 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.197788 kubelet[2778]: E0313 00:37:54.197789 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.198197 kubelet[2778]: E0313 00:37:54.198165 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.198197 kubelet[2778]: W0313 00:37:54.198177 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.198197 kubelet[2778]: E0313 00:37:54.198184 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.198509 kubelet[2778]: E0313 00:37:54.198493 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.198509 kubelet[2778]: W0313 00:37:54.198504 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.198551 kubelet[2778]: E0313 00:37:54.198511 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.198889 kubelet[2778]: E0313 00:37:54.198870 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.198889 kubelet[2778]: W0313 00:37:54.198884 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.198929 kubelet[2778]: E0313 00:37:54.198891 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.199531 kubelet[2778]: E0313 00:37:54.199512 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.199531 kubelet[2778]: W0313 00:37:54.199525 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.199578 kubelet[2778]: E0313 00:37:54.199541 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.200009 kubelet[2778]: E0313 00:37:54.199970 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.200009 kubelet[2778]: W0313 00:37:54.199995 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.200009 kubelet[2778]: E0313 00:37:54.200001 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.200335 kubelet[2778]: E0313 00:37:54.200315 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.200335 kubelet[2778]: W0313 00:37:54.200327 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.200335 kubelet[2778]: E0313 00:37:54.200335 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.201302 kubelet[2778]: E0313 00:37:54.201277 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.201302 kubelet[2778]: W0313 00:37:54.201292 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.201302 kubelet[2778]: E0313 00:37:54.201300 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.201520 kubelet[2778]: E0313 00:37:54.201504 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.201520 kubelet[2778]: W0313 00:37:54.201517 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.201563 kubelet[2778]: E0313 00:37:54.201524 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.201808 kubelet[2778]: E0313 00:37:54.201703 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.201808 kubelet[2778]: W0313 00:37:54.201712 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.201808 kubelet[2778]: E0313 00:37:54.201718 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.201891 kubelet[2778]: E0313 00:37:54.201886 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.201911 kubelet[2778]: W0313 00:37:54.201892 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.201911 kubelet[2778]: E0313 00:37:54.201898 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:54.202067 kubelet[2778]: E0313 00:37:54.202044 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.202067 kubelet[2778]: W0313 00:37:54.202055 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.202067 kubelet[2778]: E0313 00:37:54.202061 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:54.204119 kubelet[2778]: E0313 00:37:54.203581 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:54.204119 kubelet[2778]: W0313 00:37:54.203593 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:54.204119 kubelet[2778]: E0313 00:37:54.203601 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.152485 kubelet[2778]: I0313 00:37:55.152404 2778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:37:55.194104 kubelet[2778]: E0313 00:37:55.194077 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.194104 kubelet[2778]: W0313 00:37:55.194104 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.194232 kubelet[2778]: E0313 00:37:55.194129 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.194637 kubelet[2778]: E0313 00:37:55.194615 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.194686 kubelet[2778]: W0313 00:37:55.194636 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.194686 kubelet[2778]: E0313 00:37:55.194653 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.194961 kubelet[2778]: E0313 00:37:55.194930 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.195039 kubelet[2778]: W0313 00:37:55.195018 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.195068 kubelet[2778]: E0313 00:37:55.195039 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.195369 kubelet[2778]: E0313 00:37:55.195348 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.195411 kubelet[2778]: W0313 00:37:55.195367 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.195411 kubelet[2778]: E0313 00:37:55.195382 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.195707 kubelet[2778]: E0313 00:37:55.195686 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.195707 kubelet[2778]: W0313 00:37:55.195704 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.195767 kubelet[2778]: E0313 00:37:55.195718 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.195999 kubelet[2778]: E0313 00:37:55.195980 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.196024 kubelet[2778]: W0313 00:37:55.195998 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.196516 kubelet[2778]: E0313 00:37:55.196346 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.196740 kubelet[2778]: E0313 00:37:55.196719 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.196765 kubelet[2778]: W0313 00:37:55.196741 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.196781 kubelet[2778]: E0313 00:37:55.196767 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.197111 kubelet[2778]: E0313 00:37:55.197089 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.197132 kubelet[2778]: W0313 00:37:55.197111 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.197151 kubelet[2778]: E0313 00:37:55.197133 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.197539 kubelet[2778]: E0313 00:37:55.197521 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.197663 kubelet[2778]: W0313 00:37:55.197542 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.197663 kubelet[2778]: E0313 00:37:55.197556 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.197967 kubelet[2778]: E0313 00:37:55.197943 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.198004 kubelet[2778]: W0313 00:37:55.197971 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.198004 kubelet[2778]: E0313 00:37:55.197988 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.198351 kubelet[2778]: E0313 00:37:55.198332 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.198374 kubelet[2778]: W0313 00:37:55.198354 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.198374 kubelet[2778]: E0313 00:37:55.198368 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.198735 kubelet[2778]: E0313 00:37:55.198717 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.198775 kubelet[2778]: W0313 00:37:55.198738 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.198775 kubelet[2778]: E0313 00:37:55.198757 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.199403 kubelet[2778]: E0313 00:37:55.199053 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.199403 kubelet[2778]: W0313 00:37:55.199074 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.199403 kubelet[2778]: E0313 00:37:55.199086 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.199403 kubelet[2778]: E0313 00:37:55.199397 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.199528 kubelet[2778]: W0313 00:37:55.199408 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.199528 kubelet[2778]: E0313 00:37:55.199421 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.199784 kubelet[2778]: E0313 00:37:55.199766 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.199811 kubelet[2778]: W0313 00:37:55.199786 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.199811 kubelet[2778]: E0313 00:37:55.199799 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.203189 kubelet[2778]: E0313 00:37:55.203178 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.203252 kubelet[2778]: W0313 00:37:55.203244 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.203289 kubelet[2778]: E0313 00:37:55.203282 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.203614 kubelet[2778]: E0313 00:37:55.203592 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.203645 kubelet[2778]: W0313 00:37:55.203613 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.203645 kubelet[2778]: E0313 00:37:55.203630 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.203983 kubelet[2778]: E0313 00:37:55.203964 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.204010 kubelet[2778]: W0313 00:37:55.203983 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.204010 kubelet[2778]: E0313 00:37:55.203996 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.204460 kubelet[2778]: E0313 00:37:55.204439 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.204498 kubelet[2778]: W0313 00:37:55.204460 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.204523 kubelet[2778]: E0313 00:37:55.204512 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.204847 kubelet[2778]: E0313 00:37:55.204827 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.204847 kubelet[2778]: W0313 00:37:55.204846 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.204902 kubelet[2778]: E0313 00:37:55.204859 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.205181 kubelet[2778]: E0313 00:37:55.205150 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.205209 kubelet[2778]: W0313 00:37:55.205185 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.205209 kubelet[2778]: E0313 00:37:55.205199 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.205527 kubelet[2778]: E0313 00:37:55.205508 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.205556 kubelet[2778]: W0313 00:37:55.205527 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.205556 kubelet[2778]: E0313 00:37:55.205542 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.205988 kubelet[2778]: E0313 00:37:55.205966 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.206015 kubelet[2778]: W0313 00:37:55.205987 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.206015 kubelet[2778]: E0313 00:37:55.206001 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.206664 kubelet[2778]: E0313 00:37:55.206615 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.206697 kubelet[2778]: W0313 00:37:55.206686 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.206721 kubelet[2778]: E0313 00:37:55.206701 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.208088 kubelet[2778]: E0313 00:37:55.207975 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.208088 kubelet[2778]: W0313 00:37:55.207984 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.208088 kubelet[2778]: E0313 00:37:55.207991 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.208660 kubelet[2778]: E0313 00:37:55.208648 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.208806 kubelet[2778]: W0313 00:37:55.208712 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.208806 kubelet[2778]: E0313 00:37:55.208724 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.209126 kubelet[2778]: E0313 00:37:55.209019 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.209126 kubelet[2778]: W0313 00:37:55.209030 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.209126 kubelet[2778]: E0313 00:37:55.209040 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.209322 kubelet[2778]: E0313 00:37:55.209312 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.209355 kubelet[2778]: W0313 00:37:55.209348 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.209489 kubelet[2778]: E0313 00:37:55.209390 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.209722 kubelet[2778]: E0313 00:37:55.209713 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.209754 kubelet[2778]: W0313 00:37:55.209748 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.209782 kubelet[2778]: E0313 00:37:55.209776 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.210121 kubelet[2778]: E0313 00:37:55.210096 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.210154 kubelet[2778]: W0313 00:37:55.210121 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.210154 kubelet[2778]: E0313 00:37:55.210136 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.210671 kubelet[2778]: E0313 00:37:55.210624 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.210706 kubelet[2778]: W0313 00:37:55.210673 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.210736 kubelet[2778]: E0313 00:37:55.210687 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.212592 kubelet[2778]: E0313 00:37:55.212547 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.212592 kubelet[2778]: W0313 00:37:55.212559 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.212592 kubelet[2778]: E0313 00:37:55.212568 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:37:55.213383 kubelet[2778]: E0313 00:37:55.213354 2778 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:37:55.213421 kubelet[2778]: W0313 00:37:55.213382 2778 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:37:55.213450 kubelet[2778]: E0313 00:37:55.213422 2778 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:37:55.572267 containerd[1633]: time="2026-03-13T00:37:55.572214683Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:55.573323 containerd[1633]: time="2026-03-13T00:37:55.573151822Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 13 00:37:55.574301 containerd[1633]: time="2026-03-13T00:37:55.574280712Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:55.576646 containerd[1633]: time="2026-03-13T00:37:55.576625414Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:55.577042 containerd[1633]: time="2026-03-13T00:37:55.577024917Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.552610869s" Mar 13 00:37:55.577088 containerd[1633]: time="2026-03-13T00:37:55.577079467Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 13 00:37:55.581386 containerd[1633]: time="2026-03-13T00:37:55.581353956Z" level=info msg="CreateContainer within sandbox \"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 13 00:37:55.592513 containerd[1633]: time="2026-03-13T00:37:55.591842270Z" level=info msg="Container 15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:55.607497 containerd[1633]: time="2026-03-13T00:37:55.607447371Z" level=info msg="CreateContainer within sandbox \"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e\"" Mar 13 00:37:55.607867 containerd[1633]: time="2026-03-13T00:37:55.607840906Z" level=info msg="StartContainer for \"15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e\"" Mar 13 00:37:55.609300 containerd[1633]: time="2026-03-13T00:37:55.609279749Z" level=info msg="connecting to shim 15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e" address="unix:///run/containerd/s/9d31bade4de5d081f36fa858e7b9702abbaa6cd01d83d37123ed3af2ae152355" protocol=ttrpc version=3 Mar 13 00:37:55.626568 systemd[1]: Started cri-containerd-15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e.scope - libcontainer container 
15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e. Mar 13 00:37:55.696967 containerd[1633]: time="2026-03-13T00:37:55.696921151Z" level=info msg="StartContainer for \"15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e\" returns successfully" Mar 13 00:37:55.708136 systemd[1]: cri-containerd-15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e.scope: Deactivated successfully. Mar 13 00:37:55.710117 containerd[1633]: time="2026-03-13T00:37:55.710087450Z" level=info msg="received container exit event container_id:\"15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e\" id:\"15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e\" pid:3509 exited_at:{seconds:1773362275 nanos:709802837}" Mar 13 00:37:55.728183 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-15ff5ef068ff61116f6b724be9dd9d6237214db173e59d37b657c7f2a4cb5a5e-rootfs.mount: Deactivated successfully. Mar 13 00:37:56.069872 kubelet[2778]: E0313 00:37:56.069752 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jxzz" podUID="61d0fe9a-7940-4074-923f-15bb8359aff2" Mar 13 00:37:56.157116 containerd[1633]: time="2026-03-13T00:37:56.157072650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 13 00:37:56.181891 kubelet[2778]: I0313 00:37:56.181781 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-784cdff998-fk6dw" podStartSLOduration=3.06198581 podStartE2EDuration="6.181761731s" podCreationTimestamp="2026-03-13 00:37:50 +0000 UTC" firstStartedPulling="2026-03-13 00:37:50.903353444 +0000 UTC m=+15.916308779" lastFinishedPulling="2026-03-13 00:37:54.023129365 +0000 UTC m=+19.036084700" observedRunningTime="2026-03-13 00:37:54.163302689 +0000 UTC 
m=+19.176258024" watchObservedRunningTime="2026-03-13 00:37:56.181761731 +0000 UTC m=+21.194717096" Mar 13 00:37:58.068903 kubelet[2778]: E0313 00:37:58.068837 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jxzz" podUID="61d0fe9a-7940-4074-923f-15bb8359aff2" Mar 13 00:37:59.505685 kubelet[2778]: I0313 00:37:59.505653 2778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:38:00.009618 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3106102106.mount: Deactivated successfully. Mar 13 00:38:00.039867 containerd[1633]: time="2026-03-13T00:38:00.039798733Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:00.041066 containerd[1633]: time="2026-03-13T00:38:00.041038421Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 13 00:38:00.041817 containerd[1633]: time="2026-03-13T00:38:00.041763576Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:00.043432 containerd[1633]: time="2026-03-13T00:38:00.043401917Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:00.044073 containerd[1633]: time="2026-03-13T00:38:00.043717409Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 3.886615749s" Mar 13 00:38:00.044073 containerd[1633]: time="2026-03-13T00:38:00.043739589Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 13 00:38:00.047456 containerd[1633]: time="2026-03-13T00:38:00.047420514Z" level=info msg="CreateContainer within sandbox \"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 13 00:38:00.058932 containerd[1633]: time="2026-03-13T00:38:00.056189303Z" level=info msg="Container 020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:00.067111 containerd[1633]: time="2026-03-13T00:38:00.067079237Z" level=info msg="CreateContainer within sandbox \"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423\"" Mar 13 00:38:00.068035 containerd[1633]: time="2026-03-13T00:38:00.068014694Z" level=info msg="StartContainer for \"020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423\"" Mar 13 00:38:00.069232 kubelet[2778]: E0313 00:38:00.069187 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jxzz" podUID="61d0fe9a-7940-4074-923f-15bb8359aff2" Mar 13 00:38:00.069355 containerd[1633]: time="2026-03-13T00:38:00.069284562Z" level=info msg="connecting to shim 020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423" 
address="unix:///run/containerd/s/9d31bade4de5d081f36fa858e7b9702abbaa6cd01d83d37123ed3af2ae152355" protocol=ttrpc version=3 Mar 13 00:38:00.089605 systemd[1]: Started cri-containerd-020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423.scope - libcontainer container 020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423. Mar 13 00:38:00.146659 containerd[1633]: time="2026-03-13T00:38:00.146624617Z" level=info msg="StartContainer for \"020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423\" returns successfully" Mar 13 00:38:00.187466 systemd[1]: cri-containerd-020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423.scope: Deactivated successfully. Mar 13 00:38:00.189335 containerd[1633]: time="2026-03-13T00:38:00.189303065Z" level=info msg="received container exit event container_id:\"020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423\" id:\"020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423\" pid:3569 exited_at:{seconds:1773362280 nanos:189141624}" Mar 13 00:38:01.011371 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-020accab899b31235ba5437c5c4fcf7b17c2726fb6015b973fee5ad16ea72423-rootfs.mount: Deactivated successfully. 
Mar 13 00:38:01.176449 containerd[1633]: time="2026-03-13T00:38:01.176372169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 13 00:38:02.069758 kubelet[2778]: E0313 00:38:02.069609 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jxzz" podUID="61d0fe9a-7940-4074-923f-15bb8359aff2" Mar 13 00:38:03.595377 containerd[1633]: time="2026-03-13T00:38:03.595158222Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:03.596063 containerd[1633]: time="2026-03-13T00:38:03.596037486Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 13 00:38:03.596799 containerd[1633]: time="2026-03-13T00:38:03.596763721Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:03.598307 containerd[1633]: time="2026-03-13T00:38:03.598270699Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:03.598631 containerd[1633]: time="2026-03-13T00:38:03.598610251Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 2.422183591s" Mar 13 00:38:03.598678 containerd[1633]: time="2026-03-13T00:38:03.598633931Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 13 00:38:03.602006 containerd[1633]: time="2026-03-13T00:38:03.601489588Z" level=info msg="CreateContainer within sandbox \"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 13 00:38:03.607615 containerd[1633]: time="2026-03-13T00:38:03.607593623Z" level=info msg="Container 24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:03.616570 containerd[1633]: time="2026-03-13T00:38:03.616544393Z" level=info msg="CreateContainer within sandbox \"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a\"" Mar 13 00:38:03.617062 containerd[1633]: time="2026-03-13T00:38:03.616958116Z" level=info msg="StartContainer for \"24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a\"" Mar 13 00:38:03.618546 containerd[1633]: time="2026-03-13T00:38:03.618524775Z" level=info msg="connecting to shim 24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a" address="unix:///run/containerd/s/9d31bade4de5d081f36fa858e7b9702abbaa6cd01d83d37123ed3af2ae152355" protocol=ttrpc version=3 Mar 13 00:38:03.655594 systemd[1]: Started cri-containerd-24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a.scope - libcontainer container 24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a. 
Mar 13 00:38:03.727636 containerd[1633]: time="2026-03-13T00:38:03.727605600Z" level=info msg="StartContainer for \"24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a\" returns successfully" Mar 13 00:38:04.069783 kubelet[2778]: E0313 00:38:04.069516 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jxzz" podUID="61d0fe9a-7940-4074-923f-15bb8359aff2" Mar 13 00:38:04.136344 containerd[1633]: time="2026-03-13T00:38:04.136300140Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 13 00:38:04.138865 systemd[1]: cri-containerd-24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a.scope: Deactivated successfully. Mar 13 00:38:04.139336 systemd[1]: cri-containerd-24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a.scope: Consumed 373ms CPU time, 193.6M memory peak, 1.7M read from disk, 177M written to disk. Mar 13 00:38:04.140249 containerd[1633]: time="2026-03-13T00:38:04.140080401Z" level=info msg="received container exit event container_id:\"24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a\" id:\"24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a\" pid:3626 exited_at:{seconds:1773362284 nanos:139633619}" Mar 13 00:38:04.160582 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-24b14ad3301ef3d16709e89337c8ab4ba7cd799f7a746ffbd0c266fe001e5c0a-rootfs.mount: Deactivated successfully. 
Mar 13 00:38:04.216025 kubelet[2778]: I0313 00:38:04.215978 2778 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 13 00:38:04.253237 systemd[1]: Created slice kubepods-burstable-pod8d70918b_f6c4_496d_b3bf_681bf2962192.slice - libcontainer container kubepods-burstable-pod8d70918b_f6c4_496d_b3bf_681bf2962192.slice. Mar 13 00:38:04.265435 systemd[1]: Created slice kubepods-burstable-pod5d90ef75_1dc2_4876_8272_eded34842bc1.slice - libcontainer container kubepods-burstable-pod5d90ef75_1dc2_4876_8272_eded34842bc1.slice. Mar 13 00:38:04.271315 kubelet[2778]: I0313 00:38:04.271249 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d70918b-f6c4-496d-b3bf-681bf2962192-config-volume\") pod \"coredns-674b8bbfcf-zqjwk\" (UID: \"8d70918b-f6c4-496d-b3bf-681bf2962192\") " pod="kube-system/coredns-674b8bbfcf-zqjwk" Mar 13 00:38:04.271836 kubelet[2778]: I0313 00:38:04.271809 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d90ef75-1dc2-4876-8272-eded34842bc1-config-volume\") pod \"coredns-674b8bbfcf-nprbp\" (UID: \"5d90ef75-1dc2-4876-8272-eded34842bc1\") " pod="kube-system/coredns-674b8bbfcf-nprbp" Mar 13 00:38:04.271836 kubelet[2778]: I0313 00:38:04.271834 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/20f68c7b-0d72-4da1-897b-e7098297c805-calico-apiserver-certs\") pod \"calico-apiserver-59cfd78dd5-nddx7\" (UID: \"20f68c7b-0d72-4da1-897b-e7098297c805\") " pod="calico-system/calico-apiserver-59cfd78dd5-nddx7" Mar 13 00:38:04.271906 kubelet[2778]: I0313 00:38:04.271850 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2k65\" (UniqueName: 
\"kubernetes.io/projected/ae5a2a4e-998d-40a0-a028-10322687d711-kube-api-access-v2k65\") pod \"whisker-cf6ff987d-26f8v\" (UID: \"ae5a2a4e-998d-40a0-a028-10322687d711\") " pod="calico-system/whisker-cf6ff987d-26f8v" Mar 13 00:38:04.271906 kubelet[2778]: I0313 00:38:04.271869 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5761654e-dacf-4b91-a01a-1840500b12d9-tigera-ca-bundle\") pod \"calico-kube-controllers-7948d5d9db-xhktz\" (UID: \"5761654e-dacf-4b91-a01a-1840500b12d9\") " pod="calico-system/calico-kube-controllers-7948d5d9db-xhktz" Mar 13 00:38:04.271906 kubelet[2778]: I0313 00:38:04.271880 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5dq7\" (UniqueName: \"kubernetes.io/projected/20f68c7b-0d72-4da1-897b-e7098297c805-kube-api-access-r5dq7\") pod \"calico-apiserver-59cfd78dd5-nddx7\" (UID: \"20f68c7b-0d72-4da1-897b-e7098297c805\") " pod="calico-system/calico-apiserver-59cfd78dd5-nddx7" Mar 13 00:38:04.271906 kubelet[2778]: I0313 00:38:04.271896 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/46bd0b0e-292a-4246-85e0-9eb19df0f28d-calico-apiserver-certs\") pod \"calico-apiserver-59cfd78dd5-q9vph\" (UID: \"46bd0b0e-292a-4246-85e0-9eb19df0f28d\") " pod="calico-system/calico-apiserver-59cfd78dd5-q9vph" Mar 13 00:38:04.271980 kubelet[2778]: I0313 00:38:04.271908 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbckc\" (UniqueName: \"kubernetes.io/projected/46bd0b0e-292a-4246-85e0-9eb19df0f28d-kube-api-access-wbckc\") pod \"calico-apiserver-59cfd78dd5-q9vph\" (UID: \"46bd0b0e-292a-4246-85e0-9eb19df0f28d\") " pod="calico-system/calico-apiserver-59cfd78dd5-q9vph" Mar 13 00:38:04.271980 kubelet[2778]: I0313 
00:38:04.271933 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxk6g\" (UniqueName: \"kubernetes.io/projected/5d90ef75-1dc2-4876-8272-eded34842bc1-kube-api-access-sxk6g\") pod \"coredns-674b8bbfcf-nprbp\" (UID: \"5d90ef75-1dc2-4876-8272-eded34842bc1\") " pod="kube-system/coredns-674b8bbfcf-nprbp" Mar 13 00:38:04.271980 kubelet[2778]: I0313 00:38:04.271944 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ae5a2a4e-998d-40a0-a028-10322687d711-whisker-backend-key-pair\") pod \"whisker-cf6ff987d-26f8v\" (UID: \"ae5a2a4e-998d-40a0-a028-10322687d711\") " pod="calico-system/whisker-cf6ff987d-26f8v" Mar 13 00:38:04.271980 kubelet[2778]: I0313 00:38:04.271959 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnthk\" (UniqueName: \"kubernetes.io/projected/8d70918b-f6c4-496d-b3bf-681bf2962192-kube-api-access-pnthk\") pod \"coredns-674b8bbfcf-zqjwk\" (UID: \"8d70918b-f6c4-496d-b3bf-681bf2962192\") " pod="kube-system/coredns-674b8bbfcf-zqjwk" Mar 13 00:38:04.271980 kubelet[2778]: I0313 00:38:04.271971 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ae5a2a4e-998d-40a0-a028-10322687d711-nginx-config\") pod \"whisker-cf6ff987d-26f8v\" (UID: \"ae5a2a4e-998d-40a0-a028-10322687d711\") " pod="calico-system/whisker-cf6ff987d-26f8v" Mar 13 00:38:04.272065 kubelet[2778]: I0313 00:38:04.271986 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tv5j\" (UniqueName: \"kubernetes.io/projected/5761654e-dacf-4b91-a01a-1840500b12d9-kube-api-access-6tv5j\") pod \"calico-kube-controllers-7948d5d9db-xhktz\" (UID: \"5761654e-dacf-4b91-a01a-1840500b12d9\") " 
pod="calico-system/calico-kube-controllers-7948d5d9db-xhktz" Mar 13 00:38:04.272065 kubelet[2778]: I0313 00:38:04.272001 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae5a2a4e-998d-40a0-a028-10322687d711-whisker-ca-bundle\") pod \"whisker-cf6ff987d-26f8v\" (UID: \"ae5a2a4e-998d-40a0-a028-10322687d711\") " pod="calico-system/whisker-cf6ff987d-26f8v" Mar 13 00:38:04.279394 systemd[1]: Created slice kubepods-besteffort-podae5a2a4e_998d_40a0_a028_10322687d711.slice - libcontainer container kubepods-besteffort-podae5a2a4e_998d_40a0_a028_10322687d711.slice. Mar 13 00:38:04.286557 systemd[1]: Created slice kubepods-besteffort-pod46bd0b0e_292a_4246_85e0_9eb19df0f28d.slice - libcontainer container kubepods-besteffort-pod46bd0b0e_292a_4246_85e0_9eb19df0f28d.slice. Mar 13 00:38:04.292828 systemd[1]: Created slice kubepods-besteffort-pod20f68c7b_0d72_4da1_897b_e7098297c805.slice - libcontainer container kubepods-besteffort-pod20f68c7b_0d72_4da1_897b_e7098297c805.slice. Mar 13 00:38:04.298447 systemd[1]: Created slice kubepods-besteffort-poda6077e04_8bca_4d2d_8663_d83dccf74534.slice - libcontainer container kubepods-besteffort-poda6077e04_8bca_4d2d_8663_d83dccf74534.slice. Mar 13 00:38:04.305012 systemd[1]: Created slice kubepods-besteffort-pod5761654e_dacf_4b91_a01a_1840500b12d9.slice - libcontainer container kubepods-besteffort-pod5761654e_dacf_4b91_a01a_1840500b12d9.slice. 
Mar 13 00:38:04.374596 kubelet[2778]: I0313 00:38:04.372495 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/a6077e04-8bca-4d2d-8663-d83dccf74534-goldmane-key-pair\") pod \"goldmane-5b85766d88-67z6c\" (UID: \"a6077e04-8bca-4d2d-8663-d83dccf74534\") " pod="calico-system/goldmane-5b85766d88-67z6c" Mar 13 00:38:04.374596 kubelet[2778]: I0313 00:38:04.372602 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6077e04-8bca-4d2d-8663-d83dccf74534-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-67z6c\" (UID: \"a6077e04-8bca-4d2d-8663-d83dccf74534\") " pod="calico-system/goldmane-5b85766d88-67z6c" Mar 13 00:38:04.374596 kubelet[2778]: I0313 00:38:04.372671 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6077e04-8bca-4d2d-8663-d83dccf74534-config\") pod \"goldmane-5b85766d88-67z6c\" (UID: \"a6077e04-8bca-4d2d-8663-d83dccf74534\") " pod="calico-system/goldmane-5b85766d88-67z6c" Mar 13 00:38:04.374596 kubelet[2778]: I0313 00:38:04.372693 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-792mn\" (UniqueName: \"kubernetes.io/projected/a6077e04-8bca-4d2d-8663-d83dccf74534-kube-api-access-792mn\") pod \"goldmane-5b85766d88-67z6c\" (UID: \"a6077e04-8bca-4d2d-8663-d83dccf74534\") " pod="calico-system/goldmane-5b85766d88-67z6c" Mar 13 00:38:04.564654 containerd[1633]: time="2026-03-13T00:38:04.564598593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zqjwk,Uid:8d70918b-f6c4-496d-b3bf-681bf2962192,Namespace:kube-system,Attempt:0,}" Mar 13 00:38:04.575341 containerd[1633]: time="2026-03-13T00:38:04.575295641Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-nprbp,Uid:5d90ef75-1dc2-4876-8272-eded34842bc1,Namespace:kube-system,Attempt:0,}" Mar 13 00:38:04.583315 containerd[1633]: time="2026-03-13T00:38:04.583161465Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-cf6ff987d-26f8v,Uid:ae5a2a4e-998d-40a0-a028-10322687d711,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:04.590190 containerd[1633]: time="2026-03-13T00:38:04.590157132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59cfd78dd5-q9vph,Uid:46bd0b0e-292a-4246-85e0-9eb19df0f28d,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:04.598397 containerd[1633]: time="2026-03-13T00:38:04.598360876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59cfd78dd5-nddx7,Uid:20f68c7b-0d72-4da1-897b-e7098297c805,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:04.602852 containerd[1633]: time="2026-03-13T00:38:04.602623820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-67z6c,Uid:a6077e04-8bca-4d2d-8663-d83dccf74534,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:04.611797 containerd[1633]: time="2026-03-13T00:38:04.610104060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7948d5d9db-xhktz,Uid:5761654e-dacf-4b91-a01a-1840500b12d9,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:04.764662 containerd[1633]: time="2026-03-13T00:38:04.764614439Z" level=error msg="Failed to destroy network for sandbox \"13a65dd69837fc48aa3ce47175976d3ad1d4591da9fe736087b7bcf4e70e103b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.767383 systemd[1]: run-netns-cni\x2d9560e83c\x2de166\x2d3041\x2d9673\x2d1489eff29314.mount: Deactivated successfully. 
Mar 13 00:38:04.770328 containerd[1633]: time="2026-03-13T00:38:04.770306820Z" level=error msg="Failed to destroy network for sandbox \"4a84126e9d800546fb3103a46946ca8dd84e02c7656d2401207f43c2f91d2d4b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.771517 containerd[1633]: time="2026-03-13T00:38:04.770715452Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-nprbp,Uid:5d90ef75-1dc2-4876-8272-eded34842bc1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"13a65dd69837fc48aa3ce47175976d3ad1d4591da9fe736087b7bcf4e70e103b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.772788 systemd[1]: run-netns-cni\x2d49cb4afd\x2d7571\x2d52b1\x2dfb7f\x2d81f10d0779a0.mount: Deactivated successfully. 
Mar 13 00:38:04.773259 containerd[1633]: time="2026-03-13T00:38:04.773230495Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59cfd78dd5-q9vph,Uid:46bd0b0e-292a-4246-85e0-9eb19df0f28d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a84126e9d800546fb3103a46946ca8dd84e02c7656d2401207f43c2f91d2d4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.773918 kubelet[2778]: E0313 00:38:04.773710 2778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a84126e9d800546fb3103a46946ca8dd84e02c7656d2401207f43c2f91d2d4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.773918 kubelet[2778]: E0313 00:38:04.773774 2778 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a84126e9d800546fb3103a46946ca8dd84e02c7656d2401207f43c2f91d2d4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-59cfd78dd5-q9vph" Mar 13 00:38:04.773918 kubelet[2778]: E0313 00:38:04.773791 2778 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a84126e9d800546fb3103a46946ca8dd84e02c7656d2401207f43c2f91d2d4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-apiserver-59cfd78dd5-q9vph" Mar 13 00:38:04.774041 kubelet[2778]: E0313 00:38:04.773846 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59cfd78dd5-q9vph_calico-system(46bd0b0e-292a-4246-85e0-9eb19df0f28d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59cfd78dd5-q9vph_calico-system(46bd0b0e-292a-4246-85e0-9eb19df0f28d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a84126e9d800546fb3103a46946ca8dd84e02c7656d2401207f43c2f91d2d4b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-59cfd78dd5-q9vph" podUID="46bd0b0e-292a-4246-85e0-9eb19df0f28d" Mar 13 00:38:04.776328 kubelet[2778]: E0313 00:38:04.775755 2778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13a65dd69837fc48aa3ce47175976d3ad1d4591da9fe736087b7bcf4e70e103b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.776328 kubelet[2778]: E0313 00:38:04.775786 2778 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13a65dd69837fc48aa3ce47175976d3ad1d4591da9fe736087b7bcf4e70e103b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-nprbp" Mar 13 00:38:04.776328 kubelet[2778]: E0313 00:38:04.775798 2778 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"13a65dd69837fc48aa3ce47175976d3ad1d4591da9fe736087b7bcf4e70e103b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-nprbp" Mar 13 00:38:04.776423 kubelet[2778]: E0313 00:38:04.775827 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-nprbp_kube-system(5d90ef75-1dc2-4876-8272-eded34842bc1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-nprbp_kube-system(5d90ef75-1dc2-4876-8272-eded34842bc1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"13a65dd69837fc48aa3ce47175976d3ad1d4591da9fe736087b7bcf4e70e103b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-nprbp" podUID="5d90ef75-1dc2-4876-8272-eded34842bc1" Mar 13 00:38:04.776464 containerd[1633]: time="2026-03-13T00:38:04.776406453Z" level=error msg="Failed to destroy network for sandbox \"c75de5907ce40cd0cb0bf603fa339b345f2c872c9d656f85fc5287161bcf5de6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.778119 containerd[1633]: time="2026-03-13T00:38:04.778094351Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-cf6ff987d-26f8v,Uid:ae5a2a4e-998d-40a0-a028-10322687d711,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c75de5907ce40cd0cb0bf603fa339b345f2c872c9d656f85fc5287161bcf5de6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.778407 kubelet[2778]: E0313 00:38:04.778391 2778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c75de5907ce40cd0cb0bf603fa339b345f2c872c9d656f85fc5287161bcf5de6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.778504 kubelet[2778]: E0313 00:38:04.778457 2778 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c75de5907ce40cd0cb0bf603fa339b345f2c872c9d656f85fc5287161bcf5de6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-cf6ff987d-26f8v" Mar 13 00:38:04.778557 kubelet[2778]: E0313 00:38:04.778543 2778 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c75de5907ce40cd0cb0bf603fa339b345f2c872c9d656f85fc5287161bcf5de6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-cf6ff987d-26f8v" Mar 13 00:38:04.778621 kubelet[2778]: E0313 00:38:04.778607 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-cf6ff987d-26f8v_calico-system(ae5a2a4e-998d-40a0-a028-10322687d711)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-cf6ff987d-26f8v_calico-system(ae5a2a4e-998d-40a0-a028-10322687d711)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"c75de5907ce40cd0cb0bf603fa339b345f2c872c9d656f85fc5287161bcf5de6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-cf6ff987d-26f8v" podUID="ae5a2a4e-998d-40a0-a028-10322687d711" Mar 13 00:38:04.783154 containerd[1633]: time="2026-03-13T00:38:04.783130209Z" level=error msg="Failed to destroy network for sandbox \"fed5faa5b3bf10401442e99b0413d6b81c7ef0a29777676a26f34190c140ecf8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.784194 containerd[1633]: time="2026-03-13T00:38:04.784175584Z" level=error msg="Failed to destroy network for sandbox \"8facfefe33651c49e225f16cd409546808ca88d049c6f78a53a45379624eb702\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.784657 containerd[1633]: time="2026-03-13T00:38:04.784634897Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zqjwk,Uid:8d70918b-f6c4-496d-b3bf-681bf2962192,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fed5faa5b3bf10401442e99b0413d6b81c7ef0a29777676a26f34190c140ecf8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.785181 kubelet[2778]: E0313 00:38:04.785135 2778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fed5faa5b3bf10401442e99b0413d6b81c7ef0a29777676a26f34190c140ecf8\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.785229 kubelet[2778]: E0313 00:38:04.785184 2778 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fed5faa5b3bf10401442e99b0413d6b81c7ef0a29777676a26f34190c140ecf8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zqjwk" Mar 13 00:38:04.785229 kubelet[2778]: E0313 00:38:04.785196 2778 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fed5faa5b3bf10401442e99b0413d6b81c7ef0a29777676a26f34190c140ecf8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zqjwk" Mar 13 00:38:04.785282 kubelet[2778]: E0313 00:38:04.785260 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-zqjwk_kube-system(8d70918b-f6c4-496d-b3bf-681bf2962192)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-zqjwk_kube-system(8d70918b-f6c4-496d-b3bf-681bf2962192)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fed5faa5b3bf10401442e99b0413d6b81c7ef0a29777676a26f34190c140ecf8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-zqjwk" podUID="8d70918b-f6c4-496d-b3bf-681bf2962192" Mar 13 00:38:04.786075 containerd[1633]: 
time="2026-03-13T00:38:04.786054925Z" level=error msg="Failed to destroy network for sandbox \"c3973ac0d1d8ae5abaa4884f60515ec31bb8c149dc71b633d2ad3967c5be30f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.786457 containerd[1633]: time="2026-03-13T00:38:04.786433897Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7948d5d9db-xhktz,Uid:5761654e-dacf-4b91-a01a-1840500b12d9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8facfefe33651c49e225f16cd409546808ca88d049c6f78a53a45379624eb702\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.787097 kubelet[2778]: E0313 00:38:04.786715 2778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8facfefe33651c49e225f16cd409546808ca88d049c6f78a53a45379624eb702\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.787097 kubelet[2778]: E0313 00:38:04.786736 2778 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8facfefe33651c49e225f16cd409546808ca88d049c6f78a53a45379624eb702\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7948d5d9db-xhktz" Mar 13 00:38:04.787097 kubelet[2778]: E0313 00:38:04.786780 2778 kuberuntime_manager.go:1252] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8facfefe33651c49e225f16cd409546808ca88d049c6f78a53a45379624eb702\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7948d5d9db-xhktz" Mar 13 00:38:04.787164 kubelet[2778]: E0313 00:38:04.786811 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7948d5d9db-xhktz_calico-system(5761654e-dacf-4b91-a01a-1840500b12d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7948d5d9db-xhktz_calico-system(5761654e-dacf-4b91-a01a-1840500b12d9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8facfefe33651c49e225f16cd409546808ca88d049c6f78a53a45379624eb702\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7948d5d9db-xhktz" podUID="5761654e-dacf-4b91-a01a-1840500b12d9" Mar 13 00:38:04.788015 containerd[1633]: time="2026-03-13T00:38:04.787854844Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-67z6c,Uid:a6077e04-8bca-4d2d-8663-d83dccf74534,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3973ac0d1d8ae5abaa4884f60515ec31bb8c149dc71b633d2ad3967c5be30f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.788079 kubelet[2778]: E0313 00:38:04.787941 2778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"c3973ac0d1d8ae5abaa4884f60515ec31bb8c149dc71b633d2ad3967c5be30f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.788079 kubelet[2778]: E0313 00:38:04.787962 2778 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3973ac0d1d8ae5abaa4884f60515ec31bb8c149dc71b633d2ad3967c5be30f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-67z6c" Mar 13 00:38:04.788079 kubelet[2778]: E0313 00:38:04.787972 2778 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3973ac0d1d8ae5abaa4884f60515ec31bb8c149dc71b633d2ad3967c5be30f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-67z6c" Mar 13 00:38:04.788137 kubelet[2778]: E0313 00:38:04.787994 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-67z6c_calico-system(a6077e04-8bca-4d2d-8663-d83dccf74534)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5b85766d88-67z6c_calico-system(a6077e04-8bca-4d2d-8663-d83dccf74534)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c3973ac0d1d8ae5abaa4884f60515ec31bb8c149dc71b633d2ad3967c5be30f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/goldmane-5b85766d88-67z6c" podUID="a6077e04-8bca-4d2d-8663-d83dccf74534" Mar 13 00:38:04.788912 containerd[1633]: time="2026-03-13T00:38:04.788887000Z" level=error msg="Failed to destroy network for sandbox \"f20e1751cd3be16e1c67d4b28cfc14bea60a0765890eb4af8f62f54a5e4d4eae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.790095 containerd[1633]: time="2026-03-13T00:38:04.790071476Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59cfd78dd5-nddx7,Uid:20f68c7b-0d72-4da1-897b-e7098297c805,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f20e1751cd3be16e1c67d4b28cfc14bea60a0765890eb4af8f62f54a5e4d4eae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.790455 kubelet[2778]: E0313 00:38:04.790172 2778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f20e1751cd3be16e1c67d4b28cfc14bea60a0765890eb4af8f62f54a5e4d4eae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:38:04.790455 kubelet[2778]: E0313 00:38:04.790190 2778 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f20e1751cd3be16e1c67d4b28cfc14bea60a0765890eb4af8f62f54a5e4d4eae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-apiserver-59cfd78dd5-nddx7" Mar 13 00:38:04.790455 kubelet[2778]: E0313 00:38:04.790202 2778 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f20e1751cd3be16e1c67d4b28cfc14bea60a0765890eb4af8f62f54a5e4d4eae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-59cfd78dd5-nddx7" Mar 13 00:38:04.790640 kubelet[2778]: E0313 00:38:04.790241 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59cfd78dd5-nddx7_calico-system(20f68c7b-0d72-4da1-897b-e7098297c805)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59cfd78dd5-nddx7_calico-system(20f68c7b-0d72-4da1-897b-e7098297c805)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f20e1751cd3be16e1c67d4b28cfc14bea60a0765890eb4af8f62f54a5e4d4eae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-59cfd78dd5-nddx7" podUID="20f68c7b-0d72-4da1-897b-e7098297c805" Mar 13 00:38:05.210460 containerd[1633]: time="2026-03-13T00:38:05.210308836Z" level=info msg="CreateContainer within sandbox \"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 13 00:38:05.228213 containerd[1633]: time="2026-03-13T00:38:05.228172948Z" level=info msg="Container c057bc25a1e5c9f01447e2709305b43140eace5f116b7e611f4c6f82c6678b62: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:05.234835 containerd[1633]: time="2026-03-13T00:38:05.234807591Z" level=info msg="CreateContainer within sandbox 
\"5dedde40c500f6f10ed2b34550b9e94ea1eaaf0cd546c5f2e8a0353736850260\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c057bc25a1e5c9f01447e2709305b43140eace5f116b7e611f4c6f82c6678b62\"" Mar 13 00:38:05.236347 containerd[1633]: time="2026-03-13T00:38:05.235337244Z" level=info msg="StartContainer for \"c057bc25a1e5c9f01447e2709305b43140eace5f116b7e611f4c6f82c6678b62\"" Mar 13 00:38:05.236504 containerd[1633]: time="2026-03-13T00:38:05.236452320Z" level=info msg="connecting to shim c057bc25a1e5c9f01447e2709305b43140eace5f116b7e611f4c6f82c6678b62" address="unix:///run/containerd/s/9d31bade4de5d081f36fa858e7b9702abbaa6cd01d83d37123ed3af2ae152355" protocol=ttrpc version=3 Mar 13 00:38:05.254602 systemd[1]: Started cri-containerd-c057bc25a1e5c9f01447e2709305b43140eace5f116b7e611f4c6f82c6678b62.scope - libcontainer container c057bc25a1e5c9f01447e2709305b43140eace5f116b7e611f4c6f82c6678b62. Mar 13 00:38:05.321171 containerd[1633]: time="2026-03-13T00:38:05.321129905Z" level=info msg="StartContainer for \"c057bc25a1e5c9f01447e2709305b43140eace5f116b7e611f4c6f82c6678b62\" returns successfully" Mar 13 00:38:05.479894 kubelet[2778]: I0313 00:38:05.479782 2778 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2k65\" (UniqueName: \"kubernetes.io/projected/ae5a2a4e-998d-40a0-a028-10322687d711-kube-api-access-v2k65\") pod \"ae5a2a4e-998d-40a0-a028-10322687d711\" (UID: \"ae5a2a4e-998d-40a0-a028-10322687d711\") " Mar 13 00:38:05.479894 kubelet[2778]: I0313 00:38:05.479838 2778 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae5a2a4e-998d-40a0-a028-10322687d711-whisker-ca-bundle\") pod \"ae5a2a4e-998d-40a0-a028-10322687d711\" (UID: \"ae5a2a4e-998d-40a0-a028-10322687d711\") " Mar 13 00:38:05.479894 kubelet[2778]: I0313 00:38:05.479869 2778 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ae5a2a4e-998d-40a0-a028-10322687d711-whisker-backend-key-pair\") pod \"ae5a2a4e-998d-40a0-a028-10322687d711\" (UID: \"ae5a2a4e-998d-40a0-a028-10322687d711\") " Mar 13 00:38:05.480857 kubelet[2778]: I0313 00:38:05.479997 2778 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ae5a2a4e-998d-40a0-a028-10322687d711-nginx-config\") pod \"ae5a2a4e-998d-40a0-a028-10322687d711\" (UID: \"ae5a2a4e-998d-40a0-a028-10322687d711\") " Mar 13 00:38:05.480857 kubelet[2778]: I0313 00:38:05.480526 2778 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5a2a4e-998d-40a0-a028-10322687d711-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ae5a2a4e-998d-40a0-a028-10322687d711" (UID: "ae5a2a4e-998d-40a0-a028-10322687d711"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 13 00:38:05.480992 kubelet[2778]: I0313 00:38:05.480951 2778 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5a2a4e-998d-40a0-a028-10322687d711-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "ae5a2a4e-998d-40a0-a028-10322687d711" (UID: "ae5a2a4e-998d-40a0-a028-10322687d711"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 13 00:38:05.485231 kubelet[2778]: I0313 00:38:05.485173 2778 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5a2a4e-998d-40a0-a028-10322687d711-kube-api-access-v2k65" (OuterVolumeSpecName: "kube-api-access-v2k65") pod "ae5a2a4e-998d-40a0-a028-10322687d711" (UID: "ae5a2a4e-998d-40a0-a028-10322687d711"). InnerVolumeSpecName "kube-api-access-v2k65". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 13 00:38:05.485960 kubelet[2778]: I0313 00:38:05.485931 2778 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5a2a4e-998d-40a0-a028-10322687d711-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ae5a2a4e-998d-40a0-a028-10322687d711" (UID: "ae5a2a4e-998d-40a0-a028-10322687d711"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 13 00:38:05.581279 kubelet[2778]: I0313 00:38:05.581237 2778 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ae5a2a4e-998d-40a0-a028-10322687d711-whisker-backend-key-pair\") on node \"ci-4459-2-4-n-86976195a3\" DevicePath \"\"" Mar 13 00:38:05.581279 kubelet[2778]: I0313 00:38:05.581277 2778 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ae5a2a4e-998d-40a0-a028-10322687d711-nginx-config\") on node \"ci-4459-2-4-n-86976195a3\" DevicePath \"\"" Mar 13 00:38:05.581279 kubelet[2778]: I0313 00:38:05.581284 2778 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v2k65\" (UniqueName: \"kubernetes.io/projected/ae5a2a4e-998d-40a0-a028-10322687d711-kube-api-access-v2k65\") on node \"ci-4459-2-4-n-86976195a3\" DevicePath \"\"" Mar 13 00:38:05.581432 kubelet[2778]: I0313 00:38:05.581291 2778 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae5a2a4e-998d-40a0-a028-10322687d711-whisker-ca-bundle\") on node \"ci-4459-2-4-n-86976195a3\" DevicePath \"\"" Mar 13 00:38:05.611198 systemd[1]: run-netns-cni\x2d13e94739\x2deecc\x2d80c5\x2dc5c8\x2d505c9903a91d.mount: Deactivated successfully. Mar 13 00:38:05.611428 systemd[1]: run-netns-cni\x2d6f09b35f\x2d6c69\x2d29fb\x2d4d45\x2de906c4cc30e4.mount: Deactivated successfully. 
Mar 13 00:38:05.611562 systemd[1]: run-netns-cni\x2d5ac85de6\x2d1aad\x2d8186\x2dc85c\x2d087de82f5f13.mount: Deactivated successfully. Mar 13 00:38:05.611696 systemd[1]: run-netns-cni\x2d76cfe78e\x2deae2\x2d3308\x2d2af0\x2d74429d89d218.mount: Deactivated successfully. Mar 13 00:38:05.611830 systemd[1]: run-netns-cni\x2d9b11a6a9\x2dadff\x2d164f\x2d6a1d\x2d247ddaa673aa.mount: Deactivated successfully. Mar 13 00:38:05.611982 systemd[1]: var-lib-kubelet-pods-ae5a2a4e\x2d998d\x2d40a0\x2da028\x2d10322687d711-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dv2k65.mount: Deactivated successfully. Mar 13 00:38:05.612104 systemd[1]: var-lib-kubelet-pods-ae5a2a4e\x2d998d\x2d40a0\x2da028\x2d10322687d711-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 13 00:38:06.078503 systemd[1]: Created slice kubepods-besteffort-pod61d0fe9a_7940_4074_923f_15bb8359aff2.slice - libcontainer container kubepods-besteffort-pod61d0fe9a_7940_4074_923f_15bb8359aff2.slice. Mar 13 00:38:06.083511 containerd[1633]: time="2026-03-13T00:38:06.083428138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8jxzz,Uid:61d0fe9a-7940-4074-923f-15bb8359aff2,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:06.216602 systemd[1]: Removed slice kubepods-besteffort-podae5a2a4e_998d_40a0_a028_10322687d711.slice - libcontainer container kubepods-besteffort-podae5a2a4e_998d_40a0_a028_10322687d711.slice. 
Mar 13 00:38:06.223806 kubelet[2778]: I0313 00:38:06.223595 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-mxq97" podStartSLOduration=3.568311634 podStartE2EDuration="16.223582551s" podCreationTimestamp="2026-03-13 00:37:50 +0000 UTC" firstStartedPulling="2026-03-13 00:37:50.943956188 +0000 UTC m=+15.956911523" lastFinishedPulling="2026-03-13 00:38:03.599227105 +0000 UTC m=+28.612182440" observedRunningTime="2026-03-13 00:38:06.220176294 +0000 UTC m=+31.233131639" watchObservedRunningTime="2026-03-13 00:38:06.223582551 +0000 UTC m=+31.236537886" Mar 13 00:38:06.245417 systemd-networkd[1488]: cali58df81597c3: Link UP Mar 13 00:38:06.246660 systemd-networkd[1488]: cali58df81597c3: Gained carrier Mar 13 00:38:06.283736 containerd[1633]: 2026-03-13 00:38:06.133 [ERROR][3908] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 13 00:38:06.283736 containerd[1633]: 2026-03-13 00:38:06.155 [INFO][3908] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0 csi-node-driver- calico-system 61d0fe9a-7940-4074-923f-15bb8359aff2 683 0 2026-03-13 00:37:50 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-n-86976195a3 csi-node-driver-8jxzz eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali58df81597c3 [] [] }} ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Namespace="calico-system" Pod="csi-node-driver-8jxzz" 
WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-" Mar 13 00:38:06.283736 containerd[1633]: 2026-03-13 00:38:06.155 [INFO][3908] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Namespace="calico-system" Pod="csi-node-driver-8jxzz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" Mar 13 00:38:06.283736 containerd[1633]: 2026-03-13 00:38:06.180 [INFO][3919] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" HandleID="k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Workload="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.185 [INFO][3919] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" HandleID="k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Workload="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e9e80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-86976195a3", "pod":"csi-node-driver-8jxzz", "timestamp":"2026-03-13 00:38:06.180105949 +0000 UTC"}, Hostname:"ci-4459-2-4-n-86976195a3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003c0f20)} Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.185 [INFO][3919] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.185 [INFO][3919] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.185 [INFO][3919] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-86976195a3' Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.187 [INFO][3919] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.191 [INFO][3919] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.198 [INFO][3919] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.199 [INFO][3919] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.283990 containerd[1633]: 2026-03-13 00:38:06.201 [INFO][3919] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.284130 containerd[1633]: 2026-03-13 00:38:06.201 [INFO][3919] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.284130 containerd[1633]: 2026-03-13 00:38:06.203 [INFO][3919] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f Mar 13 00:38:06.284130 containerd[1633]: 2026-03-13 00:38:06.214 [INFO][3919] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.284130 containerd[1633]: 2026-03-13 00:38:06.226 [INFO][3919] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.119.65/26] block=192.168.119.64/26 handle="k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.284130 containerd[1633]: 2026-03-13 00:38:06.226 [INFO][3919] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.65/26] handle="k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.284130 containerd[1633]: 2026-03-13 00:38:06.226 [INFO][3919] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:38:06.284130 containerd[1633]: 2026-03-13 00:38:06.226 [INFO][3919] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.65/26] IPv6=[] ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" HandleID="k8s-pod-network.83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Workload="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" Mar 13 00:38:06.284233 containerd[1633]: 2026-03-13 00:38:06.234 [INFO][3908] cni-plugin/k8s.go 418: Populated endpoint ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Namespace="calico-system" Pod="csi-node-driver-8jxzz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"61d0fe9a-7940-4074-923f-15bb8359aff2", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"", Pod:"csi-node-driver-8jxzz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.119.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali58df81597c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:06.284281 containerd[1633]: 2026-03-13 00:38:06.234 [INFO][3908] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.65/32] ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Namespace="calico-system" Pod="csi-node-driver-8jxzz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" Mar 13 00:38:06.284281 containerd[1633]: 2026-03-13 00:38:06.234 [INFO][3908] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali58df81597c3 ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Namespace="calico-system" Pod="csi-node-driver-8jxzz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" Mar 13 00:38:06.284281 containerd[1633]: 2026-03-13 00:38:06.243 [INFO][3908] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Namespace="calico-system" Pod="csi-node-driver-8jxzz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" Mar 13 00:38:06.284329 
containerd[1633]: 2026-03-13 00:38:06.243 [INFO][3908] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Namespace="calico-system" Pod="csi-node-driver-8jxzz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"61d0fe9a-7940-4074-923f-15bb8359aff2", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f", Pod:"csi-node-driver-8jxzz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.119.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali58df81597c3", MAC:"d6:08:cf:5c:c5:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:06.284371 containerd[1633]: 
2026-03-13 00:38:06.270 [INFO][3908] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" Namespace="calico-system" Pod="csi-node-driver-8jxzz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-csi--node--driver--8jxzz-eth0" Mar 13 00:38:06.324462 systemd[1]: Created slice kubepods-besteffort-pod1ffb29a0_7c98_4b29_a634_89c2a6ac3af6.slice - libcontainer container kubepods-besteffort-pod1ffb29a0_7c98_4b29_a634_89c2a6ac3af6.slice. Mar 13 00:38:06.335374 containerd[1633]: time="2026-03-13T00:38:06.334643352Z" level=info msg="connecting to shim 83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f" address="unix:///run/containerd/s/89c7a61084bc50a76193f37428709a2195b21f4594e6c3611c2c03631978a2d2" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:38:06.361647 systemd[1]: Started cri-containerd-83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f.scope - libcontainer container 83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f. 
Mar 13 00:38:06.385090 containerd[1633]: time="2026-03-13T00:38:06.385063687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8jxzz,Uid:61d0fe9a-7940-4074-923f-15bb8359aff2,Namespace:calico-system,Attempt:0,} returns sandbox id \"83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f\"" Mar 13 00:38:06.386384 containerd[1633]: time="2026-03-13T00:38:06.386369554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 13 00:38:06.387814 kubelet[2778]: I0313 00:38:06.387779 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ffb29a0-7c98-4b29-a634-89c2a6ac3af6-whisker-ca-bundle\") pod \"whisker-7b8598b69f-qcdsb\" (UID: \"1ffb29a0-7c98-4b29-a634-89c2a6ac3af6\") " pod="calico-system/whisker-7b8598b69f-qcdsb" Mar 13 00:38:06.388163 kubelet[2778]: I0313 00:38:06.388093 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1ffb29a0-7c98-4b29-a634-89c2a6ac3af6-whisker-backend-key-pair\") pod \"whisker-7b8598b69f-qcdsb\" (UID: \"1ffb29a0-7c98-4b29-a634-89c2a6ac3af6\") " pod="calico-system/whisker-7b8598b69f-qcdsb" Mar 13 00:38:06.388163 kubelet[2778]: I0313 00:38:06.388126 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwx6k\" (UniqueName: \"kubernetes.io/projected/1ffb29a0-7c98-4b29-a634-89c2a6ac3af6-kube-api-access-zwx6k\") pod \"whisker-7b8598b69f-qcdsb\" (UID: \"1ffb29a0-7c98-4b29-a634-89c2a6ac3af6\") " pod="calico-system/whisker-7b8598b69f-qcdsb" Mar 13 00:38:06.388163 kubelet[2778]: I0313 00:38:06.388140 2778 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/1ffb29a0-7c98-4b29-a634-89c2a6ac3af6-nginx-config\") pod 
\"whisker-7b8598b69f-qcdsb\" (UID: \"1ffb29a0-7c98-4b29-a634-89c2a6ac3af6\") " pod="calico-system/whisker-7b8598b69f-qcdsb" Mar 13 00:38:06.630086 containerd[1633]: time="2026-03-13T00:38:06.629981499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b8598b69f-qcdsb,Uid:1ffb29a0-7c98-4b29-a634-89c2a6ac3af6,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:06.765290 systemd-networkd[1488]: calib75464e219f: Link UP Mar 13 00:38:06.767070 systemd-networkd[1488]: calib75464e219f: Gained carrier Mar 13 00:38:06.791500 containerd[1633]: 2026-03-13 00:38:06.668 [ERROR][4078] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 13 00:38:06.791500 containerd[1633]: 2026-03-13 00:38:06.680 [INFO][4078] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0 whisker-7b8598b69f- calico-system 1ffb29a0-7c98-4b29-a634-89c2a6ac3af6 878 0 2026-03-13 00:38:06 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7b8598b69f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-n-86976195a3 whisker-7b8598b69f-qcdsb eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib75464e219f [] [] }} ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Namespace="calico-system" Pod="whisker-7b8598b69f-qcdsb" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-" Mar 13 00:38:06.791500 containerd[1633]: 2026-03-13 00:38:06.681 [INFO][4078] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Namespace="calico-system" Pod="whisker-7b8598b69f-qcdsb" 
WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" Mar 13 00:38:06.791500 containerd[1633]: 2026-03-13 00:38:06.722 [INFO][4108] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" HandleID="k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Workload="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.730 [INFO][4108] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" HandleID="k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Workload="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277470), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-86976195a3", "pod":"whisker-7b8598b69f-qcdsb", "timestamp":"2026-03-13 00:38:06.722702701 +0000 UTC"}, Hostname:"ci-4459-2-4-n-86976195a3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00032cdc0)} Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.730 [INFO][4108] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.730 [INFO][4108] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.730 [INFO][4108] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-86976195a3' Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.732 [INFO][4108] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.737 [INFO][4108] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.742 [INFO][4108] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.744 [INFO][4108] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791720 containerd[1633]: 2026-03-13 00:38:06.746 [INFO][4108] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791866 containerd[1633]: 2026-03-13 00:38:06.746 [INFO][4108] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791866 containerd[1633]: 2026-03-13 00:38:06.747 [INFO][4108] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5 Mar 13 00:38:06.791866 containerd[1633]: 2026-03-13 00:38:06.752 [INFO][4108] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791866 containerd[1633]: 2026-03-13 00:38:06.756 [INFO][4108] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.119.66/26] block=192.168.119.64/26 handle="k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791866 containerd[1633]: 2026-03-13 00:38:06.757 [INFO][4108] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.66/26] handle="k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:06.791866 containerd[1633]: 2026-03-13 00:38:06.757 [INFO][4108] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:38:06.791866 containerd[1633]: 2026-03-13 00:38:06.757 [INFO][4108] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.66/26] IPv6=[] ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" HandleID="k8s-pod-network.c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Workload="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" Mar 13 00:38:06.791980 containerd[1633]: 2026-03-13 00:38:06.760 [INFO][4078] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Namespace="calico-system" Pod="whisker-7b8598b69f-qcdsb" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0", GenerateName:"whisker-7b8598b69f-", Namespace:"calico-system", SelfLink:"", UID:"1ffb29a0-7c98-4b29-a634-89c2a6ac3af6", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 38, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b8598b69f", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"", Pod:"whisker-7b8598b69f-qcdsb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.119.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib75464e219f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:06.791980 containerd[1633]: 2026-03-13 00:38:06.760 [INFO][4078] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.66/32] ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Namespace="calico-system" Pod="whisker-7b8598b69f-qcdsb" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" Mar 13 00:38:06.792046 containerd[1633]: 2026-03-13 00:38:06.760 [INFO][4078] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib75464e219f ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Namespace="calico-system" Pod="whisker-7b8598b69f-qcdsb" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" Mar 13 00:38:06.792046 containerd[1633]: 2026-03-13 00:38:06.777 [INFO][4078] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Namespace="calico-system" Pod="whisker-7b8598b69f-qcdsb" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" Mar 13 00:38:06.792079 containerd[1633]: 2026-03-13 00:38:06.778 [INFO][4078] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Namespace="calico-system" Pod="whisker-7b8598b69f-qcdsb" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0", GenerateName:"whisker-7b8598b69f-", Namespace:"calico-system", SelfLink:"", UID:"1ffb29a0-7c98-4b29-a634-89c2a6ac3af6", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 38, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b8598b69f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5", Pod:"whisker-7b8598b69f-qcdsb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.119.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib75464e219f", MAC:"f6:a1:90:df:e7:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:06.792119 containerd[1633]: 2026-03-13 00:38:06.785 [INFO][4078] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" Namespace="calico-system" Pod="whisker-7b8598b69f-qcdsb" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-whisker--7b8598b69f--qcdsb-eth0" Mar 13 00:38:06.826186 containerd[1633]: time="2026-03-13T00:38:06.826022643Z" level=info msg="connecting to shim c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5" address="unix:///run/containerd/s/f3370652a75deff2a013b6ebad2cd1aab700fd76de177924b0a3f63567b3efaa" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:38:06.870596 systemd[1]: Started cri-containerd-c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5.scope - libcontainer container c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5. Mar 13 00:38:06.929450 containerd[1633]: time="2026-03-13T00:38:06.929329446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b8598b69f-qcdsb,Uid:1ffb29a0-7c98-4b29-a634-89c2a6ac3af6,Namespace:calico-system,Attempt:0,} returns sandbox id \"c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5\"" Mar 13 00:38:07.071865 kubelet[2778]: I0313 00:38:07.071818 2778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5a2a4e-998d-40a0-a028-10322687d711" path="/var/lib/kubelet/pods/ae5a2a4e-998d-40a0-a028-10322687d711/volumes" Mar 13 00:38:07.391527 systemd-networkd[1488]: vxlan.calico: Link UP Mar 13 00:38:07.391534 systemd-networkd[1488]: vxlan.calico: Gained carrier Mar 13 00:38:07.826439 systemd-networkd[1488]: cali58df81597c3: Gained IPv6LL Mar 13 00:38:08.049838 containerd[1633]: time="2026-03-13T00:38:08.049785666Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:08.051048 containerd[1633]: time="2026-03-13T00:38:08.050935990Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 13 00:38:08.051941 
containerd[1633]: time="2026-03-13T00:38:08.051915954Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:08.053846 containerd[1633]: time="2026-03-13T00:38:08.053819713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:08.054319 containerd[1633]: time="2026-03-13T00:38:08.054288315Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.667760451s" Mar 13 00:38:08.054373 containerd[1633]: time="2026-03-13T00:38:08.054363255Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 13 00:38:08.055794 containerd[1633]: time="2026-03-13T00:38:08.055768662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 13 00:38:08.057812 containerd[1633]: time="2026-03-13T00:38:08.057785840Z" level=info msg="CreateContainer within sandbox \"83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 13 00:38:08.068542 containerd[1633]: time="2026-03-13T00:38:08.067863105Z" level=info msg="Container 75c04f99cba427ef19a9aa4af4feb84b35d5ba88a9e1a163df6542a67e2cd2ad: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:08.072146 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1433451317.mount: Deactivated successfully. 
Mar 13 00:38:08.083664 containerd[1633]: time="2026-03-13T00:38:08.083538743Z" level=info msg="CreateContainer within sandbox \"83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"75c04f99cba427ef19a9aa4af4feb84b35d5ba88a9e1a163df6542a67e2cd2ad\"" Mar 13 00:38:08.084110 containerd[1633]: time="2026-03-13T00:38:08.084087475Z" level=info msg="StartContainer for \"75c04f99cba427ef19a9aa4af4feb84b35d5ba88a9e1a163df6542a67e2cd2ad\"" Mar 13 00:38:08.086365 containerd[1633]: time="2026-03-13T00:38:08.086326695Z" level=info msg="connecting to shim 75c04f99cba427ef19a9aa4af4feb84b35d5ba88a9e1a163df6542a67e2cd2ad" address="unix:///run/containerd/s/89c7a61084bc50a76193f37428709a2195b21f4594e6c3611c2c03631978a2d2" protocol=ttrpc version=3 Mar 13 00:38:08.106595 systemd[1]: Started cri-containerd-75c04f99cba427ef19a9aa4af4feb84b35d5ba88a9e1a163df6542a67e2cd2ad.scope - libcontainer container 75c04f99cba427ef19a9aa4af4feb84b35d5ba88a9e1a163df6542a67e2cd2ad. 
Mar 13 00:38:08.169917 containerd[1633]: time="2026-03-13T00:38:08.169861952Z" level=info msg="StartContainer for \"75c04f99cba427ef19a9aa4af4feb84b35d5ba88a9e1a163df6542a67e2cd2ad\" returns successfully" Mar 13 00:38:08.465833 systemd-networkd[1488]: calib75464e219f: Gained IPv6LL Mar 13 00:38:09.297788 systemd-networkd[1488]: vxlan.calico: Gained IPv6LL Mar 13 00:38:09.918597 containerd[1633]: time="2026-03-13T00:38:09.918551446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:09.919635 containerd[1633]: time="2026-03-13T00:38:09.919491040Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 13 00:38:09.920511 containerd[1633]: time="2026-03-13T00:38:09.920494585Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:09.922204 containerd[1633]: time="2026-03-13T00:38:09.922182691Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:09.922668 containerd[1633]: time="2026-03-13T00:38:09.922648414Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.866855682s" Mar 13 00:38:09.922731 containerd[1633]: time="2026-03-13T00:38:09.922721074Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference 
\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 13 00:38:09.923469 containerd[1633]: time="2026-03-13T00:38:09.923428406Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 13 00:38:09.926106 containerd[1633]: time="2026-03-13T00:38:09.926014998Z" level=info msg="CreateContainer within sandbox \"c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 13 00:38:09.934513 containerd[1633]: time="2026-03-13T00:38:09.933823690Z" level=info msg="Container 354edc18bc032b2a1be39d8bf75fe7902ca181bbfb094617edeaef959f602ea8: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:09.943373 containerd[1633]: time="2026-03-13T00:38:09.943337440Z" level=info msg="CreateContainer within sandbox \"c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"354edc18bc032b2a1be39d8bf75fe7902ca181bbfb094617edeaef959f602ea8\"" Mar 13 00:38:09.944104 containerd[1633]: time="2026-03-13T00:38:09.944085903Z" level=info msg="StartContainer for \"354edc18bc032b2a1be39d8bf75fe7902ca181bbfb094617edeaef959f602ea8\"" Mar 13 00:38:09.945984 containerd[1633]: time="2026-03-13T00:38:09.945957480Z" level=info msg="connecting to shim 354edc18bc032b2a1be39d8bf75fe7902ca181bbfb094617edeaef959f602ea8" address="unix:///run/containerd/s/f3370652a75deff2a013b6ebad2cd1aab700fd76de177924b0a3f63567b3efaa" protocol=ttrpc version=3 Mar 13 00:38:09.974813 systemd[1]: Started cri-containerd-354edc18bc032b2a1be39d8bf75fe7902ca181bbfb094617edeaef959f602ea8.scope - libcontainer container 354edc18bc032b2a1be39d8bf75fe7902ca181bbfb094617edeaef959f602ea8. 
Mar 13 00:38:10.026902 containerd[1633]: time="2026-03-13T00:38:10.026855882Z" level=info msg="StartContainer for \"354edc18bc032b2a1be39d8bf75fe7902ca181bbfb094617edeaef959f602ea8\" returns successfully" Mar 13 00:38:11.636308 containerd[1633]: time="2026-03-13T00:38:11.636257415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:11.637195 containerd[1633]: time="2026-03-13T00:38:11.637040238Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 13 00:38:11.638123 containerd[1633]: time="2026-03-13T00:38:11.638108182Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:11.639822 containerd[1633]: time="2026-03-13T00:38:11.639806268Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:11.640222 containerd[1633]: time="2026-03-13T00:38:11.640199811Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 1.716755023s" Mar 13 00:38:11.640263 containerd[1633]: time="2026-03-13T00:38:11.640223621Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 13 00:38:11.641412 containerd[1633]: 
time="2026-03-13T00:38:11.641351145Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 13 00:38:11.643766 containerd[1633]: time="2026-03-13T00:38:11.643685603Z" level=info msg="CreateContainer within sandbox \"83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 13 00:38:11.654845 containerd[1633]: time="2026-03-13T00:38:11.651911904Z" level=info msg="Container e2e30b6c47c109b2b319898bc0ce8c1e196862c5ba6fabdc86fdc24a93229069: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:11.670330 containerd[1633]: time="2026-03-13T00:38:11.670301493Z" level=info msg="CreateContainer within sandbox \"83216e19e904ec4d141a357fcd29a477bc47941f77611016d52be0daa7f2283f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"e2e30b6c47c109b2b319898bc0ce8c1e196862c5ba6fabdc86fdc24a93229069\"" Mar 13 00:38:11.671505 containerd[1633]: time="2026-03-13T00:38:11.670850325Z" level=info msg="StartContainer for \"e2e30b6c47c109b2b319898bc0ce8c1e196862c5ba6fabdc86fdc24a93229069\"" Mar 13 00:38:11.671945 containerd[1633]: time="2026-03-13T00:38:11.671918169Z" level=info msg="connecting to shim e2e30b6c47c109b2b319898bc0ce8c1e196862c5ba6fabdc86fdc24a93229069" address="unix:///run/containerd/s/89c7a61084bc50a76193f37428709a2195b21f4594e6c3611c2c03631978a2d2" protocol=ttrpc version=3 Mar 13 00:38:11.689594 systemd[1]: Started cri-containerd-e2e30b6c47c109b2b319898bc0ce8c1e196862c5ba6fabdc86fdc24a93229069.scope - libcontainer container e2e30b6c47c109b2b319898bc0ce8c1e196862c5ba6fabdc86fdc24a93229069. 
Mar 13 00:38:11.762932 containerd[1633]: time="2026-03-13T00:38:11.762875071Z" level=info msg="StartContainer for \"e2e30b6c47c109b2b319898bc0ce8c1e196862c5ba6fabdc86fdc24a93229069\" returns successfully" Mar 13 00:38:12.148874 kubelet[2778]: I0313 00:38:12.148813 2778 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 13 00:38:12.148874 kubelet[2778]: I0313 00:38:12.148876 2778 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 13 00:38:12.235358 kubelet[2778]: I0313 00:38:12.235271 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-8jxzz" podStartSLOduration=16.980408971 podStartE2EDuration="22.235259471s" podCreationTimestamp="2026-03-13 00:37:50 +0000 UTC" firstStartedPulling="2026-03-13 00:38:06.386196163 +0000 UTC m=+31.399151498" lastFinishedPulling="2026-03-13 00:38:11.641046663 +0000 UTC m=+36.654001998" observedRunningTime="2026-03-13 00:38:12.234340079 +0000 UTC m=+37.247295424" watchObservedRunningTime="2026-03-13 00:38:12.235259471 +0000 UTC m=+37.248214806" Mar 13 00:38:14.096886 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3318236282.mount: Deactivated successfully. 
Mar 13 00:38:14.119364 containerd[1633]: time="2026-03-13T00:38:14.119323624Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:14.120312 containerd[1633]: time="2026-03-13T00:38:14.120205187Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 13 00:38:14.120957 containerd[1633]: time="2026-03-13T00:38:14.120933579Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:14.122894 containerd[1633]: time="2026-03-13T00:38:14.122866465Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:14.124432 containerd[1633]: time="2026-03-13T00:38:14.124377810Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.482812395s" Mar 13 00:38:14.124432 containerd[1633]: time="2026-03-13T00:38:14.124409390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 13 00:38:14.129744 containerd[1633]: time="2026-03-13T00:38:14.129703417Z" level=info msg="CreateContainer within sandbox \"c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 13 00:38:14.139165 
containerd[1633]: time="2026-03-13T00:38:14.139129548Z" level=info msg="Container 8732f0dcd710becb81d18128daa17f748dd3f070f6d15676a8500f158b569210: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:14.145886 containerd[1633]: time="2026-03-13T00:38:14.145854110Z" level=info msg="CreateContainer within sandbox \"c412421fd506b5e9118ba1659ca58bb2cb410d570b104677b6c6d092164039a5\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"8732f0dcd710becb81d18128daa17f748dd3f070f6d15676a8500f158b569210\"" Mar 13 00:38:14.147210 containerd[1633]: time="2026-03-13T00:38:14.146187211Z" level=info msg="StartContainer for \"8732f0dcd710becb81d18128daa17f748dd3f070f6d15676a8500f158b569210\"" Mar 13 00:38:14.147210 containerd[1633]: time="2026-03-13T00:38:14.147154584Z" level=info msg="connecting to shim 8732f0dcd710becb81d18128daa17f748dd3f070f6d15676a8500f158b569210" address="unix:///run/containerd/s/f3370652a75deff2a013b6ebad2cd1aab700fd76de177924b0a3f63567b3efaa" protocol=ttrpc version=3 Mar 13 00:38:14.169604 systemd[1]: Started cri-containerd-8732f0dcd710becb81d18128daa17f748dd3f070f6d15676a8500f158b569210.scope - libcontainer container 8732f0dcd710becb81d18128daa17f748dd3f070f6d15676a8500f158b569210. 
Mar 13 00:38:14.213593 containerd[1633]: time="2026-03-13T00:38:14.213524059Z" level=info msg="StartContainer for \"8732f0dcd710becb81d18128daa17f748dd3f070f6d15676a8500f158b569210\" returns successfully" Mar 13 00:38:14.243983 kubelet[2778]: I0313 00:38:14.243778 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7b8598b69f-qcdsb" podStartSLOduration=1.048493934 podStartE2EDuration="8.243766177s" podCreationTimestamp="2026-03-13 00:38:06 +0000 UTC" firstStartedPulling="2026-03-13 00:38:06.930273921 +0000 UTC m=+31.943229266" lastFinishedPulling="2026-03-13 00:38:14.125546164 +0000 UTC m=+39.138501509" observedRunningTime="2026-03-13 00:38:14.243609066 +0000 UTC m=+39.256564401" watchObservedRunningTime="2026-03-13 00:38:14.243766177 +0000 UTC m=+39.256721512" Mar 13 00:38:16.070325 containerd[1633]: time="2026-03-13T00:38:16.070209225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59cfd78dd5-nddx7,Uid:20f68c7b-0d72-4da1-897b-e7098297c805,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:16.071211 containerd[1633]: time="2026-03-13T00:38:16.070237605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-nprbp,Uid:5d90ef75-1dc2-4876-8272-eded34842bc1,Namespace:kube-system,Attempt:0,}" Mar 13 00:38:16.206035 systemd-networkd[1488]: cali97f0dcbd806: Link UP Mar 13 00:38:16.206300 systemd-networkd[1488]: cali97f0dcbd806: Gained carrier Mar 13 00:38:16.222887 containerd[1633]: 2026-03-13 00:38:16.137 [INFO][4481] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0 calico-apiserver-59cfd78dd5- calico-system 20f68c7b-0d72-4da1-897b-e7098297c805 819 0 2026-03-13 00:37:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59cfd78dd5 projectcalico.org/namespace:calico-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-86976195a3 calico-apiserver-59cfd78dd5-nddx7 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali97f0dcbd806 [] [] }} ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-nddx7" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-" Mar 13 00:38:16.222887 containerd[1633]: 2026-03-13 00:38:16.138 [INFO][4481] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-nddx7" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" Mar 13 00:38:16.222887 containerd[1633]: 2026-03-13 00:38:16.170 [INFO][4505] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" HandleID="k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Workload="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.176 [INFO][4505] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" HandleID="k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Workload="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd4c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-86976195a3", "pod":"calico-apiserver-59cfd78dd5-nddx7", "timestamp":"2026-03-13 00:38:16.170752112 +0000 UTC"}, Hostname:"ci-4459-2-4-n-86976195a3", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000283600)} Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.176 [INFO][4505] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.176 [INFO][4505] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.176 [INFO][4505] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-86976195a3' Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.179 [INFO][4505] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.184 [INFO][4505] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.188 [INFO][4505] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.189 [INFO][4505] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223052 containerd[1633]: 2026-03-13 00:38:16.191 [INFO][4505] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223198 containerd[1633]: 2026-03-13 00:38:16.191 [INFO][4505] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223198 containerd[1633]: 2026-03-13 00:38:16.192 [INFO][4505] ipam/ipam.go 
1806: Creating new handle: k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589 Mar 13 00:38:16.223198 containerd[1633]: 2026-03-13 00:38:16.196 [INFO][4505] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223198 containerd[1633]: 2026-03-13 00:38:16.201 [INFO][4505] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.119.67/26] block=192.168.119.64/26 handle="k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223198 containerd[1633]: 2026-03-13 00:38:16.201 [INFO][4505] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.67/26] handle="k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.223198 containerd[1633]: 2026-03-13 00:38:16.201 [INFO][4505] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 13 00:38:16.223198 containerd[1633]: 2026-03-13 00:38:16.201 [INFO][4505] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.67/26] IPv6=[] ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" HandleID="k8s-pod-network.5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Workload="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" Mar 13 00:38:16.223367 containerd[1633]: 2026-03-13 00:38:16.203 [INFO][4481] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-nddx7" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0", GenerateName:"calico-apiserver-59cfd78dd5-", Namespace:"calico-system", SelfLink:"", UID:"20f68c7b-0d72-4da1-897b-e7098297c805", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59cfd78dd5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"", Pod:"calico-apiserver-59cfd78dd5-nddx7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.119.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali97f0dcbd806", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:16.223427 containerd[1633]: 2026-03-13 00:38:16.203 [INFO][4481] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.67/32] ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-nddx7" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" Mar 13 00:38:16.223427 containerd[1633]: 2026-03-13 00:38:16.203 [INFO][4481] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali97f0dcbd806 ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-nddx7" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" Mar 13 00:38:16.223427 containerd[1633]: 2026-03-13 00:38:16.206 [INFO][4481] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-nddx7" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" Mar 13 00:38:16.223499 containerd[1633]: 2026-03-13 00:38:16.206 [INFO][4481] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-nddx7" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0", GenerateName:"calico-apiserver-59cfd78dd5-", Namespace:"calico-system", SelfLink:"", UID:"20f68c7b-0d72-4da1-897b-e7098297c805", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59cfd78dd5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589", Pod:"calico-apiserver-59cfd78dd5-nddx7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali97f0dcbd806", MAC:"a6:c1:87:e4:ac:3a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:16.223541 containerd[1633]: 2026-03-13 00:38:16.215 [INFO][4481] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-nddx7" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--nddx7-eth0" Mar 13 00:38:16.248067 containerd[1633]: time="2026-03-13T00:38:16.247927989Z" level=info 
msg="connecting to shim 5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589" address="unix:///run/containerd/s/d40950b181d22610906cda41f025b2f965272c568477dfd14bc37ed4eba0eda8" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:38:16.268608 systemd[1]: Started cri-containerd-5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589.scope - libcontainer container 5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589. Mar 13 00:38:16.319974 systemd-networkd[1488]: calif87d9797087: Link UP Mar 13 00:38:16.323613 systemd-networkd[1488]: calif87d9797087: Gained carrier Mar 13 00:38:16.335526 containerd[1633]: time="2026-03-13T00:38:16.335411237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59cfd78dd5-nddx7,Uid:20f68c7b-0d72-4da1-897b-e7098297c805,Namespace:calico-system,Attempt:0,} returns sandbox id \"5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589\"" Mar 13 00:38:16.338653 containerd[1633]: time="2026-03-13T00:38:16.338524047Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 13 00:38:16.342320 containerd[1633]: 2026-03-13 00:38:16.155 [INFO][4488] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0 coredns-674b8bbfcf- kube-system 5d90ef75-1dc2-4876-8272-eded34842bc1 818 0 2026-03-13 00:37:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-86976195a3 coredns-674b8bbfcf-nprbp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif87d9797087 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Namespace="kube-system" Pod="coredns-674b8bbfcf-nprbp" 
WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-" Mar 13 00:38:16.342320 containerd[1633]: 2026-03-13 00:38:16.156 [INFO][4488] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Namespace="kube-system" Pod="coredns-674b8bbfcf-nprbp" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" Mar 13 00:38:16.342320 containerd[1633]: 2026-03-13 00:38:16.181 [INFO][4510] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" HandleID="k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Workload="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.186 [INFO][4510] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" HandleID="k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Workload="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000449440), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-86976195a3", "pod":"coredns-674b8bbfcf-nprbp", "timestamp":"2026-03-13 00:38:16.181310953 +0000 UTC"}, Hostname:"ci-4459-2-4-n-86976195a3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000200580)} Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.186 [INFO][4510] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.201 [INFO][4510] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.201 [INFO][4510] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-86976195a3' Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.280 [INFO][4510] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.285 [INFO][4510] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.289 [INFO][4510] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.290 [INFO][4510] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342469 containerd[1633]: 2026-03-13 00:38:16.292 [INFO][4510] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342668 containerd[1633]: 2026-03-13 00:38:16.292 [INFO][4510] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342668 containerd[1633]: 2026-03-13 00:38:16.296 [INFO][4510] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35 Mar 13 00:38:16.342668 containerd[1633]: 2026-03-13 00:38:16.302 [INFO][4510] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" 
host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342668 containerd[1633]: 2026-03-13 00:38:16.310 [INFO][4510] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.119.68/26] block=192.168.119.64/26 handle="k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342668 containerd[1633]: 2026-03-13 00:38:16.310 [INFO][4510] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.68/26] handle="k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:16.342668 containerd[1633]: 2026-03-13 00:38:16.310 [INFO][4510] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:38:16.342668 containerd[1633]: 2026-03-13 00:38:16.310 [INFO][4510] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.68/26] IPv6=[] ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" HandleID="k8s-pod-network.a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Workload="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" Mar 13 00:38:16.342808 containerd[1633]: 2026-03-13 00:38:16.315 [INFO][4488] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Namespace="kube-system" Pod="coredns-674b8bbfcf-nprbp" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5d90ef75-1dc2-4876-8272-eded34842bc1", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"", Pod:"coredns-674b8bbfcf-nprbp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif87d9797087", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:16.342808 containerd[1633]: 2026-03-13 00:38:16.315 [INFO][4488] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.68/32] ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Namespace="kube-system" Pod="coredns-674b8bbfcf-nprbp" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" Mar 13 00:38:16.342808 containerd[1633]: 2026-03-13 00:38:16.315 [INFO][4488] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif87d9797087 ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Namespace="kube-system" Pod="coredns-674b8bbfcf-nprbp" 
WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" Mar 13 00:38:16.342808 containerd[1633]: 2026-03-13 00:38:16.321 [INFO][4488] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Namespace="kube-system" Pod="coredns-674b8bbfcf-nprbp" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" Mar 13 00:38:16.342808 containerd[1633]: 2026-03-13 00:38:16.321 [INFO][4488] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Namespace="kube-system" Pod="coredns-674b8bbfcf-nprbp" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5d90ef75-1dc2-4876-8272-eded34842bc1", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35", Pod:"coredns-674b8bbfcf-nprbp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.68/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif87d9797087", MAC:"06:ed:a2:cd:29:a5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:16.342808 containerd[1633]: 2026-03-13 00:38:16.334 [INFO][4488] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" Namespace="kube-system" Pod="coredns-674b8bbfcf-nprbp" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--nprbp-eth0" Mar 13 00:38:16.363550 containerd[1633]: time="2026-03-13T00:38:16.363452910Z" level=info msg="connecting to shim a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35" address="unix:///run/containerd/s/523bc237c96cab62cc545ee466fce0ac6615c1d8f35c931b86f9832b47b298a7" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:38:16.389606 systemd[1]: Started cri-containerd-a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35.scope - libcontainer container a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35. 
Mar 13 00:38:16.435837 containerd[1633]: time="2026-03-13T00:38:16.435787074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-nprbp,Uid:5d90ef75-1dc2-4876-8272-eded34842bc1,Namespace:kube-system,Attempt:0,} returns sandbox id \"a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35\"" Mar 13 00:38:16.440162 containerd[1633]: time="2026-03-13T00:38:16.439907156Z" level=info msg="CreateContainer within sandbox \"a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 13 00:38:16.446866 containerd[1633]: time="2026-03-13T00:38:16.446845356Z" level=info msg="Container e7779c1a333fde6a756d56c1bbfe4cc6b32f9b45c21098b4f5a6e95fe0c9764d: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:16.453948 containerd[1633]: time="2026-03-13T00:38:16.453923017Z" level=info msg="CreateContainer within sandbox \"a21baf7a5afab1ea98e23e4ab9f2508e41176973303187bd4d4feac8acf0ba35\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e7779c1a333fde6a756d56c1bbfe4cc6b32f9b45c21098b4f5a6e95fe0c9764d\"" Mar 13 00:38:16.454343 containerd[1633]: time="2026-03-13T00:38:16.454273428Z" level=info msg="StartContainer for \"e7779c1a333fde6a756d56c1bbfe4cc6b32f9b45c21098b4f5a6e95fe0c9764d\"" Mar 13 00:38:16.455058 containerd[1633]: time="2026-03-13T00:38:16.454993270Z" level=info msg="connecting to shim e7779c1a333fde6a756d56c1bbfe4cc6b32f9b45c21098b4f5a6e95fe0c9764d" address="unix:///run/containerd/s/523bc237c96cab62cc545ee466fce0ac6615c1d8f35c931b86f9832b47b298a7" protocol=ttrpc version=3 Mar 13 00:38:16.472592 systemd[1]: Started cri-containerd-e7779c1a333fde6a756d56c1bbfe4cc6b32f9b45c21098b4f5a6e95fe0c9764d.scope - libcontainer container e7779c1a333fde6a756d56c1bbfe4cc6b32f9b45c21098b4f5a6e95fe0c9764d. 
Mar 13 00:38:16.498615 containerd[1633]: time="2026-03-13T00:38:16.498541909Z" level=info msg="StartContainer for \"e7779c1a333fde6a756d56c1bbfe4cc6b32f9b45c21098b4f5a6e95fe0c9764d\" returns successfully" Mar 13 00:38:17.070067 containerd[1633]: time="2026-03-13T00:38:17.070011015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59cfd78dd5-q9vph,Uid:46bd0b0e-292a-4246-85e0-9eb19df0f28d,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:17.195406 systemd-networkd[1488]: cali0dac7fc3939: Link UP Mar 13 00:38:17.196260 systemd-networkd[1488]: cali0dac7fc3939: Gained carrier Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.133 [INFO][4684] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0 calico-apiserver-59cfd78dd5- calico-system 46bd0b0e-292a-4246-85e0-9eb19df0f28d 821 0 2026-03-13 00:37:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59cfd78dd5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-86976195a3 calico-apiserver-59cfd78dd5-q9vph eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali0dac7fc3939 [] [] }} ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-q9vph" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.133 [INFO][4684] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-q9vph" 
WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.153 [INFO][4696] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" HandleID="k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Workload="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.161 [INFO][4696] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" HandleID="k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Workload="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277330), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-86976195a3", "pod":"calico-apiserver-59cfd78dd5-q9vph", "timestamp":"2026-03-13 00:38:17.153252599 +0000 UTC"}, Hostname:"ci-4459-2-4-n-86976195a3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003a8f20)} Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.161 [INFO][4696] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.161 [INFO][4696] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.161 [INFO][4696] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-86976195a3' Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.163 [INFO][4696] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.167 [INFO][4696] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.173 [INFO][4696] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.175 [INFO][4696] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.177 [INFO][4696] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.177 [INFO][4696] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.179 [INFO][4696] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2 Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.182 [INFO][4696] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.188 [INFO][4696] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.119.69/26] block=192.168.119.64/26 handle="k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.188 [INFO][4696] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.69/26] handle="k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.188 [INFO][4696] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:38:17.208671 containerd[1633]: 2026-03-13 00:38:17.188 [INFO][4696] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.69/26] IPv6=[] ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" HandleID="k8s-pod-network.654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Workload="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" Mar 13 00:38:17.209286 containerd[1633]: 2026-03-13 00:38:17.191 [INFO][4684] cni-plugin/k8s.go 418: Populated endpoint ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-q9vph" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0", GenerateName:"calico-apiserver-59cfd78dd5-", Namespace:"calico-system", SelfLink:"", UID:"46bd0b0e-292a-4246-85e0-9eb19df0f28d", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"59cfd78dd5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"", Pod:"calico-apiserver-59cfd78dd5-q9vph", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali0dac7fc3939", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:17.209286 containerd[1633]: 2026-03-13 00:38:17.192 [INFO][4684] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.69/32] ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-q9vph" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" Mar 13 00:38:17.209286 containerd[1633]: 2026-03-13 00:38:17.192 [INFO][4684] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0dac7fc3939 ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-q9vph" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" Mar 13 00:38:17.209286 containerd[1633]: 2026-03-13 00:38:17.196 [INFO][4684] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-q9vph" 
WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" Mar 13 00:38:17.209286 containerd[1633]: 2026-03-13 00:38:17.197 [INFO][4684] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-q9vph" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0", GenerateName:"calico-apiserver-59cfd78dd5-", Namespace:"calico-system", SelfLink:"", UID:"46bd0b0e-292a-4246-85e0-9eb19df0f28d", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59cfd78dd5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2", Pod:"calico-apiserver-59cfd78dd5-q9vph", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali0dac7fc3939", MAC:"72:01:8b:5f:62:dc", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:17.209286 containerd[1633]: 2026-03-13 00:38:17.205 [INFO][4684] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" Namespace="calico-system" Pod="calico-apiserver-59cfd78dd5-q9vph" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--apiserver--59cfd78dd5--q9vph-eth0" Mar 13 00:38:17.235694 containerd[1633]: time="2026-03-13T00:38:17.235227200Z" level=info msg="connecting to shim 654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2" address="unix:///run/containerd/s/d0f023a48f4ca5ff5e1a76558341b4a3e2612eeb716b78b32301b47487795d7b" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:38:17.260682 systemd[1]: Started cri-containerd-654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2.scope - libcontainer container 654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2. 
Mar 13 00:38:17.278213 kubelet[2778]: I0313 00:38:17.277943 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-nprbp" podStartSLOduration=37.27792927 podStartE2EDuration="37.27792927s" podCreationTimestamp="2026-03-13 00:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:38:17.255972019 +0000 UTC m=+42.268927354" watchObservedRunningTime="2026-03-13 00:38:17.27792927 +0000 UTC m=+42.290884605" Mar 13 00:38:17.297821 systemd-networkd[1488]: cali97f0dcbd806: Gained IPv6LL Mar 13 00:38:17.327973 containerd[1633]: time="2026-03-13T00:38:17.327847831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59cfd78dd5-q9vph,Uid:46bd0b0e-292a-4246-85e0-9eb19df0f28d,Namespace:calico-system,Attempt:0,} returns sandbox id \"654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2\"" Mar 13 00:38:17.809859 systemd-networkd[1488]: calif87d9797087: Gained IPv6LL Mar 13 00:38:18.069737 containerd[1633]: time="2026-03-13T00:38:18.069395042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-67z6c,Uid:a6077e04-8bca-4d2d-8663-d83dccf74534,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:18.071222 containerd[1633]: time="2026-03-13T00:38:18.071182377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7948d5d9db-xhktz,Uid:5761654e-dacf-4b91-a01a-1840500b12d9,Namespace:calico-system,Attempt:0,}" Mar 13 00:38:18.199142 systemd-networkd[1488]: cali3add975f274: Link UP Mar 13 00:38:18.199784 systemd-networkd[1488]: cali3add975f274: Gained carrier Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.135 [INFO][4790] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0 
calico-kube-controllers-7948d5d9db- calico-system 5761654e-dacf-4b91-a01a-1840500b12d9 822 0 2026-03-13 00:37:50 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7948d5d9db projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-n-86976195a3 calico-kube-controllers-7948d5d9db-xhktz eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3add975f274 [] [] }} ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Namespace="calico-system" Pod="calico-kube-controllers-7948d5d9db-xhktz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.135 [INFO][4790] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Namespace="calico-system" Pod="calico-kube-controllers-7948d5d9db-xhktz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.160 [INFO][4817] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" HandleID="k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Workload="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.166 [INFO][4817] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" HandleID="k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" 
Workload="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fddc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-86976195a3", "pod":"calico-kube-controllers-7948d5d9db-xhktz", "timestamp":"2026-03-13 00:38:18.160119816 +0000 UTC"}, Hostname:"ci-4459-2-4-n-86976195a3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000301a20)} Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.166 [INFO][4817] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.166 [INFO][4817] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.166 [INFO][4817] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-86976195a3' Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.169 [INFO][4817] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.173 [INFO][4817] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.177 [INFO][4817] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.178 [INFO][4817] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.180 [INFO][4817] ipam/ipam.go 237: Affinity is confirmed and block has been loaded 
cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.180 [INFO][4817] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.181 [INFO][4817] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.186 [INFO][4817] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.191 [INFO][4817] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.119.70/26] block=192.168.119.64/26 handle="k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.191 [INFO][4817] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.70/26] handle="k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.191 [INFO][4817] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 13 00:38:18.212339 containerd[1633]: 2026-03-13 00:38:18.191 [INFO][4817] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.70/26] IPv6=[] ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" HandleID="k8s-pod-network.cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Workload="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" Mar 13 00:38:18.214945 containerd[1633]: 2026-03-13 00:38:18.194 [INFO][4790] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Namespace="calico-system" Pod="calico-kube-controllers-7948d5d9db-xhktz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0", GenerateName:"calico-kube-controllers-7948d5d9db-", Namespace:"calico-system", SelfLink:"", UID:"5761654e-dacf-4b91-a01a-1840500b12d9", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7948d5d9db", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"", Pod:"calico-kube-controllers-7948d5d9db-xhktz", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.119.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3add975f274", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:18.214945 containerd[1633]: 2026-03-13 00:38:18.194 [INFO][4790] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.70/32] ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Namespace="calico-system" Pod="calico-kube-controllers-7948d5d9db-xhktz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" Mar 13 00:38:18.214945 containerd[1633]: 2026-03-13 00:38:18.194 [INFO][4790] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3add975f274 ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Namespace="calico-system" Pod="calico-kube-controllers-7948d5d9db-xhktz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" Mar 13 00:38:18.214945 containerd[1633]: 2026-03-13 00:38:18.200 [INFO][4790] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Namespace="calico-system" Pod="calico-kube-controllers-7948d5d9db-xhktz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" Mar 13 00:38:18.214945 containerd[1633]: 2026-03-13 00:38:18.200 [INFO][4790] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Namespace="calico-system" Pod="calico-kube-controllers-7948d5d9db-xhktz" 
WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0", GenerateName:"calico-kube-controllers-7948d5d9db-", Namespace:"calico-system", SelfLink:"", UID:"5761654e-dacf-4b91-a01a-1840500b12d9", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7948d5d9db", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f", Pod:"calico-kube-controllers-7948d5d9db-xhktz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.119.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3add975f274", MAC:"de:ec:a2:85:fa:ab", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:18.214945 containerd[1633]: 2026-03-13 00:38:18.207 [INFO][4790] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" Namespace="calico-system" 
Pod="calico-kube-controllers-7948d5d9db-xhktz" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-calico--kube--controllers--7948d5d9db--xhktz-eth0" Mar 13 00:38:18.251145 containerd[1633]: time="2026-03-13T00:38:18.251111121Z" level=info msg="connecting to shim cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f" address="unix:///run/containerd/s/f1bb272c6144263f9ea79779ab3cd6e2374df32f984a753a5c8158f71fbc9a74" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:38:18.278845 systemd[1]: Started cri-containerd-cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f.scope - libcontainer container cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f. Mar 13 00:38:18.336873 systemd-networkd[1488]: calic64c3eb83bc: Link UP Mar 13 00:38:18.337452 systemd-networkd[1488]: calic64c3eb83bc: Gained carrier Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.120 [INFO][4784] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0 goldmane-5b85766d88- calico-system a6077e04-8bca-4d2d-8663-d83dccf74534 820 0 2026-03-13 00:37:50 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-n-86976195a3 goldmane-5b85766d88-67z6c eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic64c3eb83bc [] [] }} ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Namespace="calico-system" Pod="goldmane-5b85766d88-67z6c" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.120 [INFO][4784] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Namespace="calico-system" Pod="goldmane-5b85766d88-67z6c" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.157 [INFO][4812] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" HandleID="k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Workload="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.166 [INFO][4812] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" HandleID="k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Workload="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e1a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-86976195a3", "pod":"goldmane-5b85766d88-67z6c", "timestamp":"2026-03-13 00:38:18.157457409 +0000 UTC"}, Hostname:"ci-4459-2-4-n-86976195a3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001886e0)} Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.166 [INFO][4812] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.191 [INFO][4812] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.191 [INFO][4812] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-86976195a3' Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.271 [INFO][4812] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.280 [INFO][4812] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.286 [INFO][4812] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.287 [INFO][4812] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.289 [INFO][4812] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.289 [INFO][4812] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.291 [INFO][4812] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.294 [INFO][4812] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.307 [INFO][4812] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.119.71/26] block=192.168.119.64/26 handle="k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.307 [INFO][4812] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.71/26] handle="k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.307 [INFO][4812] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:38:18.358505 containerd[1633]: 2026-03-13 00:38:18.307 [INFO][4812] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.71/26] IPv6=[] ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" HandleID="k8s-pod-network.76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Workload="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" Mar 13 00:38:18.359020 containerd[1633]: 2026-03-13 00:38:18.323 [INFO][4784] cni-plugin/k8s.go 418: Populated endpoint ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Namespace="calico-system" Pod="goldmane-5b85766d88-67z6c" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"a6077e04-8bca-4d2d-8663-d83dccf74534", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"", Pod:"goldmane-5b85766d88-67z6c", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.119.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic64c3eb83bc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:18.359020 containerd[1633]: 2026-03-13 00:38:18.323 [INFO][4784] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.71/32] ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Namespace="calico-system" Pod="goldmane-5b85766d88-67z6c" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" Mar 13 00:38:18.359020 containerd[1633]: 2026-03-13 00:38:18.323 [INFO][4784] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic64c3eb83bc ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Namespace="calico-system" Pod="goldmane-5b85766d88-67z6c" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" Mar 13 00:38:18.359020 containerd[1633]: 2026-03-13 00:38:18.336 [INFO][4784] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Namespace="calico-system" Pod="goldmane-5b85766d88-67z6c" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" Mar 13 00:38:18.359020 containerd[1633]: 2026-03-13 00:38:18.336 [INFO][4784] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Namespace="calico-system" Pod="goldmane-5b85766d88-67z6c" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"a6077e04-8bca-4d2d-8663-d83dccf74534", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a", Pod:"goldmane-5b85766d88-67z6c", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.119.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic64c3eb83bc", MAC:"5a:6e:51:e7:61:d2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:18.359020 containerd[1633]: 2026-03-13 00:38:18.347 [INFO][4784] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" Namespace="calico-system" Pod="goldmane-5b85766d88-67z6c" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-goldmane--5b85766d88--67z6c-eth0" Mar 13 00:38:18.360786 containerd[1633]: time="2026-03-13T00:38:18.360640966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7948d5d9db-xhktz,Uid:5761654e-dacf-4b91-a01a-1840500b12d9,Namespace:calico-system,Attempt:0,} returns sandbox id \"cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f\"" Mar 13 00:38:18.389264 containerd[1633]: time="2026-03-13T00:38:18.388563631Z" level=info msg="connecting to shim 76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a" address="unix:///run/containerd/s/800074562d9b1c9f5350e6d8651a3406a43bf356169848f84f4c594d3c3a889a" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:38:18.415872 systemd[1]: Started cri-containerd-76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a.scope - libcontainer container 76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a. 
Mar 13 00:38:18.461033 containerd[1633]: time="2026-03-13T00:38:18.461004107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-67z6c,Uid:a6077e04-8bca-4d2d-8663-d83dccf74534,Namespace:calico-system,Attempt:0,} returns sandbox id \"76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a\"" Mar 13 00:38:19.026447 systemd-networkd[1488]: cali0dac7fc3939: Gained IPv6LL Mar 13 00:38:19.069927 containerd[1633]: time="2026-03-13T00:38:19.069868047Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zqjwk,Uid:8d70918b-f6c4-496d-b3bf-681bf2962192,Namespace:kube-system,Attempt:0,}" Mar 13 00:38:19.174649 systemd-networkd[1488]: cali2987291daa7: Link UP Mar 13 00:38:19.175639 systemd-networkd[1488]: cali2987291daa7: Gained carrier Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.110 [INFO][4967] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0 coredns-674b8bbfcf- kube-system 8d70918b-f6c4-496d-b3bf-681bf2962192 813 0 2026-03-13 00:37:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-86976195a3 coredns-674b8bbfcf-zqjwk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2987291daa7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqjwk" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.111 [INFO][4967] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-zqjwk" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.141 [INFO][4979] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" HandleID="k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Workload="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.146 [INFO][4979] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" HandleID="k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Workload="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277890), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-86976195a3", "pod":"coredns-674b8bbfcf-zqjwk", "timestamp":"2026-03-13 00:38:19.141432142 +0000 UTC"}, Hostname:"ci-4459-2-4-n-86976195a3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003682c0)} Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.146 [INFO][4979] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.146 [INFO][4979] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.146 [INFO][4979] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-86976195a3' Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.148 [INFO][4979] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.151 [INFO][4979] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.155 [INFO][4979] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.156 [INFO][4979] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.158 [INFO][4979] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.158 [INFO][4979] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.160 [INFO][4979] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400 Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.163 [INFO][4979] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.168 [INFO][4979] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.119.72/26] block=192.168.119.64/26 handle="k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.168 [INFO][4979] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.72/26] handle="k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" host="ci-4459-2-4-n-86976195a3" Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.168 [INFO][4979] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:38:19.189791 containerd[1633]: 2026-03-13 00:38:19.168 [INFO][4979] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.72/26] IPv6=[] ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" HandleID="k8s-pod-network.547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Workload="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" Mar 13 00:38:19.190199 containerd[1633]: 2026-03-13 00:38:19.170 [INFO][4967] cni-plugin/k8s.go 418: Populated endpoint ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqjwk" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"8d70918b-f6c4-496d-b3bf-681bf2962192", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"", Pod:"coredns-674b8bbfcf-zqjwk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2987291daa7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:19.190199 containerd[1633]: 2026-03-13 00:38:19.170 [INFO][4967] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.72/32] ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqjwk" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" Mar 13 00:38:19.190199 containerd[1633]: 2026-03-13 00:38:19.170 [INFO][4967] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2987291daa7 ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqjwk" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" Mar 13 00:38:19.190199 containerd[1633]: 2026-03-13 00:38:19.175 [INFO][4967] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqjwk" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" Mar 13 00:38:19.190199 containerd[1633]: 2026-03-13 00:38:19.176 [INFO][4967] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqjwk" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"8d70918b-f6c4-496d-b3bf-681bf2962192", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-86976195a3", ContainerID:"547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400", Pod:"coredns-674b8bbfcf-zqjwk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2987291daa7", 
MAC:"f6:46:cb:9d:fd:a3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:38:19.190199 containerd[1633]: 2026-03-13 00:38:19.184 [INFO][4967] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqjwk" WorkloadEndpoint="ci--4459--2--4--n--86976195a3-k8s-coredns--674b8bbfcf--zqjwk-eth0" Mar 13 00:38:19.215677 containerd[1633]: time="2026-03-13T00:38:19.215613202Z" level=info msg="connecting to shim 547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400" address="unix:///run/containerd/s/cee4e48bf3159725b9fe819300b626e7080a3592a1b1358926d1906818109d2d" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:38:19.235584 systemd[1]: Started cri-containerd-547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400.scope - libcontainer container 547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400. 
Mar 13 00:38:19.278012 containerd[1633]: time="2026-03-13T00:38:19.277921933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zqjwk,Uid:8d70918b-f6c4-496d-b3bf-681bf2962192,Namespace:kube-system,Attempt:0,} returns sandbox id \"547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400\"" Mar 13 00:38:19.281591 systemd-networkd[1488]: cali3add975f274: Gained IPv6LL Mar 13 00:38:19.284094 containerd[1633]: time="2026-03-13T00:38:19.283970868Z" level=info msg="CreateContainer within sandbox \"547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 13 00:38:19.295824 containerd[1633]: time="2026-03-13T00:38:19.295792219Z" level=info msg="Container cef21cfe4956e6ac91b07d11aa77d0928d524e67be5d6c5a8a8e319e219c8851: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:19.302743 containerd[1633]: time="2026-03-13T00:38:19.302707597Z" level=info msg="CreateContainer within sandbox \"547d277c0237e0dca8902db72c21ef3dabcd3840c93cc113bbd14207a2ea8400\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"cef21cfe4956e6ac91b07d11aa77d0928d524e67be5d6c5a8a8e319e219c8851\"" Mar 13 00:38:19.303955 containerd[1633]: time="2026-03-13T00:38:19.303934290Z" level=info msg="StartContainer for \"cef21cfe4956e6ac91b07d11aa77d0928d524e67be5d6c5a8a8e319e219c8851\"" Mar 13 00:38:19.306153 containerd[1633]: time="2026-03-13T00:38:19.306116846Z" level=info msg="connecting to shim cef21cfe4956e6ac91b07d11aa77d0928d524e67be5d6c5a8a8e319e219c8851" address="unix:///run/containerd/s/cee4e48bf3159725b9fe819300b626e7080a3592a1b1358926d1906818109d2d" protocol=ttrpc version=3 Mar 13 00:38:19.324673 systemd[1]: Started cri-containerd-cef21cfe4956e6ac91b07d11aa77d0928d524e67be5d6c5a8a8e319e219c8851.scope - libcontainer container cef21cfe4956e6ac91b07d11aa77d0928d524e67be5d6c5a8a8e319e219c8851. 
Mar 13 00:38:19.386204 containerd[1633]: time="2026-03-13T00:38:19.386165421Z" level=info msg="StartContainer for \"cef21cfe4956e6ac91b07d11aa77d0928d524e67be5d6c5a8a8e319e219c8851\" returns successfully" Mar 13 00:38:20.076908 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3582384704.mount: Deactivated successfully. Mar 13 00:38:20.265831 kubelet[2778]: I0313 00:38:20.265743 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-zqjwk" podStartSLOduration=40.265727429 podStartE2EDuration="40.265727429s" podCreationTimestamp="2026-03-13 00:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:38:20.265026138 +0000 UTC m=+45.277981483" watchObservedRunningTime="2026-03-13 00:38:20.265727429 +0000 UTC m=+45.278682774" Mar 13 00:38:20.371160 systemd-networkd[1488]: calic64c3eb83bc: Gained IPv6LL Mar 13 00:38:20.976752 containerd[1633]: time="2026-03-13T00:38:20.976691823Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:20.977632 containerd[1633]: time="2026-03-13T00:38:20.977612176Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 13 00:38:20.980440 containerd[1633]: time="2026-03-13T00:38:20.978671159Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:20.983727 containerd[1633]: time="2026-03-13T00:38:20.983700611Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:38:20.984061 containerd[1633]: 
time="2026-03-13T00:38:20.984044132Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 4.645283505s" Mar 13 00:38:20.984116 containerd[1633]: time="2026-03-13T00:38:20.984107572Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 13 00:38:20.985588 containerd[1633]: time="2026-03-13T00:38:20.985574076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 13 00:38:20.988088 containerd[1633]: time="2026-03-13T00:38:20.988045062Z" level=info msg="CreateContainer within sandbox \"5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 13 00:38:20.999775 containerd[1633]: time="2026-03-13T00:38:20.999755981Z" level=info msg="Container 827dbae43e4d7cbff2ef1b818ddfa1254d55f3650ee1c527c7389d0e728b42a8: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:38:21.002099 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3866543371.mount: Deactivated successfully. 
Mar 13 00:38:21.006686 containerd[1633]: time="2026-03-13T00:38:21.006659007Z" level=info msg="CreateContainer within sandbox \"5ffef40a2113d8f5bc9eb9004390c28ef51667504c181058805cde446ddf7589\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"827dbae43e4d7cbff2ef1b818ddfa1254d55f3650ee1c527c7389d0e728b42a8\"" Mar 13 00:38:21.007164 containerd[1633]: time="2026-03-13T00:38:21.007123429Z" level=info msg="StartContainer for \"827dbae43e4d7cbff2ef1b818ddfa1254d55f3650ee1c527c7389d0e728b42a8\"" Mar 13 00:38:21.007993 containerd[1633]: time="2026-03-13T00:38:21.007955511Z" level=info msg="connecting to shim 827dbae43e4d7cbff2ef1b818ddfa1254d55f3650ee1c527c7389d0e728b42a8" address="unix:///run/containerd/s/d40950b181d22610906cda41f025b2f965272c568477dfd14bc37ed4eba0eda8" protocol=ttrpc version=3 Mar 13 00:38:21.029583 systemd[1]: Started cri-containerd-827dbae43e4d7cbff2ef1b818ddfa1254d55f3650ee1c527c7389d0e728b42a8.scope - libcontainer container 827dbae43e4d7cbff2ef1b818ddfa1254d55f3650ee1c527c7389d0e728b42a8. 
Mar 13 00:38:21.071847 containerd[1633]: time="2026-03-13T00:38:21.071770482Z" level=info msg="StartContainer for \"827dbae43e4d7cbff2ef1b818ddfa1254d55f3650ee1c527c7389d0e728b42a8\" returns successfully"
Mar 13 00:38:21.073803 systemd-networkd[1488]: cali2987291daa7: Gained IPv6LL
Mar 13 00:38:21.453941 containerd[1633]: time="2026-03-13T00:38:21.453701975Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:38:21.455949 containerd[1633]: time="2026-03-13T00:38:21.455490419Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77"
Mar 13 00:38:21.457723 containerd[1633]: time="2026-03-13T00:38:21.457660794Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 471.683937ms"
Mar 13 00:38:21.457723 containerd[1633]: time="2026-03-13T00:38:21.457681834Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\""
Mar 13 00:38:21.459744 containerd[1633]: time="2026-03-13T00:38:21.459593699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\""
Mar 13 00:38:21.465775 containerd[1633]: time="2026-03-13T00:38:21.465757023Z" level=info msg="CreateContainer within sandbox \"654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 13 00:38:21.475165 containerd[1633]: time="2026-03-13T00:38:21.474632724Z" level=info msg="Container 0863af2f8904313f777a73f9f5a7c5f3ea2d4b416196ceb45e8f428520691035: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:38:21.496800 containerd[1633]: time="2026-03-13T00:38:21.496765007Z" level=info msg="CreateContainer within sandbox \"654f05fae8c0380c0907fa65d5ca154e13dab92983bff05c9813d5e526ea22c2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0863af2f8904313f777a73f9f5a7c5f3ea2d4b416196ceb45e8f428520691035\""
Mar 13 00:38:21.498128 containerd[1633]: time="2026-03-13T00:38:21.498069890Z" level=info msg="StartContainer for \"0863af2f8904313f777a73f9f5a7c5f3ea2d4b416196ceb45e8f428520691035\""
Mar 13 00:38:21.500568 containerd[1633]: time="2026-03-13T00:38:21.500514676Z" level=info msg="connecting to shim 0863af2f8904313f777a73f9f5a7c5f3ea2d4b416196ceb45e8f428520691035" address="unix:///run/containerd/s/d0f023a48f4ca5ff5e1a76558341b4a3e2612eeb716b78b32301b47487795d7b" protocol=ttrpc version=3
Mar 13 00:38:21.533704 systemd[1]: Started cri-containerd-0863af2f8904313f777a73f9f5a7c5f3ea2d4b416196ceb45e8f428520691035.scope - libcontainer container 0863af2f8904313f777a73f9f5a7c5f3ea2d4b416196ceb45e8f428520691035.
Mar 13 00:38:21.583211 containerd[1633]: time="2026-03-13T00:38:21.583176321Z" level=info msg="StartContainer for \"0863af2f8904313f777a73f9f5a7c5f3ea2d4b416196ceb45e8f428520691035\" returns successfully"
Mar 13 00:38:22.261018 kubelet[2778]: I0313 00:38:22.260964 2778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 00:38:22.272439 kubelet[2778]: I0313 00:38:22.272222 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-59cfd78dd5-q9vph" podStartSLOduration=28.142623844 podStartE2EDuration="32.272207865s" podCreationTimestamp="2026-03-13 00:37:50 +0000 UTC" firstStartedPulling="2026-03-13 00:38:17.329538426 +0000 UTC m=+42.342493771" lastFinishedPulling="2026-03-13 00:38:21.459122457 +0000 UTC m=+46.472077792" observedRunningTime="2026-03-13 00:38:22.272117225 +0000 UTC m=+47.285072610" watchObservedRunningTime="2026-03-13 00:38:22.272207865 +0000 UTC m=+47.285163210"
Mar 13 00:38:22.272439 kubelet[2778]: I0313 00:38:22.272293 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-59cfd78dd5-nddx7" podStartSLOduration=27.624578835 podStartE2EDuration="32.272288776s" podCreationTimestamp="2026-03-13 00:37:50 +0000 UTC" firstStartedPulling="2026-03-13 00:38:16.337284983 +0000 UTC m=+41.350240318" lastFinishedPulling="2026-03-13 00:38:20.984994924 +0000 UTC m=+45.997950259" observedRunningTime="2026-03-13 00:38:21.274815322 +0000 UTC m=+46.287770687" watchObservedRunningTime="2026-03-13 00:38:22.272288776 +0000 UTC m=+47.285244121"
Mar 13 00:38:22.744602 systemd[1]: Started sshd@9-89.167.87.208:22-130.12.181.151:63390.service - OpenSSH per-connection server daemon (130.12.181.151:63390).
Mar 13 00:38:23.264220 kubelet[2778]: I0313 00:38:23.263872 2778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 00:38:23.711358 sshd[5179]: Received disconnect from 130.12.181.151 port 63390:11: Bye Bye [preauth]
Mar 13 00:38:23.711358 sshd[5179]: Disconnected from authenticating user root 130.12.181.151 port 63390 [preauth]
Mar 13 00:38:23.713759 systemd[1]: sshd@9-89.167.87.208:22-130.12.181.151:63390.service: Deactivated successfully.
Mar 13 00:38:24.338204 kubelet[2778]: I0313 00:38:24.338096 2778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 00:38:24.615226 containerd[1633]: time="2026-03-13T00:38:24.615029116Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:38:24.616461 containerd[1633]: time="2026-03-13T00:38:24.616341350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348"
Mar 13 00:38:24.617390 containerd[1633]: time="2026-03-13T00:38:24.617366921Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:38:24.620667 containerd[1633]: time="2026-03-13T00:38:24.620624618Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:38:24.621670 containerd[1633]: time="2026-03-13T00:38:24.621652571Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 3.162038882s"
Mar 13 00:38:24.621786 containerd[1633]: time="2026-03-13T00:38:24.621771511Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\""
Mar 13 00:38:24.624911 containerd[1633]: time="2026-03-13T00:38:24.624735828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\""
Mar 13 00:38:24.641914 containerd[1633]: time="2026-03-13T00:38:24.641873463Z" level=info msg="CreateContainer within sandbox \"cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 13 00:38:24.650596 containerd[1633]: time="2026-03-13T00:38:24.650567841Z" level=info msg="Container 4a54a46613ba5b3c5fbfebde48c184f1c1077122464acfc4ec98af9fb11fd567: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:38:24.657433 containerd[1633]: time="2026-03-13T00:38:24.657404316Z" level=info msg="CreateContainer within sandbox \"cc19815bf2497bb7d762326302e89b66ec012f02a0fddb8e53f27a14029d334f\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"4a54a46613ba5b3c5fbfebde48c184f1c1077122464acfc4ec98af9fb11fd567\""
Mar 13 00:38:24.658041 containerd[1633]: time="2026-03-13T00:38:24.658014417Z" level=info msg="StartContainer for \"4a54a46613ba5b3c5fbfebde48c184f1c1077122464acfc4ec98af9fb11fd567\""
Mar 13 00:38:24.658831 containerd[1633]: time="2026-03-13T00:38:24.658770948Z" level=info msg="connecting to shim 4a54a46613ba5b3c5fbfebde48c184f1c1077122464acfc4ec98af9fb11fd567" address="unix:///run/containerd/s/f1bb272c6144263f9ea79779ab3cd6e2374df32f984a753a5c8158f71fbc9a74" protocol=ttrpc version=3
Mar 13 00:38:24.679593 systemd[1]: Started cri-containerd-4a54a46613ba5b3c5fbfebde48c184f1c1077122464acfc4ec98af9fb11fd567.scope - libcontainer container 4a54a46613ba5b3c5fbfebde48c184f1c1077122464acfc4ec98af9fb11fd567.
Mar 13 00:38:24.724251 containerd[1633]: time="2026-03-13T00:38:24.724200445Z" level=info msg="StartContainer for \"4a54a46613ba5b3c5fbfebde48c184f1c1077122464acfc4ec98af9fb11fd567\" returns successfully"
Mar 13 00:38:25.312907 kubelet[2778]: I0313 00:38:25.312819 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7948d5d9db-xhktz" podStartSLOduration=29.0513495 podStartE2EDuration="35.312800115s" podCreationTimestamp="2026-03-13 00:37:50 +0000 UTC" firstStartedPulling="2026-03-13 00:38:18.361934119 +0000 UTC m=+43.374889454" lastFinishedPulling="2026-03-13 00:38:24.623384734 +0000 UTC m=+49.636340069" observedRunningTime="2026-03-13 00:38:25.312395573 +0000 UTC m=+50.325350928" watchObservedRunningTime="2026-03-13 00:38:25.312800115 +0000 UTC m=+50.325755480"
Mar 13 00:38:26.278911 kubelet[2778]: I0313 00:38:26.278858 2778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 00:38:27.160039 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount43891190.mount: Deactivated successfully.
Mar 13 00:38:27.519191 containerd[1633]: time="2026-03-13T00:38:27.519140647Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:38:27.520318 containerd[1633]: time="2026-03-13T00:38:27.520206569Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386"
Mar 13 00:38:27.521177 containerd[1633]: time="2026-03-13T00:38:27.521152280Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:38:27.523027 containerd[1633]: time="2026-03-13T00:38:27.523001824Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:38:27.523465 containerd[1633]: time="2026-03-13T00:38:27.523440774Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 2.898685706s"
Mar 13 00:38:27.523545 containerd[1633]: time="2026-03-13T00:38:27.523534214Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\""
Mar 13 00:38:27.526660 containerd[1633]: time="2026-03-13T00:38:27.526626180Z" level=info msg="CreateContainer within sandbox \"76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 13 00:38:27.533676 containerd[1633]: time="2026-03-13T00:38:27.533616724Z" level=info msg="Container ca5d7f90d6f9e8b134f2074c258e8c41d8e754e5fddc3c108523598872815204: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:38:27.543243 containerd[1633]: time="2026-03-13T00:38:27.543210492Z" level=info msg="CreateContainer within sandbox \"76b9d63a0df80e133678dbbf60db82497f462983956a25aaa41c4c594b0e6e8a\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"ca5d7f90d6f9e8b134f2074c258e8c41d8e754e5fddc3c108523598872815204\""
Mar 13 00:38:27.543712 containerd[1633]: time="2026-03-13T00:38:27.543615862Z" level=info msg="StartContainer for \"ca5d7f90d6f9e8b134f2074c258e8c41d8e754e5fddc3c108523598872815204\""
Mar 13 00:38:27.544715 containerd[1633]: time="2026-03-13T00:38:27.544687084Z" level=info msg="connecting to shim ca5d7f90d6f9e8b134f2074c258e8c41d8e754e5fddc3c108523598872815204" address="unix:///run/containerd/s/800074562d9b1c9f5350e6d8651a3406a43bf356169848f84f4c594d3c3a889a" protocol=ttrpc version=3
Mar 13 00:38:27.570585 systemd[1]: Started cri-containerd-ca5d7f90d6f9e8b134f2074c258e8c41d8e754e5fddc3c108523598872815204.scope - libcontainer container ca5d7f90d6f9e8b134f2074c258e8c41d8e754e5fddc3c108523598872815204.
Mar 13 00:38:27.622131 containerd[1633]: time="2026-03-13T00:38:27.622095610Z" level=info msg="StartContainer for \"ca5d7f90d6f9e8b134f2074c258e8c41d8e754e5fddc3c108523598872815204\" returns successfully"
Mar 13 00:38:28.305121 kubelet[2778]: I0313 00:38:28.303249 2778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-67z6c" podStartSLOduration=29.241048901 podStartE2EDuration="38.303228617s" podCreationTimestamp="2026-03-13 00:37:50 +0000 UTC" firstStartedPulling="2026-03-13 00:38:18.46210889 +0000 UTC m=+43.475064225" lastFinishedPulling="2026-03-13 00:38:27.524288596 +0000 UTC m=+52.537243941" observedRunningTime="2026-03-13 00:38:28.302144294 +0000 UTC m=+53.315099659" watchObservedRunningTime="2026-03-13 00:38:28.303228617 +0000 UTC m=+53.316183982"
Mar 13 00:38:52.491559 kubelet[2778]: I0313 00:38:52.491375 2778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 00:39:04.536142 systemd[1]: Started sshd@10-89.167.87.208:22-4.153.228.146:45970.service - OpenSSH per-connection server daemon (4.153.228.146:45970).
Mar 13 00:39:05.204371 sshd[5540]: Accepted publickey for core from 4.153.228.146 port 45970 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:05.207689 sshd-session[5540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:05.217207 systemd-logind[1602]: New session 8 of user core.
Mar 13 00:39:05.226752 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 13 00:39:05.676749 sshd[5544]: Connection closed by 4.153.228.146 port 45970
Mar 13 00:39:05.678438 sshd-session[5540]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:05.681360 systemd[1]: sshd@10-89.167.87.208:22-4.153.228.146:45970.service: Deactivated successfully.
Mar 13 00:39:05.683512 systemd[1]: session-8.scope: Deactivated successfully.
Mar 13 00:39:05.685684 systemd-logind[1602]: Session 8 logged out. Waiting for processes to exit.
Mar 13 00:39:05.687589 systemd-logind[1602]: Removed session 8.
Mar 13 00:39:10.813251 systemd[1]: Started sshd@11-89.167.87.208:22-4.153.228.146:37240.service - OpenSSH per-connection server daemon (4.153.228.146:37240).
Mar 13 00:39:11.472417 sshd[5581]: Accepted publickey for core from 4.153.228.146 port 37240 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:11.473910 sshd-session[5581]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:11.479196 systemd-logind[1602]: New session 9 of user core.
Mar 13 00:39:11.486609 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 13 00:39:11.887928 sshd[5586]: Connection closed by 4.153.228.146 port 37240
Mar 13 00:39:11.888675 sshd-session[5581]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:11.893655 systemd-logind[1602]: Session 9 logged out. Waiting for processes to exit.
Mar 13 00:39:11.895392 systemd[1]: sshd@11-89.167.87.208:22-4.153.228.146:37240.service: Deactivated successfully.
Mar 13 00:39:11.898143 systemd[1]: session-9.scope: Deactivated successfully.
Mar 13 00:39:11.900826 systemd-logind[1602]: Removed session 9.
Mar 13 00:39:17.022188 systemd[1]: Started sshd@12-89.167.87.208:22-4.153.228.146:37250.service - OpenSSH per-connection server daemon (4.153.228.146:37250).
Mar 13 00:39:17.681562 sshd[5599]: Accepted publickey for core from 4.153.228.146 port 37250 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:17.684163 sshd-session[5599]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:17.692674 systemd-logind[1602]: New session 10 of user core.
Mar 13 00:39:17.701718 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 13 00:39:18.103972 sshd[5602]: Connection closed by 4.153.228.146 port 37250
Mar 13 00:39:18.105895 sshd-session[5599]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:18.118374 systemd[1]: sshd@12-89.167.87.208:22-4.153.228.146:37250.service: Deactivated successfully.
Mar 13 00:39:18.124292 systemd[1]: session-10.scope: Deactivated successfully.
Mar 13 00:39:18.125888 systemd-logind[1602]: Session 10 logged out. Waiting for processes to exit.
Mar 13 00:39:18.128620 systemd-logind[1602]: Removed session 10.
Mar 13 00:39:18.234351 systemd[1]: Started sshd@13-89.167.87.208:22-4.153.228.146:37256.service - OpenSSH per-connection server daemon (4.153.228.146:37256).
Mar 13 00:39:18.872191 sshd[5622]: Accepted publickey for core from 4.153.228.146 port 37256 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:18.874858 sshd-session[5622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:18.884079 systemd-logind[1602]: New session 11 of user core.
Mar 13 00:39:18.891780 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 13 00:39:19.363999 sshd[5634]: Connection closed by 4.153.228.146 port 37256
Mar 13 00:39:19.364967 sshd-session[5622]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:19.368791 systemd-logind[1602]: Session 11 logged out. Waiting for processes to exit.
Mar 13 00:39:19.369581 systemd[1]: sshd@13-89.167.87.208:22-4.153.228.146:37256.service: Deactivated successfully.
Mar 13 00:39:19.371585 systemd[1]: session-11.scope: Deactivated successfully.
Mar 13 00:39:19.372991 systemd-logind[1602]: Removed session 11.
Mar 13 00:39:19.501263 systemd[1]: Started sshd@14-89.167.87.208:22-4.153.228.146:38746.service - OpenSSH per-connection server daemon (4.153.228.146:38746).
Mar 13 00:39:20.143528 sshd[5644]: Accepted publickey for core from 4.153.228.146 port 38746 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:20.144494 sshd-session[5644]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:20.151039 systemd-logind[1602]: New session 12 of user core.
Mar 13 00:39:20.155592 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 13 00:39:20.591881 sshd[5647]: Connection closed by 4.153.228.146 port 38746
Mar 13 00:39:20.592669 sshd-session[5644]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:20.596902 systemd[1]: sshd@14-89.167.87.208:22-4.153.228.146:38746.service: Deactivated successfully.
Mar 13 00:39:20.599378 systemd[1]: session-12.scope: Deactivated successfully.
Mar 13 00:39:20.601323 systemd-logind[1602]: Session 12 logged out. Waiting for processes to exit.
Mar 13 00:39:20.603808 systemd-logind[1602]: Removed session 12.
Mar 13 00:39:21.248525 systemd[1]: Started sshd@15-89.167.87.208:22-103.237.144.204:47254.service - OpenSSH per-connection server daemon (103.237.144.204:47254).
Mar 13 00:39:23.105010 sshd[5682]: Received disconnect from 103.237.144.204 port 47254:11: Bye Bye [preauth]
Mar 13 00:39:23.105010 sshd[5682]: Disconnected from authenticating user root 103.237.144.204 port 47254 [preauth]
Mar 13 00:39:23.107272 systemd[1]: sshd@15-89.167.87.208:22-103.237.144.204:47254.service: Deactivated successfully.
Mar 13 00:39:25.729419 systemd[1]: Started sshd@16-89.167.87.208:22-4.153.228.146:38762.service - OpenSSH per-connection server daemon (4.153.228.146:38762).
Mar 13 00:39:26.378904 sshd[5688]: Accepted publickey for core from 4.153.228.146 port 38762 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:26.381934 sshd-session[5688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:26.391252 systemd-logind[1602]: New session 13 of user core.
Mar 13 00:39:26.396707 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 13 00:39:26.857665 sshd[5691]: Connection closed by 4.153.228.146 port 38762
Mar 13 00:39:26.858710 sshd-session[5688]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:26.865886 systemd-logind[1602]: Session 13 logged out. Waiting for processes to exit.
Mar 13 00:39:26.867117 systemd[1]: sshd@16-89.167.87.208:22-4.153.228.146:38762.service: Deactivated successfully.
Mar 13 00:39:26.871130 systemd[1]: session-13.scope: Deactivated successfully.
Mar 13 00:39:26.874458 systemd-logind[1602]: Removed session 13.
Mar 13 00:39:26.994611 systemd[1]: Started sshd@17-89.167.87.208:22-4.153.228.146:38772.service - OpenSSH per-connection server daemon (4.153.228.146:38772).
Mar 13 00:39:27.657525 sshd[5725]: Accepted publickey for core from 4.153.228.146 port 38772 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:27.661154 sshd-session[5725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:27.675843 systemd-logind[1602]: New session 14 of user core.
Mar 13 00:39:27.684686 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 13 00:39:28.246158 sshd[5736]: Connection closed by 4.153.228.146 port 38772
Mar 13 00:39:28.248259 sshd-session[5725]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:28.256605 systemd[1]: sshd@17-89.167.87.208:22-4.153.228.146:38772.service: Deactivated successfully.
Mar 13 00:39:28.260312 systemd[1]: session-14.scope: Deactivated successfully.
Mar 13 00:39:28.261825 systemd-logind[1602]: Session 14 logged out. Waiting for processes to exit.
Mar 13 00:39:28.263718 systemd-logind[1602]: Removed session 14.
Mar 13 00:39:28.379846 systemd[1]: Started sshd@18-89.167.87.208:22-4.153.228.146:38782.service - OpenSSH per-connection server daemon (4.153.228.146:38782).
Mar 13 00:39:29.049533 sshd[5748]: Accepted publickey for core from 4.153.228.146 port 38782 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:29.051268 sshd-session[5748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:29.058806 systemd-logind[1602]: New session 15 of user core.
Mar 13 00:39:29.073706 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 13 00:39:30.065543 sshd[5751]: Connection closed by 4.153.228.146 port 38782
Mar 13 00:39:30.066651 sshd-session[5748]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:30.070759 systemd-logind[1602]: Session 15 logged out. Waiting for processes to exit.
Mar 13 00:39:30.071294 systemd[1]: sshd@18-89.167.87.208:22-4.153.228.146:38782.service: Deactivated successfully.
Mar 13 00:39:30.073412 systemd[1]: session-15.scope: Deactivated successfully.
Mar 13 00:39:30.075690 systemd-logind[1602]: Removed session 15.
Mar 13 00:39:30.199913 systemd[1]: Started sshd@19-89.167.87.208:22-4.153.228.146:55322.service - OpenSSH per-connection server daemon (4.153.228.146:55322).
Mar 13 00:39:30.835524 sshd[5776]: Accepted publickey for core from 4.153.228.146 port 55322 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:30.838150 sshd-session[5776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:30.847630 systemd-logind[1602]: New session 16 of user core.
Mar 13 00:39:30.855768 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 13 00:39:31.375301 sshd[5802]: Connection closed by 4.153.228.146 port 55322
Mar 13 00:39:31.376781 sshd-session[5776]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:31.383083 systemd[1]: sshd@19-89.167.87.208:22-4.153.228.146:55322.service: Deactivated successfully.
Mar 13 00:39:31.388806 systemd[1]: session-16.scope: Deactivated successfully.
Mar 13 00:39:31.391284 systemd-logind[1602]: Session 16 logged out. Waiting for processes to exit.
Mar 13 00:39:31.395106 systemd-logind[1602]: Removed session 16.
Mar 13 00:39:31.504156 systemd[1]: Started sshd@20-89.167.87.208:22-4.153.228.146:55326.service - OpenSSH per-connection server daemon (4.153.228.146:55326).
Mar 13 00:39:32.141462 sshd[5812]: Accepted publickey for core from 4.153.228.146 port 55326 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:32.142065 sshd-session[5812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:32.147107 systemd-logind[1602]: New session 17 of user core.
Mar 13 00:39:32.153661 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 13 00:39:32.567656 sshd[5815]: Connection closed by 4.153.228.146 port 55326
Mar 13 00:39:32.568687 sshd-session[5812]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:32.573033 systemd[1]: sshd@20-89.167.87.208:22-4.153.228.146:55326.service: Deactivated successfully.
Mar 13 00:39:32.573244 systemd-logind[1602]: Session 17 logged out. Waiting for processes to exit.
Mar 13 00:39:32.575260 systemd[1]: session-17.scope: Deactivated successfully.
Mar 13 00:39:32.576840 systemd-logind[1602]: Removed session 17.
Mar 13 00:39:37.703588 systemd[1]: Started sshd@21-89.167.87.208:22-4.153.228.146:55332.service - OpenSSH per-connection server daemon (4.153.228.146:55332).
Mar 13 00:39:38.370537 sshd[5854]: Accepted publickey for core from 4.153.228.146 port 55332 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:38.373071 sshd-session[5854]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:38.384607 systemd-logind[1602]: New session 18 of user core.
Mar 13 00:39:38.389955 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 13 00:39:38.832615 sshd[5857]: Connection closed by 4.153.228.146 port 55332
Mar 13 00:39:38.834236 sshd-session[5854]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:38.838726 systemd[1]: sshd@21-89.167.87.208:22-4.153.228.146:55332.service: Deactivated successfully.
Mar 13 00:39:38.838798 systemd-logind[1602]: Session 18 logged out. Waiting for processes to exit.
Mar 13 00:39:38.840708 systemd[1]: session-18.scope: Deactivated successfully.
Mar 13 00:39:38.843129 systemd-logind[1602]: Removed session 18.
Mar 13 00:39:43.966284 systemd[1]: Started sshd@22-89.167.87.208:22-4.153.228.146:41004.service - OpenSSH per-connection server daemon (4.153.228.146:41004).
Mar 13 00:39:44.608537 sshd[5893]: Accepted publickey for core from 4.153.228.146 port 41004 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:39:44.610852 sshd-session[5893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:39:44.619428 systemd-logind[1602]: New session 19 of user core.
Mar 13 00:39:44.625724 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 13 00:39:45.080800 sshd[5914]: Connection closed by 4.153.228.146 port 41004
Mar 13 00:39:45.082454 sshd-session[5893]: pam_unix(sshd:session): session closed for user core
Mar 13 00:39:45.086117 systemd[1]: sshd@22-89.167.87.208:22-4.153.228.146:41004.service: Deactivated successfully.
Mar 13 00:39:45.088145 systemd[1]: session-19.scope: Deactivated successfully.
Mar 13 00:39:45.091105 systemd-logind[1602]: Session 19 logged out. Waiting for processes to exit.
Mar 13 00:39:45.093717 systemd-logind[1602]: Removed session 19.
Mar 13 00:40:34.941579 systemd[1]: cri-containerd-267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975.scope: Deactivated successfully.
Mar 13 00:40:34.943239 systemd[1]: cri-containerd-267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975.scope: Consumed 3.456s CPU time, 58.6M memory peak, 64K read from disk.
Mar 13 00:40:34.948085 containerd[1633]: time="2026-03-13T00:40:34.947924642Z" level=info msg="received container exit event container_id:\"267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975\" id:\"267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975\" pid:2610 exit_status:1 exited_at:{seconds:1773362434 nanos:947024847}"
Mar 13 00:40:34.996533 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975-rootfs.mount: Deactivated successfully.
Mar 13 00:40:35.159534 kubelet[2778]: E0313 00:40:35.159351 2778 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:33672->10.0.0.2:2379: read: connection timed out"
Mar 13 00:40:35.165699 systemd[1]: cri-containerd-9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead.scope: Deactivated successfully.
Mar 13 00:40:35.166542 systemd[1]: cri-containerd-9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead.scope: Consumed 1.858s CPU time, 23.1M memory peak, 376K read from disk.
Mar 13 00:40:35.168216 containerd[1633]: time="2026-03-13T00:40:35.168180858Z" level=info msg="received container exit event container_id:\"9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead\" id:\"9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead\" pid:2629 exit_status:1 exited_at:{seconds:1773362435 nanos:167921837}"
Mar 13 00:40:35.199271 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead-rootfs.mount: Deactivated successfully.
Mar 13 00:40:35.561041 systemd[1]: cri-containerd-8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f.scope: Deactivated successfully.
Mar 13 00:40:35.563694 systemd[1]: cri-containerd-8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f.scope: Consumed 8.515s CPU time, 151.6M memory peak, 604K read from disk.
Mar 13 00:40:35.568311 containerd[1633]: time="2026-03-13T00:40:35.568232593Z" level=info msg="received container exit event container_id:\"8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f\" id:\"8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f\" pid:3111 exit_status:1 exited_at:{seconds:1773362435 nanos:567776171}"
Mar 13 00:40:35.596132 kubelet[2778]: I0313 00:40:35.596102 2778 scope.go:117] "RemoveContainer" containerID="9a835eb74ceafecb6e1233ff8ac76dcc26dd9e9d116c8941a5b9e8e18c716ead"
Mar 13 00:40:35.599176 containerd[1633]: time="2026-03-13T00:40:35.599143596Z" level=info msg="CreateContainer within sandbox \"5b75cf094cc77c77d668712426fa51a6afce040b8f5590ebd8f9e5606cb83ebe\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 13 00:40:35.602060 kubelet[2778]: I0313 00:40:35.602039 2778 scope.go:117] "RemoveContainer" containerID="267b1a2ee333e933aad955eaff3dcc25a62173231acf15385235b9106514d975"
Mar 13 00:40:35.605934 containerd[1633]: time="2026-03-13T00:40:35.605801487Z" level=info msg="CreateContainer within sandbox \"9733e54e71f5bcb193c00e8fd100e318f12ab864e6ae2f902d76723f082d2677\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 13 00:40:35.614803 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f-rootfs.mount: Deactivated successfully.
Mar 13 00:40:35.625468 containerd[1633]: time="2026-03-13T00:40:35.625447917Z" level=info msg="Container 7e53ef254b8f79c69471d3af61de8cfc03ba6758d1cafe2ccda6434e72403afd: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:40:35.631164 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount376954302.mount: Deactivated successfully.
Mar 13 00:40:35.633009 containerd[1633]: time="2026-03-13T00:40:35.632857021Z" level=info msg="Container 09edf95789ce34b1f2351c328286406f2c4e7989a2a1ed1d4d5ab13254a11237: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:40:35.636762 containerd[1633]: time="2026-03-13T00:40:35.636731329Z" level=info msg="CreateContainer within sandbox \"5b75cf094cc77c77d668712426fa51a6afce040b8f5590ebd8f9e5606cb83ebe\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"7e53ef254b8f79c69471d3af61de8cfc03ba6758d1cafe2ccda6434e72403afd\"" Mar 13 00:40:35.638384 containerd[1633]: time="2026-03-13T00:40:35.638359457Z" level=info msg="StartContainer for \"7e53ef254b8f79c69471d3af61de8cfc03ba6758d1cafe2ccda6434e72403afd\"" Mar 13 00:40:35.640135 containerd[1633]: time="2026-03-13T00:40:35.640117755Z" level=info msg="connecting to shim 7e53ef254b8f79c69471d3af61de8cfc03ba6758d1cafe2ccda6434e72403afd" address="unix:///run/containerd/s/504e720ce657650fa69c8108c526b54491ab2322dab91a41246a94533dc4fc43" protocol=ttrpc version=3 Mar 13 00:40:35.643488 containerd[1633]: time="2026-03-13T00:40:35.643170739Z" level=info msg="CreateContainer within sandbox \"9733e54e71f5bcb193c00e8fd100e318f12ab864e6ae2f902d76723f082d2677\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"09edf95789ce34b1f2351c328286406f2c4e7989a2a1ed1d4d5ab13254a11237\"" Mar 13 00:40:35.643488 containerd[1633]: time="2026-03-13T00:40:35.643401620Z" level=info msg="StartContainer for \"09edf95789ce34b1f2351c328286406f2c4e7989a2a1ed1d4d5ab13254a11237\"" Mar 13 00:40:35.644225 containerd[1633]: time="2026-03-13T00:40:35.644206254Z" level=info msg="connecting to shim 09edf95789ce34b1f2351c328286406f2c4e7989a2a1ed1d4d5ab13254a11237" address="unix:///run/containerd/s/fbba04a9df17e65e9fffb84e6f6c2eb2fa4c54999b54dab12a6224b9b8d6ba10" protocol=ttrpc version=3 Mar 13 00:40:35.658589 systemd[1]: Started 
cri-containerd-7e53ef254b8f79c69471d3af61de8cfc03ba6758d1cafe2ccda6434e72403afd.scope - libcontainer container 7e53ef254b8f79c69471d3af61de8cfc03ba6758d1cafe2ccda6434e72403afd. Mar 13 00:40:35.662176 systemd[1]: Started cri-containerd-09edf95789ce34b1f2351c328286406f2c4e7989a2a1ed1d4d5ab13254a11237.scope - libcontainer container 09edf95789ce34b1f2351c328286406f2c4e7989a2a1ed1d4d5ab13254a11237. Mar 13 00:40:35.712094 containerd[1633]: time="2026-03-13T00:40:35.712046667Z" level=info msg="StartContainer for \"7e53ef254b8f79c69471d3af61de8cfc03ba6758d1cafe2ccda6434e72403afd\" returns successfully" Mar 13 00:40:35.716177 containerd[1633]: time="2026-03-13T00:40:35.716127066Z" level=info msg="StartContainer for \"09edf95789ce34b1f2351c328286406f2c4e7989a2a1ed1d4d5ab13254a11237\" returns successfully" Mar 13 00:40:36.611088 kubelet[2778]: I0313 00:40:36.611061 2778 scope.go:117] "RemoveContainer" containerID="8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f" Mar 13 00:40:36.612366 containerd[1633]: time="2026-03-13T00:40:36.612342374Z" level=info msg="CreateContainer within sandbox \"653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Mar 13 00:40:36.630319 containerd[1633]: time="2026-03-13T00:40:36.625545985Z" level=info msg="Container 643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:40:36.633191 containerd[1633]: time="2026-03-13T00:40:36.633168941Z" level=info msg="CreateContainer within sandbox \"653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548\"" Mar 13 00:40:36.633571 containerd[1633]: time="2026-03-13T00:40:36.633555182Z" level=info msg="StartContainer for \"643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548\"" Mar 13 
00:40:36.634132 containerd[1633]: time="2026-03-13T00:40:36.634114474Z" level=info msg="connecting to shim 643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548" address="unix:///run/containerd/s/8666144c9d49d8abfd125abd2c04920ac5b8771514a392bfa701b75e6f680650" protocol=ttrpc version=3 Mar 13 00:40:36.660796 systemd[1]: Started cri-containerd-643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548.scope - libcontainer container 643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548. Mar 13 00:40:36.726111 containerd[1633]: time="2026-03-13T00:40:36.725620755Z" level=info msg="StartContainer for \"643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548\" returns successfully" Mar 13 00:40:38.700330 systemd[1]: cri-containerd-643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548.scope: Deactivated successfully. Mar 13 00:40:38.700939 containerd[1633]: time="2026-03-13T00:40:38.700425864Z" level=info msg="received container exit event container_id:\"643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548\" id:\"643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548\" pid:6212 exit_status:1 exited_at:{seconds:1773362438 nanos:700223533}" Mar 13 00:40:38.725634 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548-rootfs.mount: Deactivated successfully. 
Mar 13 00:40:39.315908 kubelet[2778]: I0313 00:40:39.315826 2778 status_manager.go:895] "Failed to get status for pod" podUID="b3296c0246d369a29c26aaca40ad972b" pod="kube-system/kube-apiserver-ci-4459-2-4-n-86976195a3" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:33556->10.0.0.2:2379: read: connection timed out" Mar 13 00:40:39.627053 kubelet[2778]: I0313 00:40:39.626901 2778 scope.go:117] "RemoveContainer" containerID="8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f" Mar 13 00:40:39.628376 kubelet[2778]: I0313 00:40:39.628307 2778 scope.go:117] "RemoveContainer" containerID="643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548" Mar 13 00:40:39.629009 kubelet[2778]: E0313 00:40:39.628705 2778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6bf85f8dd-gpvgn_tigera-operator(233a5fff-dac5-4b37-94e2-164ed589f546)\"" pod="tigera-operator/tigera-operator-6bf85f8dd-gpvgn" podUID="233a5fff-dac5-4b37-94e2-164ed589f546" Mar 13 00:40:39.630823 containerd[1633]: time="2026-03-13T00:40:39.630772889Z" level=info msg="RemoveContainer for \"8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f\"" Mar 13 00:40:39.645910 containerd[1633]: time="2026-03-13T00:40:39.645735246Z" level=info msg="RemoveContainer for \"8f2dc2fe2ff355790c74a73c01db852af141416c2793e15c4733fe4a1622b50f\" returns successfully" Mar 13 00:40:39.959937 kubelet[2778]: E0313 00:40:39.955943 2778 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:33446->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-n-86976195a3.189c3fbbe418041d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-n-86976195a3,UID:b3296c0246d369a29c26aaca40ad972b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-86976195a3,},FirstTimestamp:2026-03-13 00:40:29.495346205 +0000 UTC m=+174.508301580,LastTimestamp:2026-03-13 00:40:29.495346205 +0000 UTC m=+174.508301580,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-86976195a3,}" Mar 13 00:40:45.160310 kubelet[2778]: E0313 00:40:45.159772 2778 controller.go:195] "Failed to update lease" err="Put \"https://89.167.87.208:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-86976195a3?timeout=10s\": context deadline exceeded" Mar 13 00:40:50.069773 kubelet[2778]: I0313 00:40:50.069693 2778 scope.go:117] "RemoveContainer" containerID="643fe34f45aad80fe07ce6b5d861ec6d230e3e2c3dc6d055102c9ac3a92e3548" Mar 13 00:40:50.073466 containerd[1633]: time="2026-03-13T00:40:50.073391479Z" level=info msg="CreateContainer within sandbox \"653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}" Mar 13 00:40:50.086004 containerd[1633]: time="2026-03-13T00:40:50.085940274Z" level=info msg="Container be7c61fc780735b186226a10eaa9d934a0acdf17ef2b125845983ec3d710eee0: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:40:50.101230 containerd[1633]: time="2026-03-13T00:40:50.101157009Z" level=info msg="CreateContainer within sandbox \"653df142b714da5a83195cdf9273f7d325f85e6976e38cc4dcf61d92ceded312\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"be7c61fc780735b186226a10eaa9d934a0acdf17ef2b125845983ec3d710eee0\"" Mar 13 00:40:50.102943 
containerd[1633]: time="2026-03-13T00:40:50.102656235Z" level=info msg="StartContainer for \"be7c61fc780735b186226a10eaa9d934a0acdf17ef2b125845983ec3d710eee0\"" Mar 13 00:40:50.105628 containerd[1633]: time="2026-03-13T00:40:50.105517218Z" level=info msg="connecting to shim be7c61fc780735b186226a10eaa9d934a0acdf17ef2b125845983ec3d710eee0" address="unix:///run/containerd/s/8666144c9d49d8abfd125abd2c04920ac5b8771514a392bfa701b75e6f680650" protocol=ttrpc version=3 Mar 13 00:40:50.142738 systemd[1]: Started cri-containerd-be7c61fc780735b186226a10eaa9d934a0acdf17ef2b125845983ec3d710eee0.scope - libcontainer container be7c61fc780735b186226a10eaa9d934a0acdf17ef2b125845983ec3d710eee0. Mar 13 00:40:50.169199 containerd[1633]: time="2026-03-13T00:40:50.168728142Z" level=info msg="StartContainer for \"be7c61fc780735b186226a10eaa9d934a0acdf17ef2b125845983ec3d710eee0\" returns successfully" Mar 13 00:40:55.162978 kubelet[2778]: E0313 00:40:55.160706 2778 controller.go:195] "Failed to update lease" err="Put \"https://89.167.87.208:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-86976195a3?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"