Mar 13 00:35:17.801561 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 12 22:08:29 -00 2026
Mar 13 00:35:17.801603 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:35:17.801613 kernel: BIOS-provided physical RAM map:
Mar 13 00:35:17.801619 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 13 00:35:17.801625 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Mar 13 00:35:17.801631 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Mar 13 00:35:17.801640 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Mar 13 00:35:17.801646 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Mar 13 00:35:17.801652 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Mar 13 00:35:17.801658 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Mar 13 00:35:17.801664 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable
Mar 13 00:35:17.801669 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Mar 13 00:35:17.801675 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Mar 13 00:35:17.801681 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Mar 13 00:35:17.801690 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Mar 13 00:35:17.801696 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 13 00:35:17.801702 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 13 00:35:17.801708 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 13 00:35:17.801715 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable
Mar 13 00:35:17.801721 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Mar 13 00:35:17.801727 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Mar 13 00:35:17.801734 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable
Mar 13 00:35:17.801740 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Mar 13 00:35:17.801747 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Mar 13 00:35:17.801753 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 13 00:35:17.801759 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 13 00:35:17.801788 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 13 00:35:17.801794 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable
Mar 13 00:35:17.801800 kernel: NX (Execute Disable) protection: active
Mar 13 00:35:17.801806 kernel: APIC: Static calls initialized
Mar 13 00:35:17.801812 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable
Mar 13 00:35:17.801819 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable
Mar 13 00:35:17.801825 kernel: extended physical RAM map:
Mar 13 00:35:17.801833 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 13 00:35:17.801839 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Mar 13 00:35:17.801845 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Mar 13 00:35:17.801851 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Mar 13 00:35:17.801858 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Mar 13 00:35:17.801864 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Mar 13 00:35:17.801870 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Mar 13 00:35:17.801879 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable
Mar 13 00:35:17.801888 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable
Mar 13 00:35:17.801894 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable
Mar 13 00:35:17.801901 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable
Mar 13 00:35:17.801907 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable
Mar 13 00:35:17.801914 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Mar 13 00:35:17.801920 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Mar 13 00:35:17.801927 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Mar 13 00:35:17.801935 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Mar 13 00:35:17.801941 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 13 00:35:17.801947 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 13 00:35:17.801954 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 13 00:35:17.801961 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable
Mar 13 00:35:17.801967 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Mar 13 00:35:17.801973 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Mar 13 00:35:17.801980 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable
Mar 13 00:35:17.801986 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Mar 13 00:35:17.801993 kernel: reserve setup_data: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Mar 13 00:35:17.801999 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 13 00:35:17.802007 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 13 00:35:17.802014 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 13 00:35:17.802020 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable
Mar 13 00:35:17.802026 kernel: efi: EFI v2.7 by EDK II
Mar 13 00:35:17.802033 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018
Mar 13 00:35:17.802040 kernel: random: crng init done
Mar 13 00:35:17.802046 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 13 00:35:17.802053 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 13 00:35:17.802059 kernel: secureboot: Secure boot disabled
Mar 13 00:35:17.802065 kernel: SMBIOS 2.8 present.
Mar 13 00:35:17.802072 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Mar 13 00:35:17.802078 kernel: DMI: Memory slots populated: 1/1
Mar 13 00:35:17.802086 kernel: Hypervisor detected: KVM
Mar 13 00:35:17.802093 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Mar 13 00:35:17.802099 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 13 00:35:17.802106 kernel: kvm-clock: using sched offset of 6859493954 cycles
Mar 13 00:35:17.802112 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 13 00:35:17.802119 kernel: tsc: Detected 2294.608 MHz processor
Mar 13 00:35:17.802126 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 13 00:35:17.802133 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 13 00:35:17.802140 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000
Mar 13 00:35:17.802147 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 13 00:35:17.802155 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 13 00:35:17.802162 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Mar 13 00:35:17.802168 kernel: Using GB pages for direct mapping
Mar 13 00:35:17.802175 kernel: ACPI: Early table checksum verification disabled
Mar 13 00:35:17.802182 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Mar 13 00:35:17.802189 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013)
Mar 13 00:35:17.802196 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:35:17.802202 kernel: ACPI: DSDT 0x000000007FB78000 00424E (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:35:17.802209 kernel: ACPI: FACS 0x000000007FBDD000 000040
Mar 13 00:35:17.802217 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:35:17.802224 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:35:17.802230 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:35:17.802237 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 13 00:35:17.802243 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3]
Mar 13 00:35:17.802250 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c24d]
Mar 13 00:35:17.802257 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Mar 13 00:35:17.802263 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f]
Mar 13 00:35:17.802270 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b]
Mar 13 00:35:17.802278 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027]
Mar 13 00:35:17.802285 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037]
Mar 13 00:35:17.802291 kernel: No NUMA configuration found
Mar 13 00:35:17.802298 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff]
Mar 13 00:35:17.802305 kernel: NODE_DATA(0) allocated [mem 0x17fff6dc0-0x17fffdfff]
Mar 13 00:35:17.802311 kernel: Zone ranges:
Mar 13 00:35:17.802318 kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Mar 13 00:35:17.802325 kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Mar 13 00:35:17.802331 kernel:   Normal   [mem 0x0000000100000000-0x000000017fffffff]
Mar 13 00:35:17.802340 kernel:   Device   empty
Mar 13 00:35:17.802346 kernel: Movable zone start for each node
Mar 13 00:35:17.802353 kernel: Early memory node ranges
Mar 13 00:35:17.802360 kernel:   node   0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 13 00:35:17.802366 kernel:   node   0: [mem 0x0000000000100000-0x00000000007fffff]
Mar 13 00:35:17.802373 kernel:   node   0: [mem 0x0000000000808000-0x000000000080afff]
Mar 13 00:35:17.802380 kernel:   node   0: [mem 0x000000000080c000-0x0000000000810fff]
Mar 13 00:35:17.802386 kernel:   node   0: [mem 0x0000000000900000-0x000000007e93efff]
Mar 13 00:35:17.802393 kernel:   node   0: [mem 0x000000007ea00000-0x000000007ec70fff]
Mar 13 00:35:17.802400 kernel:   node   0: [mem 0x000000007ed85000-0x000000007f8ecfff]
Mar 13 00:35:17.802413 kernel:   node   0: [mem 0x000000007fbff000-0x000000007feaefff]
Mar 13 00:35:17.802421 kernel:   node   0: [mem 0x000000007feb5000-0x000000007feebfff]
Mar 13 00:35:17.802428 kernel:   node   0: [mem 0x0000000100000000-0x000000017fffffff]
Mar 13 00:35:17.802437 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff]
Mar 13 00:35:17.802444 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 13 00:35:17.802451 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 13 00:35:17.802458 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Mar 13 00:35:17.802465 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 13 00:35:17.802474 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Mar 13 00:35:17.802481 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 13 00:35:17.802488 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges
Mar 13 00:35:17.802496 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 13 00:35:17.802503 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Mar 13 00:35:17.802510 kernel: On node 0, zone Normal: 276 pages in unavailable ranges
Mar 13 00:35:17.802518 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 13 00:35:17.802525 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 13 00:35:17.802532 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 13 00:35:17.802541 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 13 00:35:17.802549 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 13 00:35:17.802556 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 13 00:35:17.802573 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 13 00:35:17.802581 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 13 00:35:17.802588 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 13 00:35:17.802595 kernel: TSC deadline timer available
Mar 13 00:35:17.802603 kernel: CPU topo: Max. logical packages: 2
Mar 13 00:35:17.802610 kernel: CPU topo: Max. logical dies: 2
Mar 13 00:35:17.802619 kernel: CPU topo: Max. dies per package: 1
Mar 13 00:35:17.802626 kernel: CPU topo: Max. threads per core: 1
Mar 13 00:35:17.802633 kernel: CPU topo: Num. cores per package: 1
Mar 13 00:35:17.802641 kernel: CPU topo: Num. threads per package: 1
Mar 13 00:35:17.802648 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Mar 13 00:35:17.802655 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 13 00:35:17.802663 kernel: kvm-guest: KVM setup pv remote TLB flush
Mar 13 00:35:17.802670 kernel: kvm-guest: setup PV sched yield
Mar 13 00:35:17.802677 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Mar 13 00:35:17.802685 kernel: Booting paravirtualized kernel on KVM
Mar 13 00:35:17.802694 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 13 00:35:17.802701 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 13 00:35:17.802708 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Mar 13 00:35:17.802715 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Mar 13 00:35:17.802723 kernel: pcpu-alloc: [0] 0 1
Mar 13 00:35:17.802730 kernel: kvm-guest: PV spinlocks enabled
Mar 13 00:35:17.802737 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 13 00:35:17.802746 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:35:17.802755 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 13 00:35:17.802762 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 13 00:35:17.802770 kernel: Fallback order for Node 0: 0
Mar 13 00:35:17.802777 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1046694
Mar 13 00:35:17.802784 kernel: Policy zone: Normal
Mar 13 00:35:17.802791 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 13 00:35:17.802799 kernel: software IO TLB: area num 2.
Mar 13 00:35:17.802806 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 13 00:35:17.802813 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 13 00:35:17.802822 kernel: ftrace: allocated 157 pages with 5 groups
Mar 13 00:35:17.802829 kernel: Dynamic Preempt: voluntary
Mar 13 00:35:17.802837 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 13 00:35:17.802845 kernel: rcu: RCU event tracing is enabled.
Mar 13 00:35:17.802852 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 13 00:35:17.802860 kernel: Trampoline variant of Tasks RCU enabled.
Mar 13 00:35:17.802867 kernel: Rude variant of Tasks RCU enabled.
Mar 13 00:35:17.802874 kernel: Tracing variant of Tasks RCU enabled.
Mar 13 00:35:17.802882 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 13 00:35:17.802890 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 13 00:35:17.802898 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:35:17.802905 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:35:17.802912 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:35:17.802920 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 13 00:35:17.802927 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 13 00:35:17.802934 kernel: Console: colour dummy device 80x25
Mar 13 00:35:17.802942 kernel: printk: legacy console [tty0] enabled
Mar 13 00:35:17.802949 kernel: printk: legacy console [ttyS0] enabled
Mar 13 00:35:17.802958 kernel: ACPI: Core revision 20240827
Mar 13 00:35:17.802966 kernel: APIC: Switch to symmetric I/O mode setup
Mar 13 00:35:17.802973 kernel: x2apic enabled
Mar 13 00:35:17.802980 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 13 00:35:17.802987 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Mar 13 00:35:17.802995 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Mar 13 00:35:17.803002 kernel: kvm-guest: setup PV IPIs
Mar 13 00:35:17.803009 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns
Mar 13 00:35:17.803017 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608)
Mar 13 00:35:17.803026 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 13 00:35:17.803033 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 13 00:35:17.803040 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 13 00:35:17.803047 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 13 00:35:17.803054 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Mar 13 00:35:17.803061 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 13 00:35:17.803068 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Mar 13 00:35:17.803075 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 13 00:35:17.803082 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 13 00:35:17.803089 kernel: TAA: Mitigation: Clear CPU buffers
Mar 13 00:35:17.803098 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Mar 13 00:35:17.803105 kernel: active return thunk: its_return_thunk
Mar 13 00:35:17.803112 kernel: ITS: Mitigation: Aligned branch/return thunks
Mar 13 00:35:17.803119 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 13 00:35:17.803126 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 13 00:35:17.803133 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 13 00:35:17.803140 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 13 00:35:17.803147 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 13 00:35:17.803154 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 13 00:35:17.803161 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 13 00:35:17.803169 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Mar 13 00:35:17.803177 kernel: x86/fpu: xstate_offset[5]:  832, xstate_sizes[5]:   64
Mar 13 00:35:17.803184 kernel: x86/fpu: xstate_offset[6]:  896, xstate_sizes[6]:  512
Mar 13 00:35:17.803191 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 13 00:35:17.803198 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]:    8
Mar 13 00:35:17.803205 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 13 00:35:17.803212 kernel: Freeing SMP alternatives memory: 32K
Mar 13 00:35:17.803219 kernel: pid_max: default: 32768 minimum: 301
Mar 13 00:35:17.803226 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 13 00:35:17.803233 kernel: landlock: Up and running.
Mar 13 00:35:17.803240 kernel: SELinux:  Initializing.
Mar 13 00:35:17.803248 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 13 00:35:17.803255 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 13 00:35:17.803264 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6)
Mar 13 00:35:17.803271 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver.
Mar 13 00:35:17.803279 kernel: ... version:                2
Mar 13 00:35:17.803286 kernel: ... bit width:              48
Mar 13 00:35:17.803293 kernel: ... generic registers:      8
Mar 13 00:35:17.803301 kernel: ... value mask:             0000ffffffffffff
Mar 13 00:35:17.803308 kernel: ... max period:             00007fffffffffff
Mar 13 00:35:17.803315 kernel: ... fixed-purpose events:   3
Mar 13 00:35:17.803322 kernel: ... event mask:             00000007000000ff
Mar 13 00:35:17.803331 kernel: signal: max sigframe size: 3632
Mar 13 00:35:17.803338 kernel: rcu: Hierarchical SRCU implementation.
Mar 13 00:35:17.803346 kernel: rcu: 	Max phase no-delay instances is 400.
Mar 13 00:35:17.803353 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 13 00:35:17.803361 kernel: smp: Bringing up secondary CPUs ...
Mar 13 00:35:17.803368 kernel: smpboot: x86: Booting SMP configuration:
Mar 13 00:35:17.803376 kernel: .... node  #0, CPUs:       #1
Mar 13 00:35:17.803383 kernel: smp: Brought up 1 node, 2 CPUs
Mar 13 00:35:17.803390 kernel: smpboot: Total of 2 processors activated (9178.43 BogoMIPS)
Mar 13 00:35:17.803400 kernel: Memory: 3945188K/4186776K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46200K init, 2560K bss, 236712K reserved, 0K cma-reserved)
Mar 13 00:35:17.803407 kernel: devtmpfs: initialized
Mar 13 00:35:17.803414 kernel: x86/mm: Memory block size: 128MB
Mar 13 00:35:17.803422 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Mar 13 00:35:17.803429 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Mar 13 00:35:17.803437 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Mar 13 00:35:17.803444 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Mar 13 00:35:17.803451 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes)
Mar 13 00:35:17.803459 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes)
Mar 13 00:35:17.803468 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 13 00:35:17.803475 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 13 00:35:17.803482 kernel: pinctrl core: initialized pinctrl subsystem
Mar 13 00:35:17.803490 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 13 00:35:17.803497 kernel: audit: initializing netlink subsys (disabled)
Mar 13 00:35:17.803504 kernel: audit: type=2000 audit(1773362114.167:1): state=initialized audit_enabled=0 res=1
Mar 13 00:35:17.803511 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 13 00:35:17.803519 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 13 00:35:17.803526 kernel: cpuidle: using governor menu
Mar 13 00:35:17.803535 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 13 00:35:17.803542 kernel: dca service started, version 1.12.1
Mar 13 00:35:17.803549 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Mar 13 00:35:17.803557 kernel: PCI: Using configuration type 1 for base access
Mar 13 00:35:17.803575 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 13 00:35:17.803583 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 13 00:35:17.803590 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 13 00:35:17.803597 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 13 00:35:17.803605 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 13 00:35:17.803614 kernel: ACPI: Added _OSI(Module Device)
Mar 13 00:35:17.803621 kernel: ACPI: Added _OSI(Processor Device)
Mar 13 00:35:17.803628 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 13 00:35:17.803636 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 13 00:35:17.803643 kernel: ACPI: Interpreter enabled
Mar 13 00:35:17.803650 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 13 00:35:17.803658 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 13 00:35:17.803665 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 13 00:35:17.803672 kernel: PCI: Using E820 reservations for host bridge windows
Mar 13 00:35:17.803681 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 13 00:35:17.803688 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 13 00:35:17.803814 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 13 00:35:17.803888 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 13 00:35:17.803955 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 13 00:35:17.803964 kernel: PCI host bridge to bus 0000:00
Mar 13 00:35:17.804033 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 13 00:35:17.804098 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 13 00:35:17.804158 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 13 00:35:17.804219 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Mar 13 00:35:17.804279 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 13 00:35:17.804339 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window]
Mar 13 00:35:17.804399 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 13 00:35:17.804479 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Mar 13 00:35:17.804561 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Mar 13 00:35:17.804643 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Mar 13 00:35:17.804712 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref]
Mar 13 00:35:17.804781 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff]
Mar 13 00:35:17.804848 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Mar 13 00:35:17.804918 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 13 00:35:17.804997 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.805066 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff]
Mar 13 00:35:17.805135 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 13 00:35:17.805203 kernel: pci 0000:00:02.0:   bridge window [io 0x6000-0x6fff]
Mar 13 00:35:17.805270 kernel: pci 0000:00:02.0:   bridge window [mem 0x84000000-0x842fffff]
Mar 13 00:35:17.805339 kernel: pci 0000:00:02.0:   bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Mar 13 00:35:17.805407 kernel: pci 0000:00:02.0: enabling Extended Tags
Mar 13 00:35:17.805482 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.805551 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff]
Mar 13 00:35:17.805625 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 13 00:35:17.805692 kernel: pci 0000:00:02.1:   bridge window [mem 0x83e00000-0x83ffffff]
Mar 13 00:35:17.805760 kernel: pci 0000:00:02.1:   bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Mar 13 00:35:17.805838 kernel: pci 0000:00:02.1: enabling Extended Tags
Mar 13 00:35:17.805911 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.805984 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff]
Mar 13 00:35:17.806052 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 13 00:35:17.806119 kernel: pci 0000:00:02.2:   bridge window [mem 0x83c00000-0x83dfffff]
Mar 13 00:35:17.806187 kernel: pci 0000:00:02.2:   bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Mar 13 00:35:17.806254 kernel: pci 0000:00:02.2: enabling Extended Tags
Mar 13 00:35:17.806328 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.806398 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff]
Mar 13 00:35:17.806465 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 13 00:35:17.806531 kernel: pci 0000:00:02.3:   bridge window [mem 0x83a00000-0x83bfffff]
Mar 13 00:35:17.807034 kernel: pci 0000:00:02.3:   bridge window [mem 0x381800000000-0x381fffffffff 64bit pref]
Mar 13 00:35:17.807111 kernel: pci 0000:00:02.3: enabling Extended Tags
Mar 13 00:35:17.807188 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.807257 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff]
Mar 13 00:35:17.807329 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 13 00:35:17.807398 kernel: pci 0000:00:02.4:   bridge window [mem 0x83800000-0x839fffff]
Mar 13 00:35:17.807466 kernel: pci 0000:00:02.4:   bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref]
Mar 13 00:35:17.807534 kernel: pci 0000:00:02.4: enabling Extended Tags
Mar 13 00:35:17.807619 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.807689 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff]
Mar 13 00:35:17.807759 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 13 00:35:17.807827 kernel: pci 0000:00:02.5:   bridge window [mem 0x83600000-0x837fffff]
Mar 13 00:35:17.807894 kernel: pci 0000:00:02.5:   bridge window [mem 0x382800000000-0x382fffffffff 64bit pref]
Mar 13 00:35:17.807961 kernel: pci 0000:00:02.5: enabling Extended Tags
Mar 13 00:35:17.808032 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.808100 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff]
Mar 13 00:35:17.808169 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 13 00:35:17.808240 kernel: pci 0000:00:02.6:   bridge window [mem 0x83400000-0x835fffff]
Mar 13 00:35:17.808308 kernel: pci 0000:00:02.6:   bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref]
Mar 13 00:35:17.808375 kernel: pci 0000:00:02.6: enabling Extended Tags
Mar 13 00:35:17.808448 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.808516 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff]
Mar 13 00:35:17.808609 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 13 00:35:17.808687 kernel: pci 0000:00:02.7:   bridge window [mem 0x83200000-0x833fffff]
Mar 13 00:35:17.808768 kernel: pci 0000:00:02.7:   bridge window [mem 0x383800000000-0x383fffffffff 64bit pref]
Mar 13 00:35:17.808839 kernel: pci 0000:00:02.7: enabling Extended Tags
Mar 13 00:35:17.808913 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.808983 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff]
Mar 13 00:35:17.809052 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Mar 13 00:35:17.809119 kernel: pci 0000:00:03.0:   bridge window [mem 0x83000000-0x831fffff]
Mar 13 00:35:17.809187 kernel: pci 0000:00:03.0:   bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref]
Mar 13 00:35:17.809257 kernel: pci 0000:00:03.0: enabling Extended Tags
Mar 13 00:35:17.809329 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.809398 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff]
Mar 13 00:35:17.809465 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Mar 13 00:35:17.809532 kernel: pci 0000:00:03.1:   bridge window [mem 0x82e00000-0x82ffffff]
Mar 13 00:35:17.809617 kernel: pci 0000:00:03.1:   bridge window [mem 0x384800000000-0x384fffffffff 64bit pref]
Mar 13 00:35:17.809693 kernel: pci 0000:00:03.1: enabling Extended Tags
Mar 13 00:35:17.809775 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.809845 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff]
Mar 13 00:35:17.811657 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Mar 13 00:35:17.811747 kernel: pci 0000:00:03.2:   bridge window [mem 0x82c00000-0x82dfffff]
Mar 13 00:35:17.811819 kernel: pci 0000:00:03.2:   bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref]
Mar 13 00:35:17.811893 kernel: pci 0000:00:03.2: enabling Extended Tags
Mar 13 00:35:17.811970 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.812041 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff]
Mar 13 00:35:17.812111 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Mar 13 00:35:17.812179 kernel: pci 0000:00:03.3:   bridge window [mem 0x82a00000-0x82bfffff]
Mar 13 00:35:17.812248 kernel: pci 0000:00:03.3:   bridge window [mem 0x385800000000-0x385fffffffff 64bit pref]
Mar 13 00:35:17.812316 kernel: pci 0000:00:03.3: enabling Extended Tags
Mar 13 00:35:17.812394 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:35:17.812463 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff]
Mar 13 00:35:17.812531 kernel: 
pci 0000:00:03.4: PCI bridge to [bus 0e] Mar 13 00:35:17.813492 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Mar 13 00:35:17.813596 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Mar 13 00:35:17.813671 kernel: pci 0000:00:03.4: enabling Extended Tags Mar 13 00:35:17.813750 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.813836 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff] Mar 13 00:35:17.813906 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Mar 13 00:35:17.813975 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Mar 13 00:35:17.814045 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Mar 13 00:35:17.814113 kernel: pci 0000:00:03.5: enabling Extended Tags Mar 13 00:35:17.814188 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.814262 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff] Mar 13 00:35:17.814333 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Mar 13 00:35:17.814401 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Mar 13 00:35:17.814470 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Mar 13 00:35:17.814539 kernel: pci 0000:00:03.6: enabling Extended Tags Mar 13 00:35:17.814635 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.814706 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff] Mar 13 00:35:17.814778 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Mar 13 00:35:17.814847 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Mar 13 00:35:17.814915 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Mar 13 00:35:17.814984 kernel: pci 0000:00:03.7: enabling Extended Tags Mar 13 00:35:17.815058 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 
13 00:35:17.815128 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff] Mar 13 00:35:17.815201 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Mar 13 00:35:17.815273 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Mar 13 00:35:17.815343 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Mar 13 00:35:17.815411 kernel: pci 0000:00:04.0: enabling Extended Tags Mar 13 00:35:17.815490 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.815560 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff] Mar 13 00:35:17.817758 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Mar 13 00:35:17.817847 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Mar 13 00:35:17.817918 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Mar 13 00:35:17.817991 kernel: pci 0000:00:04.1: enabling Extended Tags Mar 13 00:35:17.818067 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.818137 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff] Mar 13 00:35:17.818208 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Mar 13 00:35:17.818278 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Mar 13 00:35:17.818347 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Mar 13 00:35:17.818415 kernel: pci 0000:00:04.2: enabling Extended Tags Mar 13 00:35:17.818488 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.818557 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Mar 13 00:35:17.819679 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Mar 13 00:35:17.819759 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Mar 13 00:35:17.819830 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Mar 13 00:35:17.819899 kernel: pci 0000:00:04.3: enabling Extended 
Tags Mar 13 00:35:17.819974 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.820044 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Mar 13 00:35:17.820113 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Mar 13 00:35:17.820180 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Mar 13 00:35:17.820251 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Mar 13 00:35:17.820319 kernel: pci 0000:00:04.4: enabling Extended Tags Mar 13 00:35:17.820396 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.820465 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Mar 13 00:35:17.820532 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Mar 13 00:35:17.821638 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Mar 13 00:35:17.821715 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Mar 13 00:35:17.821796 kernel: pci 0000:00:04.5: enabling Extended Tags Mar 13 00:35:17.821872 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.821942 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Mar 13 00:35:17.822011 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Mar 13 00:35:17.822079 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Mar 13 00:35:17.822147 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Mar 13 00:35:17.822215 kernel: pci 0000:00:04.6: enabling Extended Tags Mar 13 00:35:17.822290 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.822359 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff] Mar 13 00:35:17.822427 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Mar 13 00:35:17.822497 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Mar 13 00:35:17.822576 kernel: pci 0000:00:04.7: bridge window [mem 
0x38b800000000-0x38bfffffffff 64bit pref] Mar 13 00:35:17.822645 kernel: pci 0000:00:04.7: enabling Extended Tags Mar 13 00:35:17.822719 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.822791 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Mar 13 00:35:17.822858 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Mar 13 00:35:17.822927 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Mar 13 00:35:17.822994 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Mar 13 00:35:17.823062 kernel: pci 0000:00:05.0: enabling Extended Tags Mar 13 00:35:17.823138 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.823207 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Mar 13 00:35:17.823277 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Mar 13 00:35:17.823344 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Mar 13 00:35:17.823429 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Mar 13 00:35:17.823509 kernel: pci 0000:00:05.1: enabling Extended Tags Mar 13 00:35:17.823850 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.823928 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Mar 13 00:35:17.823997 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Mar 13 00:35:17.824069 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Mar 13 00:35:17.824137 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Mar 13 00:35:17.824206 kernel: pci 0000:00:05.2: enabling Extended Tags Mar 13 00:35:17.824279 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.824348 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Mar 13 00:35:17.824416 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Mar 13 00:35:17.824490 kernel: pci 0000:00:05.3: 
bridge window [mem 0x80a00000-0x80bfffff] Mar 13 00:35:17.824577 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Mar 13 00:35:17.824647 kernel: pci 0000:00:05.3: enabling Extended Tags Mar 13 00:35:17.824721 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Mar 13 00:35:17.824790 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Mar 13 00:35:17.824858 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Mar 13 00:35:17.824925 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Mar 13 00:35:17.824997 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Mar 13 00:35:17.825068 kernel: pci 0000:00:05.4: enabling Extended Tags Mar 13 00:35:17.825144 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Mar 13 00:35:17.825213 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Mar 13 00:35:17.825288 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Mar 13 00:35:17.825356 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Mar 13 00:35:17.825424 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Mar 13 00:35:17.825497 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Mar 13 00:35:17.827593 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Mar 13 00:35:17.827691 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Mar 13 00:35:17.827778 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Mar 13 00:35:17.827850 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Mar 13 00:35:17.827922 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Mar 13 00:35:17.827992 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Mar 13 00:35:17.828062 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Mar 13 00:35:17.828136 
kernel: pci 0000:01:00.0: enabling Extended Tags Mar 13 00:35:17.828206 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Mar 13 00:35:17.828290 kernel: pci_bus 0000:02: extended config space not accessible Mar 13 00:35:17.828301 kernel: acpiphp: Slot [1] registered Mar 13 00:35:17.828309 kernel: acpiphp: Slot [0] registered Mar 13 00:35:17.828317 kernel: acpiphp: Slot [2] registered Mar 13 00:35:17.828325 kernel: acpiphp: Slot [3] registered Mar 13 00:35:17.828332 kernel: acpiphp: Slot [4] registered Mar 13 00:35:17.828342 kernel: acpiphp: Slot [5] registered Mar 13 00:35:17.828350 kernel: acpiphp: Slot [6] registered Mar 13 00:35:17.828357 kernel: acpiphp: Slot [7] registered Mar 13 00:35:17.828365 kernel: acpiphp: Slot [8] registered Mar 13 00:35:17.828372 kernel: acpiphp: Slot [9] registered Mar 13 00:35:17.828380 kernel: acpiphp: Slot [10] registered Mar 13 00:35:17.828388 kernel: acpiphp: Slot [11] registered Mar 13 00:35:17.828395 kernel: acpiphp: Slot [12] registered Mar 13 00:35:17.828403 kernel: acpiphp: Slot [13] registered Mar 13 00:35:17.828412 kernel: acpiphp: Slot [14] registered Mar 13 00:35:17.828420 kernel: acpiphp: Slot [15] registered Mar 13 00:35:17.828428 kernel: acpiphp: Slot [16] registered Mar 13 00:35:17.828435 kernel: acpiphp: Slot [17] registered Mar 13 00:35:17.828442 kernel: acpiphp: Slot [18] registered Mar 13 00:35:17.828450 kernel: acpiphp: Slot [19] registered Mar 13 00:35:17.828458 kernel: acpiphp: Slot [20] registered Mar 13 00:35:17.828465 kernel: acpiphp: Slot [21] registered Mar 13 00:35:17.828473 kernel: acpiphp: Slot [22] registered Mar 13 00:35:17.828480 kernel: acpiphp: Slot [23] registered Mar 13 00:35:17.828490 kernel: acpiphp: Slot [24] registered Mar 13 00:35:17.828497 kernel: acpiphp: Slot [25] registered Mar 13 00:35:17.828505 kernel: acpiphp: Slot [26] registered Mar 13 00:35:17.828513 kernel: acpiphp: Slot [27] registered Mar 13 00:35:17.828520 kernel: acpiphp: Slot [28] registered Mar 13 00:35:17.828528 kernel: 
acpiphp: Slot [29] registered Mar 13 00:35:17.828536 kernel: acpiphp: Slot [30] registered Mar 13 00:35:17.828543 kernel: acpiphp: Slot [31] registered Mar 13 00:35:17.828635 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Mar 13 00:35:17.828713 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Mar 13 00:35:17.828786 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Mar 13 00:35:17.828795 kernel: acpiphp: Slot [0-2] registered Mar 13 00:35:17.828869 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Mar 13 00:35:17.828943 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Mar 13 00:35:17.829014 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Mar 13 00:35:17.829084 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Mar 13 00:35:17.829157 kernel: pci 0000:03:00.0: enabling Extended Tags Mar 13 00:35:17.829227 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Mar 13 00:35:17.829237 kernel: acpiphp: Slot [0-3] registered Mar 13 00:35:17.829312 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Mar 13 00:35:17.829383 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Mar 13 00:35:17.829454 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Mar 13 00:35:17.829524 kernel: pci 0000:04:00.0: enabling Extended Tags Mar 13 00:35:17.829999 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Mar 13 00:35:17.830015 kernel: acpiphp: Slot [0-4] registered Mar 13 00:35:17.830097 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Mar 13 00:35:17.830172 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Mar 13 00:35:17.830243 kernel: pci 0000:05:00.0: enabling Extended Tags Mar 13 00:35:17.830313 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Mar 13 00:35:17.830323 kernel: acpiphp: Slot [0-5] registered Mar 13 00:35:17.830397 kernel: pci 0000:06:00.0: 
[1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Mar 13 00:35:17.830469 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Mar 13 00:35:17.830542 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Mar 13 00:35:17.833669 kernel: pci 0000:06:00.0: enabling Extended Tags Mar 13 00:35:17.833757 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Mar 13 00:35:17.833789 kernel: acpiphp: Slot [0-6] registered Mar 13 00:35:17.833864 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Mar 13 00:35:17.833874 kernel: acpiphp: Slot [0-7] registered Mar 13 00:35:17.833943 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Mar 13 00:35:17.833953 kernel: acpiphp: Slot [0-8] registered Mar 13 00:35:17.834027 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Mar 13 00:35:17.834036 kernel: acpiphp: Slot [0-9] registered Mar 13 00:35:17.834107 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Mar 13 00:35:17.834117 kernel: acpiphp: Slot [0-10] registered Mar 13 00:35:17.834186 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Mar 13 00:35:17.834196 kernel: acpiphp: Slot [0-11] registered Mar 13 00:35:17.834265 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Mar 13 00:35:17.834275 kernel: acpiphp: Slot [0-12] registered Mar 13 00:35:17.834345 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Mar 13 00:35:17.834355 kernel: acpiphp: Slot [0-13] registered Mar 13 00:35:17.834425 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Mar 13 00:35:17.834435 kernel: acpiphp: Slot [0-14] registered Mar 13 00:35:17.834505 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Mar 13 00:35:17.834515 kernel: acpiphp: Slot [0-15] registered Mar 13 00:35:17.834600 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Mar 13 00:35:17.834610 kernel: acpiphp: Slot [0-16] registered Mar 13 00:35:17.834679 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Mar 13 00:35:17.834690 kernel: acpiphp: Slot [0-17] registered Mar 13 00:35:17.834758 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Mar 
13 00:35:17.834771 kernel: acpiphp: Slot [0-18] registered Mar 13 00:35:17.834840 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Mar 13 00:35:17.834850 kernel: acpiphp: Slot [0-19] registered Mar 13 00:35:17.834918 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Mar 13 00:35:17.834928 kernel: acpiphp: Slot [0-20] registered Mar 13 00:35:17.834996 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Mar 13 00:35:17.835007 kernel: acpiphp: Slot [0-21] registered Mar 13 00:35:17.835074 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Mar 13 00:35:17.835084 kernel: acpiphp: Slot [0-22] registered Mar 13 00:35:17.835152 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Mar 13 00:35:17.835164 kernel: acpiphp: Slot [0-23] registered Mar 13 00:35:17.835233 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Mar 13 00:35:17.835243 kernel: acpiphp: Slot [0-24] registered Mar 13 00:35:17.835311 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Mar 13 00:35:17.835321 kernel: acpiphp: Slot [0-25] registered Mar 13 00:35:17.835388 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Mar 13 00:35:17.835398 kernel: acpiphp: Slot [0-26] registered Mar 13 00:35:17.835466 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Mar 13 00:35:17.835476 kernel: acpiphp: Slot [0-27] registered Mar 13 00:35:17.835547 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Mar 13 00:35:17.835557 kernel: acpiphp: Slot [0-28] registered Mar 13 00:35:17.839456 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Mar 13 00:35:17.839473 kernel: acpiphp: Slot [0-29] registered Mar 13 00:35:17.839548 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Mar 13 00:35:17.839560 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Mar 13 00:35:17.839585 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Mar 13 00:35:17.839594 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Mar 13 00:35:17.839601 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Mar 13 00:35:17.839613 kernel: ACPI: PCI: 
Interrupt link LNKE configured for IRQ 10 Mar 13 00:35:17.839620 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Mar 13 00:35:17.839628 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Mar 13 00:35:17.839636 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Mar 13 00:35:17.839644 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Mar 13 00:35:17.839652 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Mar 13 00:35:17.839660 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Mar 13 00:35:17.839668 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Mar 13 00:35:17.839675 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Mar 13 00:35:17.839685 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Mar 13 00:35:17.839693 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Mar 13 00:35:17.839700 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Mar 13 00:35:17.839708 kernel: iommu: Default domain type: Translated Mar 13 00:35:17.839716 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Mar 13 00:35:17.839723 kernel: efivars: Registered efivars operations Mar 13 00:35:17.839731 kernel: PCI: Using ACPI for IRQ routing Mar 13 00:35:17.839739 kernel: PCI: pci_cache_line_size set to 64 bytes Mar 13 00:35:17.839747 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Mar 13 00:35:17.839755 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Mar 13 00:35:17.839765 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Mar 13 00:35:17.839772 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Mar 13 00:35:17.839780 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Mar 13 00:35:17.839787 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Mar 13 00:35:17.839795 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Mar 13 00:35:17.839802 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] 
Mar 13 00:35:17.839810 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Mar 13 00:35:17.839885 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Mar 13 00:35:17.839959 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Mar 13 00:35:17.840028 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Mar 13 00:35:17.840038 kernel: vgaarb: loaded Mar 13 00:35:17.840046 kernel: clocksource: Switched to clocksource kvm-clock Mar 13 00:35:17.840054 kernel: VFS: Disk quotas dquot_6.6.0 Mar 13 00:35:17.840061 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 13 00:35:17.840069 kernel: pnp: PnP ACPI init Mar 13 00:35:17.840146 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Mar 13 00:35:17.840160 kernel: pnp: PnP ACPI: found 5 devices Mar 13 00:35:17.840168 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 13 00:35:17.840176 kernel: NET: Registered PF_INET protocol family Mar 13 00:35:17.840184 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 13 00:35:17.840192 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 13 00:35:17.840200 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 13 00:35:17.840208 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 13 00:35:17.840216 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 13 00:35:17.840224 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 13 00:35:17.840234 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 13 00:35:17.840241 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 13 00:35:17.840249 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 13 00:35:17.840257 kernel: NET: Registered PF_XDP protocol family Mar 13 
00:35:17.840333 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Mar 13 00:35:17.840406 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Mar 13 00:35:17.840476 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Mar 13 00:35:17.840547 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Mar 13 00:35:17.840648 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Mar 13 00:35:17.840719 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Mar 13 00:35:17.840790 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Mar 13 00:35:17.840860 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Mar 13 00:35:17.840931 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Mar 13 00:35:17.841001 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Mar 13 00:35:17.841071 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Mar 13 00:35:17.841141 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Mar 13 00:35:17.841213 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Mar 13 00:35:17.841282 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Mar 13 00:35:17.841352 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Mar 13 00:35:17.841422 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Mar 13 00:35:17.841491 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Mar 13 00:35:17.841563 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Mar 13 00:35:17.841649 kernel: pci 
0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Mar 13 00:35:17.841722 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Mar 13 00:35:17.841801 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Mar 13 00:35:17.841871 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Mar 13 00:35:17.841942 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Mar 13 00:35:17.842013 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Mar 13 00:35:17.842083 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Mar 13 00:35:17.842153 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Mar 13 00:35:17.842222 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Mar 13 00:35:17.842294 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Mar 13 00:35:17.842362 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Mar 13 00:35:17.842430 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Mar 13 00:35:17.842499 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Mar 13 00:35:17.842593 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Mar 13 00:35:17.842665 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Mar 13 00:35:17.842733 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Mar 13 00:35:17.842801 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Mar 13 00:35:17.842872 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Mar 13 00:35:17.842940 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Mar 13 00:35:17.843008 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Mar 13 
00:35:17.843076 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Mar 13 00:35:17.843144 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Mar 13 00:35:17.843211 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Mar 13 00:35:17.843279 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Mar 13 00:35:17.843347 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.843416 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.843484 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.843551 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.843632 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.843700 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.843768 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.843837 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.843904 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.843976 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.844044 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.844112 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.844180 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.844247 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.844315 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.844383 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.844450 
kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.844521 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.844604 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.844672 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.844740 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.844808 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.844878 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.844946 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.845013 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.845083 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.845151 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.845219 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.845286 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.845353 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.845420 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Mar 13 00:35:17.845488 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Mar 13 00:35:17.845558 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Mar 13 00:35:17.845635 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Mar 13 00:35:17.845704 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Mar 13 00:35:17.845780 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Mar 13 00:35:17.845848 kernel: pci 0000:00:04.6: bridge 
window [io 0x9000-0x9fff]: assigned Mar 13 00:35:17.845915 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Mar 13 00:35:17.845982 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Mar 13 00:35:17.846050 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Mar 13 00:35:17.846121 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Mar 13 00:35:17.846190 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Mar 13 00:35:17.846257 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Mar 13 00:35:17.846324 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.846392 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.846460 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.846528 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.846616 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.846684 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.846756 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.846824 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.846891 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.846959 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.847027 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.847094 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.847162 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.847230 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to 
assign Mar 13 00:35:17.847300 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.847369 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.847437 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.847506 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.847582 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.848669 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.848749 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.848824 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.848894 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.848963 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.849031 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.849098 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.849167 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.849239 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.849309 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Mar 13 00:35:17.849376 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Mar 13 00:35:17.849450 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Mar 13 00:35:17.849521 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Mar 13 00:35:17.849634 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Mar 13 00:35:17.849705 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Mar 13 
00:35:17.849789 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Mar 13 00:35:17.849861 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Mar 13 00:35:17.849929 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Mar 13 00:35:17.849997 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Mar 13 00:35:17.850069 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned Mar 13 00:35:17.850138 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Mar 13 00:35:17.850206 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Mar 13 00:35:17.850273 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Mar 13 00:35:17.850341 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Mar 13 00:35:17.850409 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Mar 13 00:35:17.850480 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Mar 13 00:35:17.850548 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Mar 13 00:35:17.851663 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Mar 13 00:35:17.851745 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Mar 13 00:35:17.851816 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Mar 13 00:35:17.851885 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Mar 13 00:35:17.851958 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Mar 13 00:35:17.852039 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Mar 13 00:35:17.852107 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Mar 13 00:35:17.852174 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Mar 13 00:35:17.852243 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Mar 13 00:35:17.852310 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Mar 13 00:35:17.852378 
kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Mar 13 00:35:17.852447 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Mar 13 00:35:17.852518 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Mar 13 00:35:17.852603 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Mar 13 00:35:17.852673 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Mar 13 00:35:17.852741 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Mar 13 00:35:17.852808 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Mar 13 00:35:17.852878 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Mar 13 00:35:17.852946 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Mar 13 00:35:17.853014 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Mar 13 00:35:17.853086 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Mar 13 00:35:17.853153 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Mar 13 00:35:17.853220 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Mar 13 00:35:17.853288 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Mar 13 00:35:17.853356 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Mar 13 00:35:17.853424 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Mar 13 00:35:17.853496 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Mar 13 00:35:17.853563 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Mar 13 00:35:17.853640 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Mar 13 00:35:17.853709 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Mar 13 00:35:17.853786 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Mar 13 00:35:17.853855 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] 
Mar 13 00:35:17.853925 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Mar 13 00:35:17.853994 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Mar 13 00:35:17.854065 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Mar 13 00:35:17.855685 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Mar 13 00:35:17.855767 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Mar 13 00:35:17.855837 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Mar 13 00:35:17.855909 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Mar 13 00:35:17.855978 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Mar 13 00:35:17.856047 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Mar 13 00:35:17.856120 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Mar 13 00:35:17.856190 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Mar 13 00:35:17.856259 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Mar 13 00:35:17.856326 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Mar 13 00:35:17.856395 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Mar 13 00:35:17.856467 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Mar 13 00:35:17.856535 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Mar 13 00:35:17.856612 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Mar 13 00:35:17.856682 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Mar 13 00:35:17.856752 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Mar 13 00:35:17.856820 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Mar 13 00:35:17.856887 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Mar 13 00:35:17.856955 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Mar 13 00:35:17.857023 kernel: 
pci 0000:00:04.4: PCI bridge to [bus 16] Mar 13 00:35:17.857092 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Mar 13 00:35:17.858229 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Mar 13 00:35:17.858322 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Mar 13 00:35:17.858394 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Mar 13 00:35:17.858464 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Mar 13 00:35:17.858532 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Mar 13 00:35:17.858610 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Mar 13 00:35:17.858680 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Mar 13 00:35:17.858752 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Mar 13 00:35:17.858821 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Mar 13 00:35:17.858888 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Mar 13 00:35:17.858957 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Mar 13 00:35:17.859025 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Mar 13 00:35:17.859093 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Mar 13 00:35:17.859161 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Mar 13 00:35:17.859230 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Mar 13 00:35:17.859300 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Mar 13 00:35:17.859368 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Mar 13 00:35:17.859436 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Mar 13 00:35:17.859505 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Mar 13 00:35:17.859581 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Mar 13 00:35:17.859649 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Mar 13 
00:35:17.860618 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Mar 13 00:35:17.860726 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Mar 13 00:35:17.860799 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Mar 13 00:35:17.860867 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Mar 13 00:35:17.860935 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Mar 13 00:35:17.861005 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Mar 13 00:35:17.861073 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Mar 13 00:35:17.861141 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Mar 13 00:35:17.861212 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Mar 13 00:35:17.861281 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Mar 13 00:35:17.861348 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Mar 13 00:35:17.861446 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Mar 13 00:35:17.861523 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Mar 13 00:35:17.862694 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 13 00:35:17.862771 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 13 00:35:17.862833 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 13 00:35:17.862898 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Mar 13 00:35:17.862959 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Mar 13 00:35:17.863018 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Mar 13 00:35:17.863090 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Mar 13 00:35:17.863155 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Mar 13 00:35:17.863219 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Mar 13 
00:35:17.863288 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Mar 13 00:35:17.863358 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Mar 13 00:35:17.863425 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Mar 13 00:35:17.863496 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Mar 13 00:35:17.863560 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Mar 13 00:35:17.863656 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Mar 13 00:35:17.863720 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Mar 13 00:35:17.863788 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Mar 13 00:35:17.863855 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Mar 13 00:35:17.863923 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Mar 13 00:35:17.863987 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Mar 13 00:35:17.864058 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Mar 13 00:35:17.864122 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Mar 13 00:35:17.864189 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Mar 13 00:35:17.864256 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Mar 13 00:35:17.864324 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Mar 13 00:35:17.864387 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Mar 13 00:35:17.864455 kernel: pci_bus 0000:0a: resource 1 [mem 0x83000000-0x831fffff] Mar 13 00:35:17.864518 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Mar 13 00:35:17.865136 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Mar 13 00:35:17.865212 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit 
pref] Mar 13 00:35:17.865284 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Mar 13 00:35:17.865348 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Mar 13 00:35:17.865416 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Mar 13 00:35:17.865875 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Mar 13 00:35:17.866115 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Mar 13 00:35:17.866183 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Mar 13 00:35:17.866255 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Mar 13 00:35:17.866319 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Mar 13 00:35:17.866389 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Mar 13 00:35:17.866453 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Mar 13 00:35:17.866526 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Mar 13 00:35:17.866603 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Mar 13 00:35:17.866671 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Mar 13 00:35:17.866734 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Mar 13 00:35:17.866797 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Mar 13 00:35:17.866869 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Mar 13 00:35:17.866933 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Mar 13 00:35:17.866998 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Mar 13 00:35:17.867065 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Mar 13 00:35:17.867130 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Mar 13 00:35:17.867192 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Mar 13 00:35:17.867259 
kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Mar 13 00:35:17.867324 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Mar 13 00:35:17.867389 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Mar 13 00:35:17.867456 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Mar 13 00:35:17.867519 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Mar 13 00:35:17.867597 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Mar 13 00:35:17.867666 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Mar 13 00:35:17.867729 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Mar 13 00:35:17.867792 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Mar 13 00:35:17.867864 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Mar 13 00:35:17.867927 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Mar 13 00:35:17.867990 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Mar 13 00:35:17.868058 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Mar 13 00:35:17.868121 kernel: pci_bus 0000:19: resource 1 [mem 0x81200000-0x813fffff] Mar 13 00:35:17.868184 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Mar 13 00:35:17.868250 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Mar 13 00:35:17.868316 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Mar 13 00:35:17.868379 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Mar 13 00:35:17.868447 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Mar 13 00:35:17.868510 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Mar 13 00:35:17.868894 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Mar 13 00:35:17.868979 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Mar 13 00:35:17.869045 kernel: pci_bus 0000:1c: 
resource 1 [mem 0x80c00000-0x80dfffff] Mar 13 00:35:17.869112 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Mar 13 00:35:17.869181 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Mar 13 00:35:17.869245 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Mar 13 00:35:17.869330 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Mar 13 00:35:17.869400 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Mar 13 00:35:17.869464 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Mar 13 00:35:17.869546 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Mar 13 00:35:17.869557 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 13 00:35:17.869576 kernel: PCI: CLS 0 bytes, default 64 Mar 13 00:35:17.869585 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 13 00:35:17.869593 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Mar 13 00:35:17.869600 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 13 00:35:17.869608 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Mar 13 00:35:17.869616 kernel: Initialise system trusted keyrings Mar 13 00:35:17.869624 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 13 00:35:17.869635 kernel: Key type asymmetric registered Mar 13 00:35:17.869642 kernel: Asymmetric key parser 'x509' registered Mar 13 00:35:17.869650 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 13 00:35:17.869658 kernel: io scheduler mq-deadline registered Mar 13 00:35:17.869666 kernel: io scheduler kyber registered Mar 13 00:35:17.869673 kernel: io scheduler bfq registered Mar 13 00:35:17.869751 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 13 00:35:17.869835 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 13 
00:35:17.869912 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 13 00:35:17.869983 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 13 00:35:17.870055 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 13 00:35:17.870125 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 13 00:35:17.870194 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 13 00:35:17.870264 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 13 00:35:17.870333 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 13 00:35:17.870403 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 13 00:35:17.870475 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 13 00:35:17.870546 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 13 00:35:17.870626 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 13 00:35:17.870695 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 13 00:35:17.870764 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 13 00:35:17.870832 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 13 00:35:17.870843 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 13 00:35:17.870910 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Mar 13 00:35:17.871672 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Mar 13 00:35:17.871747 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Mar 13 00:35:17.871815 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Mar 13 00:35:17.871886 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Mar 13 00:35:17.871956 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Mar 13 00:35:17.872026 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Mar 13 00:35:17.872095 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Mar 13 00:35:17.872165 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Mar 13 00:35:17.872237 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Mar 13 
00:35:17.872307 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Mar 13 00:35:17.872376 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Mar 13 00:35:17.872445 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Mar 13 00:35:17.872513 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Mar 13 00:35:17.872595 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Mar 13 00:35:17.872665 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Mar 13 00:35:17.872675 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 13 00:35:17.872743 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Mar 13 00:35:17.872815 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Mar 13 00:35:17.872885 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Mar 13 00:35:17.872954 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Mar 13 00:35:17.873024 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Mar 13 00:35:17.873094 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Mar 13 00:35:17.873163 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Mar 13 00:35:17.873232 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Mar 13 00:35:17.873301 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Mar 13 00:35:17.873372 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Mar 13 00:35:17.873442 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Mar 13 00:35:17.873510 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Mar 13 00:35:17.875392 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Mar 13 00:35:17.875492 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Mar 13 00:35:17.875591 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Mar 13 00:35:17.875666 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Mar 13 00:35:17.875676 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Mar 13 00:35:17.875752 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Mar 13 00:35:17.875823 kernel: 
pcieport 0000:00:05.0: AER: enabled with IRQ 48 Mar 13 00:35:17.875894 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Mar 13 00:35:17.875964 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Mar 13 00:35:17.876034 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Mar 13 00:35:17.876103 kernel: pcieport 0000:00:05.2: AER: enabled with IRQ 50 Mar 13 00:35:17.876174 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Mar 13 00:35:17.876243 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Mar 13 00:35:17.876315 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Mar 13 00:35:17.876385 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Mar 13 00:35:17.876394 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 13 00:35:17.876403 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 13 00:35:17.876411 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 13 00:35:17.876418 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 13 00:35:17.876426 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 13 00:35:17.876434 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 13 00:35:17.876507 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 13 00:35:17.876520 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 13 00:35:17.876593 kernel: rtc_cmos 00:03: registered as rtc0 Mar 13 00:35:17.876658 kernel: rtc_cmos 00:03: setting system clock to 2026-03-13T00:35:17 UTC (1773362117) Mar 13 00:35:17.876721 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Mar 13 00:35:17.876730 kernel: intel_pstate: CPU model not supported Mar 13 00:35:17.876738 kernel: efifb: probing for efifb Mar 13 00:35:17.876746 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Mar 13 00:35:17.876754 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 13 00:35:17.876764 kernel: efifb: 
scrolling: redraw Mar 13 00:35:17.876772 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 13 00:35:17.876780 kernel: Console: switching to colour frame buffer device 160x50 Mar 13 00:35:17.876788 kernel: fb0: EFI VGA frame buffer device Mar 13 00:35:17.876795 kernel: pstore: Using crash dump compression: deflate Mar 13 00:35:17.876804 kernel: pstore: Registered efi_pstore as persistent store backend Mar 13 00:35:17.876812 kernel: NET: Registered PF_INET6 protocol family Mar 13 00:35:17.876832 kernel: Segment Routing with IPv6 Mar 13 00:35:17.876842 kernel: In-situ OAM (IOAM) with IPv6 Mar 13 00:35:17.876850 kernel: NET: Registered PF_PACKET protocol family Mar 13 00:35:17.876860 kernel: Key type dns_resolver registered Mar 13 00:35:17.876869 kernel: IPI shorthand broadcast: enabled Mar 13 00:35:17.876876 kernel: sched_clock: Marking stable (4019002541, 153175877)->(4276398680, -104220262) Mar 13 00:35:17.876884 kernel: registered taskstats version 1 Mar 13 00:35:17.876892 kernel: Loading compiled-in X.509 certificates Mar 13 00:35:17.876900 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 5aff49df330f42445474818d085d5033fee752d8' Mar 13 00:35:17.876908 kernel: Demotion targets for Node 0: null Mar 13 00:35:17.876916 kernel: Key type .fscrypt registered Mar 13 00:35:17.876924 kernel: Key type fscrypt-provisioning registered Mar 13 00:35:17.876933 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 13 00:35:17.876941 kernel: ima: Allocated hash algorithm: sha1 Mar 13 00:35:17.876949 kernel: ima: No architecture policies found Mar 13 00:35:17.876957 kernel: clk: Disabling unused clocks Mar 13 00:35:17.876965 kernel: Warning: unable to open an initial console. 
Mar 13 00:35:17.876974 kernel: Freeing unused kernel image (initmem) memory: 46200K Mar 13 00:35:17.876982 kernel: Write protecting the kernel read-only data: 40960k Mar 13 00:35:17.876990 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Mar 13 00:35:17.876998 kernel: Run /init as init process Mar 13 00:35:17.877008 kernel: with arguments: Mar 13 00:35:17.877016 kernel: /init Mar 13 00:35:17.877024 kernel: with environment: Mar 13 00:35:17.877031 kernel: HOME=/ Mar 13 00:35:17.877039 kernel: TERM=linux Mar 13 00:35:17.877048 systemd[1]: Successfully made /usr/ read-only. Mar 13 00:35:17.877060 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 13 00:35:17.877069 systemd[1]: Detected virtualization kvm. Mar 13 00:35:17.877079 systemd[1]: Detected architecture x86-64. Mar 13 00:35:17.877087 systemd[1]: Running in initrd. Mar 13 00:35:17.877095 systemd[1]: No hostname configured, using default hostname. Mar 13 00:35:17.877105 systemd[1]: Hostname set to . Mar 13 00:35:17.877113 systemd[1]: Initializing machine ID from VM UUID. Mar 13 00:35:17.877121 systemd[1]: Queued start job for default target initrd.target. Mar 13 00:35:17.877129 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 13 00:35:17.877138 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 13 00:35:17.877149 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 13 00:35:17.877157 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
Mar 13 00:35:17.877166 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 13 00:35:17.877174 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 13 00:35:17.877184 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 13 00:35:17.877194 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 13 00:35:17.877202 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 13 00:35:17.877210 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 13 00:35:17.877219 systemd[1]: Reached target paths.target - Path Units. Mar 13 00:35:17.877227 systemd[1]: Reached target slices.target - Slice Units. Mar 13 00:35:17.877235 systemd[1]: Reached target swap.target - Swaps. Mar 13 00:35:17.877243 systemd[1]: Reached target timers.target - Timer Units. Mar 13 00:35:17.877252 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 13 00:35:17.877262 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 13 00:35:17.877272 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 13 00:35:17.877281 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 13 00:35:17.877289 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 13 00:35:17.877297 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 13 00:35:17.877306 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 13 00:35:17.877314 systemd[1]: Reached target sockets.target - Socket Units. Mar 13 00:35:17.877323 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... 
Mar 13 00:35:17.877331 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 13 00:35:17.877339 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 13 00:35:17.877349 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 13 00:35:17.877358 systemd[1]: Starting systemd-fsck-usr.service... Mar 13 00:35:17.877366 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 13 00:35:17.877374 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 13 00:35:17.877383 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:35:17.877409 systemd-journald[225]: Collecting audit messages is disabled. Mar 13 00:35:17.877432 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 13 00:35:17.877441 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 13 00:35:17.877451 systemd[1]: Finished systemd-fsck-usr.service. Mar 13 00:35:17.877459 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 13 00:35:17.877468 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:35:17.877476 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 13 00:35:17.877485 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 13 00:35:17.877493 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 13 00:35:17.877502 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Mar 13 00:35:17.877510 kernel: Bridge firewalling registered
Mar 13 00:35:17.877520 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 13 00:35:17.877529 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 13 00:35:17.877538 systemd-journald[225]: Journal started
Mar 13 00:35:17.877558 systemd-journald[225]: Runtime Journal (/run/log/journal/80894e66ff394dea97d4bf6d1d592e6b) is 8M, max 78M, 70M free.
Mar 13 00:35:17.879591 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 13 00:35:17.825836 systemd-modules-load[226]: Inserted module 'overlay'
Mar 13 00:35:17.863616 systemd-modules-load[226]: Inserted module 'br_netfilter'
Mar 13 00:35:17.881222 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 13 00:35:17.886896 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 13 00:35:17.888487 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 13 00:35:17.897681 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 13 00:35:17.899160 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 13 00:35:17.903473 systemd-tmpfiles[255]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 13 00:35:17.908156 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 13 00:35:17.909745 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 13 00:35:17.917178 dracut-cmdline[262]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:35:17.952220 systemd-resolved[268]: Positive Trust Anchors:
Mar 13 00:35:17.952232 systemd-resolved[268]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 13 00:35:17.952263 systemd-resolved[268]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 13 00:35:17.955362 systemd-resolved[268]: Defaulting to hostname 'linux'.
Mar 13 00:35:17.956438 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 13 00:35:17.957608 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 13 00:35:18.002600 kernel: SCSI subsystem initialized
Mar 13 00:35:18.013594 kernel: Loading iSCSI transport class v2.0-870.
Mar 13 00:35:18.023590 kernel: iscsi: registered transport (tcp)
Mar 13 00:35:18.044591 kernel: iscsi: registered transport (qla4xxx)
Mar 13 00:35:18.044660 kernel: QLogic iSCSI HBA Driver
Mar 13 00:35:18.061674 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 13 00:35:18.077846 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 13 00:35:18.079679 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 13 00:35:18.122259 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 13 00:35:18.124527 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 13 00:35:18.176603 kernel: raid6: avx512x4 gen() 41441 MB/s
Mar 13 00:35:18.193591 kernel: raid6: avx512x2 gen() 46640 MB/s
Mar 13 00:35:18.210629 kernel: raid6: avx512x1 gen() 44384 MB/s
Mar 13 00:35:18.227612 kernel: raid6: avx2x4 gen() 34532 MB/s
Mar 13 00:35:18.244636 kernel: raid6: avx2x2 gen() 34378 MB/s
Mar 13 00:35:18.261958 kernel: raid6: avx2x1 gen() 26534 MB/s
Mar 13 00:35:18.262070 kernel: raid6: using algorithm avx512x2 gen() 46640 MB/s
Mar 13 00:35:18.280019 kernel: raid6: .... xor() 26779 MB/s, rmw enabled
Mar 13 00:35:18.280119 kernel: raid6: using avx512x2 recovery algorithm
Mar 13 00:35:18.331608 kernel: xor: automatically using best checksumming function   avx
Mar 13 00:35:18.469610 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 13 00:35:18.477461 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 13 00:35:18.482042 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 13 00:35:18.505396 systemd-udevd[474]: Using default interface naming scheme 'v255'.
Mar 13 00:35:18.509971 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 13 00:35:18.516777 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 13 00:35:18.546500 dracut-pre-trigger[483]: rd.md=0: removing MD RAID activation
Mar 13 00:35:18.581635 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 13 00:35:18.585985 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 13 00:35:18.666105 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:35:18.674355 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 13 00:35:18.757662 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues
Mar 13 00:35:18.775655 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Mar 13 00:35:18.779596 kernel: ACPI: bus type USB registered
Mar 13 00:35:18.785606 kernel: usbcore: registered new interface driver usbfs
Mar 13 00:35:18.789622 kernel: usbcore: registered new interface driver hub
Mar 13 00:35:18.791582 kernel: cryptd: max_cpu_qlen set to 1000
Mar 13 00:35:18.791613 kernel: usbcore: registered new device driver usb
Mar 13 00:35:18.806576 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 13 00:35:18.806611 kernel: GPT:17805311 != 104857599
Mar 13 00:35:18.806627 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 13 00:35:18.806637 kernel: GPT:17805311 != 104857599
Mar 13 00:35:18.806646 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 13 00:35:18.806656 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 13 00:35:18.810214 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:35:18.810845 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:35:18.815506 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2
Mar 13 00:35:18.814644 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:35:18.816094 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:35:18.820295 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 13 00:35:18.830222 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:35:18.831689 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:35:18.835269 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller
Mar 13 00:35:18.835428 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1
Mar 13 00:35:18.834695 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:35:18.846646 kernel: uhci_hcd 0000:02:01.0: detected 2 ports
Mar 13 00:35:18.846789 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000
Mar 13 00:35:18.846885 kernel: hub 1-0:1.0: USB hub found
Mar 13 00:35:18.846997 kernel: hub 1-0:1.0: 2 ports detected
Mar 13 00:35:18.847093 kernel: libata version 3.00 loaded.
Mar 13 00:35:18.851808 kernel: AES CTR mode by8 optimization enabled
Mar 13 00:35:18.880609 kernel: ahci 0000:00:1f.2: version 3.0
Mar 13 00:35:18.880794 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 13 00:35:18.886650 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Mar 13 00:35:18.886818 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Mar 13 00:35:18.887872 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 13 00:35:18.889882 kernel: scsi host0: ahci
Mar 13 00:35:18.890835 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:35:18.904020 kernel: scsi host1: ahci
Mar 13 00:35:18.904153 kernel: scsi host2: ahci
Mar 13 00:35:18.904243 kernel: scsi host3: ahci
Mar 13 00:35:18.904323 kernel: scsi host4: ahci
Mar 13 00:35:18.904404 kernel: scsi host5: ahci
Mar 13 00:35:18.904485 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 61 lpm-pol 1
Mar 13 00:35:18.904498 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 61 lpm-pol 1
Mar 13 00:35:18.904507 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 61 lpm-pol 1
Mar 13 00:35:18.904516 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 61 lpm-pol 1
Mar 13 00:35:18.904526 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 61 lpm-pol 1
Mar 13 00:35:18.904535 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 61 lpm-pol 1
Mar 13 00:35:18.944290 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 13 00:35:18.951582 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 13 00:35:18.957539 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 13 00:35:18.957963 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 13 00:35:18.965380 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 13 00:35:18.966490 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 13 00:35:18.992118 disk-uuid[667]: Primary Header is updated.
Mar 13 00:35:18.992118 disk-uuid[667]: Secondary Entries is updated.
Mar 13 00:35:18.992118 disk-uuid[667]: Secondary Header is updated.
Mar 13 00:35:19.001585 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 13 00:35:19.059585 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Mar 13 00:35:19.206302 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 13 00:35:19.206395 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Mar 13 00:35:19.206417 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 13 00:35:19.206449 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 13 00:35:19.210232 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 13 00:35:19.210980 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 13 00:35:19.224397 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 13 00:35:19.225535 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 13 00:35:19.226159 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:35:19.227055 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 13 00:35:19.229001 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 13 00:35:19.251580 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 13 00:35:19.254338 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 13 00:35:19.260593 kernel: usbcore: registered new interface driver usbhid
Mar 13 00:35:19.260638 kernel: usbhid: USB HID core driver
Mar 13 00:35:19.267894 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3
Mar 13 00:35:19.267938 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0
Mar 13 00:35:20.022306 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 13 00:35:20.022426 disk-uuid[668]: The operation has completed successfully.
Mar 13 00:35:20.138649 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 13 00:35:20.138776 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 13 00:35:20.170561 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 13 00:35:20.187904 sh[693]: Success
Mar 13 00:35:20.206695 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 13 00:35:20.206763 kernel: device-mapper: uevent: version 1.0.3
Mar 13 00:35:20.206775 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 13 00:35:20.217584 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Mar 13 00:35:20.295384 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 13 00:35:20.299637 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 13 00:35:20.309999 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 13 00:35:20.330607 kernel: BTRFS: device fsid 503642f8-c59c-4168-97a8-9c3603183fa3 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (705)
Mar 13 00:35:20.333593 kernel: BTRFS info (device dm-0): first mount of filesystem 503642f8-c59c-4168-97a8-9c3603183fa3
Mar 13 00:35:20.333645 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:35:20.354430 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 13 00:35:20.354494 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 13 00:35:20.357500 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 13 00:35:20.358404 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 13 00:35:20.358949 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 13 00:35:20.359661 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 13 00:35:20.361992 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 13 00:35:20.396604 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (738)
Mar 13 00:35:20.399589 kernel: BTRFS info (device vda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:35:20.402607 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:35:20.410135 kernel: BTRFS info (device vda6): turning on async discard
Mar 13 00:35:20.410177 kernel: BTRFS info (device vda6): enabling free space tree
Mar 13 00:35:20.416619 kernel: BTRFS info (device vda6): last unmount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:35:20.417095 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 13 00:35:20.419201 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 13 00:35:20.467079 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 13 00:35:20.471851 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 13 00:35:20.509474 systemd-networkd[874]: lo: Link UP
Mar 13 00:35:20.510163 systemd-networkd[874]: lo: Gained carrier
Mar 13 00:35:20.511200 systemd-networkd[874]: Enumeration completed
Mar 13 00:35:20.511456 systemd-networkd[874]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:35:20.511459 systemd-networkd[874]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:35:20.512687 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 13 00:35:20.514223 systemd[1]: Reached target network.target - Network.
Mar 13 00:35:20.515127 systemd-networkd[874]: eth0: Link UP
Mar 13 00:35:20.515262 systemd-networkd[874]: eth0: Gained carrier
Mar 13 00:35:20.515274 systemd-networkd[874]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:35:20.530636 systemd-networkd[874]: eth0: DHCPv4 address 10.0.1.99/25, gateway 10.0.1.1 acquired from 10.0.1.1
Mar 13 00:35:20.559751 ignition[812]: Ignition 2.22.0
Mar 13 00:35:20.559763 ignition[812]: Stage: fetch-offline
Mar 13 00:35:20.562087 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 13 00:35:20.559795 ignition[812]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:35:20.559802 ignition[812]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 13 00:35:20.559878 ignition[812]: parsed url from cmdline: ""
Mar 13 00:35:20.563680 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 13 00:35:20.559881 ignition[812]: no config URL provided
Mar 13 00:35:20.559886 ignition[812]: reading system config file "/usr/lib/ignition/user.ign"
Mar 13 00:35:20.559892 ignition[812]: no config at "/usr/lib/ignition/user.ign"
Mar 13 00:35:20.559896 ignition[812]: failed to fetch config: resource requires networking
Mar 13 00:35:20.560023 ignition[812]: Ignition finished successfully
Mar 13 00:35:20.596028 ignition[885]: Ignition 2.22.0
Mar 13 00:35:20.596039 ignition[885]: Stage: fetch
Mar 13 00:35:20.596152 ignition[885]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:35:20.596159 ignition[885]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 13 00:35:20.596232 ignition[885]: parsed url from cmdline: ""
Mar 13 00:35:20.596235 ignition[885]: no config URL provided
Mar 13 00:35:20.596239 ignition[885]: reading system config file "/usr/lib/ignition/user.ign"
Mar 13 00:35:20.596245 ignition[885]: no config at "/usr/lib/ignition/user.ign"
Mar 13 00:35:20.596322 ignition[885]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 13 00:35:20.596646 ignition[885]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 13 00:35:20.596675 ignition[885]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 13 00:35:21.508465 ignition[885]: GET result: OK
Mar 13 00:35:21.509626 ignition[885]: parsing config with SHA512: 51931e688d6f263482aff6f4af6a28131cf97e98d8e00457d6e9a860772321a626c1eec5210f0bff5db6f4885fee0ea7c169e84778489dffd4e8991a51d82de8
Mar 13 00:35:21.525323 unknown[885]: fetched base config from "system"
Mar 13 00:35:21.525813 unknown[885]: fetched base config from "system"
Mar 13 00:35:21.526481 ignition[885]: fetch: fetch complete
Mar 13 00:35:21.525828 unknown[885]: fetched user config from "openstack"
Mar 13 00:35:21.526491 ignition[885]: fetch: fetch passed
Mar 13 00:35:21.526588 ignition[885]: Ignition finished successfully
Mar 13 00:35:21.531503 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 13 00:35:21.536522 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 13 00:35:21.586499 ignition[891]: Ignition 2.22.0
Mar 13 00:35:21.586519 ignition[891]: Stage: kargs
Mar 13 00:35:21.586752 ignition[891]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:35:21.586766 ignition[891]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 13 00:35:21.588411 ignition[891]: kargs: kargs passed
Mar 13 00:35:21.590027 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 13 00:35:21.588476 ignition[891]: Ignition finished successfully
Mar 13 00:35:21.592416 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 13 00:35:21.627102 ignition[897]: Ignition 2.22.0
Mar 13 00:35:21.627127 ignition[897]: Stage: disks
Mar 13 00:35:21.627288 ignition[897]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:35:21.627299 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 13 00:35:21.631979 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 13 00:35:21.628446 ignition[897]: disks: disks passed
Mar 13 00:35:21.634039 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 13 00:35:21.628498 ignition[897]: Ignition finished successfully
Mar 13 00:35:21.634837 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 13 00:35:21.635929 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 13 00:35:21.636955 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 13 00:35:21.638096 systemd[1]: Reached target basic.target - Basic System.
Mar 13 00:35:21.641642 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 13 00:35:21.691908 systemd-fsck[906]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 13 00:35:21.697675 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 13 00:35:21.700682 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 13 00:35:21.876595 kernel: EXT4-fs (vda9): mounted filesystem 26348f72-0225-4c06-aedc-823e61beebc6 r/w with ordered data mode. Quota mode: none.
Mar 13 00:35:21.876979 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 13 00:35:21.877845 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 13 00:35:21.880285 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 13 00:35:21.882625 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 13 00:35:21.884854 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 13 00:35:21.886666 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 13 00:35:21.887892 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 13 00:35:21.887918 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 13 00:35:21.897601 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 13 00:35:21.898810 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 13 00:35:21.914584 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (914)
Mar 13 00:35:21.918988 kernel: BTRFS info (device vda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:35:21.919022 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:35:21.927839 kernel: BTRFS info (device vda6): turning on async discard
Mar 13 00:35:21.927870 kernel: BTRFS info (device vda6): enabling free space tree
Mar 13 00:35:21.933246 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 13 00:35:21.963606 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 13 00:35:21.973079 initrd-setup-root[947]: cut: /sysroot/etc/passwd: No such file or directory
Mar 13 00:35:21.977842 initrd-setup-root[954]: cut: /sysroot/etc/group: No such file or directory
Mar 13 00:35:21.984745 initrd-setup-root[961]: cut: /sysroot/etc/shadow: No such file or directory
Mar 13 00:35:21.990013 initrd-setup-root[968]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 13 00:35:22.087310 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 13 00:35:22.089224 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 13 00:35:22.091666 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 13 00:35:22.109276 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 13 00:35:22.111052 kernel: BTRFS info (device vda6): last unmount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:35:22.139228 ignition[1035]: INFO : Ignition 2.22.0
Mar 13 00:35:22.140426 ignition[1035]: INFO : Stage: mount
Mar 13 00:35:22.140426 ignition[1035]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:35:22.140426 ignition[1035]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 13 00:35:22.139320 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 13 00:35:22.142120 ignition[1035]: INFO : mount: mount passed
Mar 13 00:35:22.142120 ignition[1035]: INFO : Ignition finished successfully
Mar 13 00:35:22.142625 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 13 00:35:22.169832 systemd-networkd[874]: eth0: Gained IPv6LL
Mar 13 00:35:23.010599 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 13 00:35:25.023600 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 13 00:35:29.030662 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 13 00:35:29.037811 coreos-metadata[916]: Mar 13 00:35:29.037 WARN failed to locate config-drive, using the metadata service API instead
Mar 13 00:35:29.050494 coreos-metadata[916]: Mar 13 00:35:29.050 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 13 00:35:29.773326 coreos-metadata[916]: Mar 13 00:35:29.773 INFO Fetch successful
Mar 13 00:35:29.774019 coreos-metadata[916]: Mar 13 00:35:29.773 INFO wrote hostname ci-4459-2-4-n-23cf6448d4 to /sysroot/etc/hostname
Mar 13 00:35:29.775381 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 13 00:35:29.775504 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 13 00:35:29.777339 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 13 00:35:29.799361 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 13 00:35:29.842611 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1053)
Mar 13 00:35:29.847332 kernel: BTRFS info (device vda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:35:29.847376 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:35:29.860481 kernel: BTRFS info (device vda6): turning on async discard
Mar 13 00:35:29.860623 kernel: BTRFS info (device vda6): enabling free space tree
Mar 13 00:35:29.867521 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 13 00:35:29.917541 ignition[1071]: INFO : Ignition 2.22.0
Mar 13 00:35:29.917541 ignition[1071]: INFO : Stage: files
Mar 13 00:35:29.919415 ignition[1071]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:35:29.919415 ignition[1071]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 13 00:35:29.919415 ignition[1071]: DEBUG : files: compiled without relabeling support, skipping
Mar 13 00:35:29.921990 ignition[1071]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 13 00:35:29.921990 ignition[1071]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 13 00:35:29.923517 ignition[1071]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 13 00:35:29.924276 ignition[1071]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 13 00:35:29.925117 ignition[1071]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 13 00:35:29.924730 unknown[1071]: wrote ssh authorized keys file for user: core
Mar 13 00:35:29.928324 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 13 00:35:29.929768 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 13 00:35:29.992236 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 13 00:35:30.125654 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 13 00:35:30.125654 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 13 00:35:30.127780 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 13 00:35:30.127780 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 13 00:35:30.127780 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 13 00:35:30.127780 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 13 00:35:30.127780 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 13 00:35:30.127780 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 13 00:35:30.127780 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 13 00:35:30.131784 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 13 00:35:30.131784 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 13 00:35:30.131784 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 13 00:35:30.131784 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 13 00:35:30.131784 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 13 00:35:30.131784 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-x86-64.raw: attempt #1
Mar 13 00:35:30.313308 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 13 00:35:31.882946 ignition[1071]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 13 00:35:31.882946 ignition[1071]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 13 00:35:31.885945 ignition[1071]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 13 00:35:31.892324 ignition[1071]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 13 00:35:31.892324 ignition[1071]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 13 00:35:31.892324 ignition[1071]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 13 00:35:31.896302 ignition[1071]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 13 00:35:31.896302 ignition[1071]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 13 00:35:31.896302 ignition[1071]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 13 00:35:31.896302 ignition[1071]: INFO : files: files passed
Mar 13 00:35:31.896302 ignition[1071]: INFO : Ignition finished successfully
Mar 13 00:35:31.894486 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 13 00:35:31.898731 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 13 00:35:31.901343 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 13 00:35:31.914835 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 13 00:35:31.914970 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 13 00:35:31.924549 initrd-setup-root-after-ignition[1100]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:35:31.924549 initrd-setup-root-after-ignition[1100]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:35:31.927144 initrd-setup-root-after-ignition[1104]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:35:31.929353 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 13 00:35:31.930190 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 13 00:35:31.931900 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 13 00:35:31.992972 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 13 00:35:31.993222 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 13 00:35:31.995832 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 13 00:35:31.997679 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 13 00:35:31.999968 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 13 00:35:32.001808 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 13 00:35:32.041008 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 13 00:35:32.045763 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 13 00:35:32.074043 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 13 00:35:32.075349 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:35:32.077012 systemd[1]: Stopped target timers.target - Timer Units.
Mar 13 00:35:32.078173 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 13 00:35:32.078297 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 13 00:35:32.079757 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 13 00:35:32.080613 systemd[1]: Stopped target basic.target - Basic System.
Mar 13 00:35:32.081479 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 13 00:35:32.082371 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 13 00:35:32.083257 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 13 00:35:32.084157 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 13 00:35:32.085031 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 13 00:35:32.085939 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 13 00:35:32.086834 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 13 00:35:32.087733 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 13 00:35:32.088626 systemd[1]: Stopped target swap.target - Swaps.
Mar 13 00:35:32.089502 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 13 00:35:32.089623 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 13 00:35:32.090952 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 13 00:35:32.091905 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 13 00:35:32.093122 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 13 00:35:32.094278 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 13 00:35:32.094822 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 13 00:35:32.094945 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 13 00:35:32.096489 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 13 00:35:32.096610 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 13 00:35:32.097496 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 13 00:35:32.097600 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 13 00:35:32.099204 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 13 00:35:32.102710 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 13 00:35:32.103152 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 13 00:35:32.103259 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:35:32.103761 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 13 00:35:32.103844 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 13 00:35:32.109335 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 13 00:35:32.116476 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 13 00:35:32.134405 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 13 00:35:32.135701 ignition[1124]: INFO : Ignition 2.22.0
Mar 13 00:35:32.135701 ignition[1124]: INFO : Stage: umount
Mar 13 00:35:32.136717 ignition[1124]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:35:32.136717 ignition[1124]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 13 00:35:32.138404 ignition[1124]: INFO : umount: umount passed
Mar 13 00:35:32.138404 ignition[1124]: INFO : Ignition finished successfully
Mar 13 00:35:32.139103 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 13 00:35:32.139199 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 13 00:35:32.140693 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 13 00:35:32.140776 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 13 00:35:32.141805 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 13 00:35:32.141875 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 13 00:35:32.142482 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 13 00:35:32.142517 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 13 00:35:32.143117 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 13 00:35:32.143148 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 13 00:35:32.143835 systemd[1]: Stopped target network.target - Network.
Mar 13 00:35:32.144457 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 13 00:35:32.144492 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 13 00:35:32.145134 systemd[1]: Stopped target paths.target - Path Units.
Mar 13 00:35:32.145822 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 13 00:35:32.145868 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 13 00:35:32.146464 systemd[1]: Stopped target slices.target - Slice Units.
Mar 13 00:35:32.147100 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 13 00:35:32.147781 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 13 00:35:32.147810 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 13 00:35:32.148411 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 13 00:35:32.148440 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 13 00:35:32.149051 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 13 00:35:32.149087 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 13 00:35:32.149727 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 13 00:35:32.149757 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 13 00:35:32.150364 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 13 00:35:32.150399 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 13 00:35:32.151132 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 13 00:35:32.151848 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 13 00:35:32.159084 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 13 00:35:32.159184 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 13 00:35:32.162508 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 13 00:35:32.162781 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 13 00:35:32.162867 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 13 00:35:32.164368 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 13 00:35:32.164838 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 13 00:35:32.165738 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 13 00:35:32.165777 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 13 00:35:32.167235 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 13 00:35:32.167664 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 13 00:35:32.167705 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 13 00:35:32.168102 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 13 00:35:32.168138 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 13 00:35:32.168551 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 13 00:35:32.168593 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 13 00:35:32.168947 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 13 00:35:32.168974 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 13 00:35:32.171677 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 13 00:35:32.173362 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 13 00:35:32.173410 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 13 00:35:32.182836 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 13 00:35:32.183909 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 13 00:35:32.184724 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 13 00:35:32.184781 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 13 00:35:32.185369 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 13 00:35:32.185396 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 13 00:35:32.186051 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 13 00:35:32.186087 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 13 00:35:32.187058 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 13 00:35:32.187090 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 13 00:35:32.188187 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 13 00:35:32.188222 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 13 00:35:32.190662 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 13 00:35:32.191022 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 13 00:35:32.191065 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 13 00:35:32.194070 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 13 00:35:32.194113 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 13 00:35:32.195775 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 13 00:35:32.195815 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 13 00:35:32.196631 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 13 00:35:32.196663 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 13 00:35:32.197452 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:35:32.197483 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:35:32.200791 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 13 00:35:32.200836 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Mar 13 00:35:32.200868 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 13 00:35:32.200899 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 13 00:35:32.201178 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 13 00:35:32.202070 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 13 00:35:32.203069 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 13 00:35:32.203140 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 13 00:35:32.204530 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 13 00:35:32.206243 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 13 00:35:32.219582 systemd[1]: Switching root.
Mar 13 00:35:32.260737 systemd-journald[225]: Journal stopped
Mar 13 00:35:33.422864 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Mar 13 00:35:33.422949 kernel: SELinux: policy capability network_peer_controls=1
Mar 13 00:35:33.422963 kernel: SELinux: policy capability open_perms=1
Mar 13 00:35:33.422973 kernel: SELinux: policy capability extended_socket_class=1
Mar 13 00:35:33.422983 kernel: SELinux: policy capability always_check_network=0
Mar 13 00:35:33.422996 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 13 00:35:33.423006 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 13 00:35:33.423018 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 13 00:35:33.423030 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 13 00:35:33.423039 kernel: SELinux: policy capability userspace_initial_context=0
Mar 13 00:35:33.423049 kernel: audit: type=1403 audit(1773362132.468:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 13 00:35:33.423063 systemd[1]: Successfully loaded SELinux policy in 64.554ms.
Mar 13 00:35:33.423085 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.784ms.
Mar 13 00:35:33.423098 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 13 00:35:33.423108 systemd[1]: Detected virtualization kvm.
Mar 13 00:35:33.423120 systemd[1]: Detected architecture x86-64.
Mar 13 00:35:33.423131 systemd[1]: Detected first boot.
Mar 13 00:35:33.423140 systemd[1]: Hostname set to .
Mar 13 00:35:33.423150 systemd[1]: Initializing machine ID from VM UUID.
Mar 13 00:35:33.423161 zram_generator::config[1167]: No configuration found.
Mar 13 00:35:33.423172 kernel: Guest personality initialized and is inactive
Mar 13 00:35:33.423182 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Mar 13 00:35:33.423193 kernel: Initialized host personality
Mar 13 00:35:33.423202 kernel: NET: Registered PF_VSOCK protocol family
Mar 13 00:35:33.423214 systemd[1]: Populated /etc with preset unit settings.
Mar 13 00:35:33.423225 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 13 00:35:33.423235 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 13 00:35:33.423246 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 13 00:35:33.423256 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 13 00:35:33.423269 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 13 00:35:33.423280 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 13 00:35:33.423291 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 13 00:35:33.423303 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 13 00:35:33.423313 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 13 00:35:33.423323 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 13 00:35:33.423334 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 13 00:35:33.423345 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 13 00:35:33.423355 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 13 00:35:33.423365 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 13 00:35:33.423375 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 13 00:35:33.423387 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 13 00:35:33.423398 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 13 00:35:33.423409 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 13 00:35:33.423419 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 13 00:35:33.423430 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 13 00:35:33.423440 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 13 00:35:33.423450 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 13 00:35:33.423466 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 13 00:35:33.423476 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 13 00:35:33.423486 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 13 00:35:33.423496 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:35:33.423509 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 13 00:35:33.423519 systemd[1]: Reached target slices.target - Slice Units.
Mar 13 00:35:33.423530 systemd[1]: Reached target swap.target - Swaps.
Mar 13 00:35:33.423540 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 13 00:35:33.423550 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 13 00:35:33.423562 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 13 00:35:33.428484 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 13 00:35:33.428500 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 13 00:35:33.428510 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 13 00:35:33.428521 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 13 00:35:33.428532 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 13 00:35:33.428543 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 13 00:35:33.428553 systemd[1]: Mounting media.mount - External Media Directory...
Mar 13 00:35:33.428628 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:35:33.428645 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 13 00:35:33.428660 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 13 00:35:33.428670 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 13 00:35:33.428682 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 13 00:35:33.428692 systemd[1]: Reached target machines.target - Containers.
Mar 13 00:35:33.428702 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 13 00:35:33.428712 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:35:33.428722 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 13 00:35:33.428733 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 13 00:35:33.428745 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:35:33.428754 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 13 00:35:33.428768 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:35:33.428779 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 13 00:35:33.428789 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:35:33.428800 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 13 00:35:33.428811 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 13 00:35:33.428820 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 13 00:35:33.428832 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 13 00:35:33.428842 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 13 00:35:33.428854 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:35:33.428867 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 13 00:35:33.428877 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 13 00:35:33.428888 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 13 00:35:33.428898 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 13 00:35:33.428908 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 13 00:35:33.428918 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 13 00:35:33.428928 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 13 00:35:33.428940 systemd[1]: Stopped verity-setup.service.
Mar 13 00:35:33.428951 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:35:33.428962 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 13 00:35:33.428972 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 13 00:35:33.428982 systemd[1]: Mounted media.mount - External Media Directory.
Mar 13 00:35:33.428992 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 13 00:35:33.429001 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 13 00:35:33.429012 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 13 00:35:33.429022 kernel: loop: module loaded
Mar 13 00:35:33.429036 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 13 00:35:33.429046 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 13 00:35:33.429056 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 13 00:35:33.429067 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:35:33.429076 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:35:33.429087 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:35:33.429097 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:35:33.429107 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:35:33.429117 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:35:33.429129 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 13 00:35:33.429141 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 13 00:35:33.429155 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 13 00:35:33.429165 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 13 00:35:33.429176 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 13 00:35:33.429186 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 13 00:35:33.429224 systemd-journald[1241]: Collecting audit messages is disabled.
Mar 13 00:35:33.429250 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 13 00:35:33.429261 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 13 00:35:33.429273 kernel: fuse: init (API version 7.41)
Mar 13 00:35:33.429282 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 13 00:35:33.429292 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 13 00:35:33.429302 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:35:33.429315 systemd-journald[1241]: Journal started
Mar 13 00:35:33.429336 systemd-journald[1241]: Runtime Journal (/run/log/journal/80894e66ff394dea97d4bf6d1d592e6b) is 8M, max 78M, 70M free.
Mar 13 00:35:33.130385 systemd[1]: Queued start job for default target multi-user.target.
Mar 13 00:35:33.151436 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 13 00:35:33.151986 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 13 00:35:33.436223 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 13 00:35:33.436261 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 13 00:35:33.442577 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 13 00:35:33.442617 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 13 00:35:33.449583 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 13 00:35:33.459940 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 13 00:35:33.470632 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 13 00:35:33.470679 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 13 00:35:33.471984 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 13 00:35:33.472613 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 13 00:35:33.472743 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 13 00:35:33.473274 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 13 00:35:33.473906 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 13 00:35:33.482899 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 13 00:35:33.488722 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 13 00:35:33.492603 kernel: loop0: detected capacity change from 0 to 110984
Mar 13 00:35:33.492641 kernel: ACPI: bus type drm_connector registered
Mar 13 00:35:33.504721 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 13 00:35:33.512069 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 13 00:35:33.513000 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 13 00:35:33.513162 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 13 00:35:33.515767 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 13 00:35:33.535608 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 13 00:35:33.539735 systemd-journald[1241]: Time spent on flushing to /var/log/journal/80894e66ff394dea97d4bf6d1d592e6b is 56.323ms for 1753 entries.
Mar 13 00:35:33.539735 systemd-journald[1241]: System Journal (/var/log/journal/80894e66ff394dea97d4bf6d1d592e6b) is 8M, max 584.8M, 576.8M free.
Mar 13 00:35:33.624724 systemd-journald[1241]: Received client request to flush runtime journal.
Mar 13 00:35:33.624781 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 13 00:35:33.624804 kernel: loop1: detected capacity change from 0 to 228704
Mar 13 00:35:33.624818 kernel: loop2: detected capacity change from 0 to 1640
Mar 13 00:35:33.559176 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 13 00:35:33.561357 systemd-tmpfiles[1273]: ACLs are not supported, ignoring.
Mar 13 00:35:33.561368 systemd-tmpfiles[1273]: ACLs are not supported, ignoring.
Mar 13 00:35:33.567736 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 13 00:35:33.572907 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 13 00:35:33.617957 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:35:33.626705 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 13 00:35:33.638271 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 13 00:35:33.640744 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 13 00:35:33.642767 kernel: loop3: detected capacity change from 0 to 128560
Mar 13 00:35:33.665768 systemd-tmpfiles[1318]: ACLs are not supported, ignoring.
Mar 13 00:35:33.666656 systemd-tmpfiles[1318]: ACLs are not supported, ignoring.
Mar 13 00:35:33.671355 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 13 00:35:33.681958 kernel: loop4: detected capacity change from 0 to 110984
Mar 13 00:35:33.719628 kernel: loop5: detected capacity change from 0 to 228704
Mar 13 00:35:33.763606 kernel: loop6: detected capacity change from 0 to 1640
Mar 13 00:35:33.773635 kernel: loop7: detected capacity change from 0 to 128560
Mar 13 00:35:33.800099 (sd-merge)[1322]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'.
Mar 13 00:35:33.801068 (sd-merge)[1322]: Merged extensions into '/usr'.
Mar 13 00:35:33.805613 systemd[1]: Reload requested from client PID 1272 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 13 00:35:33.805735 systemd[1]: Reloading...
Mar 13 00:35:33.897929 zram_generator::config[1348]: No configuration found.
Mar 13 00:35:34.117690 ldconfig[1268]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 13 00:35:34.139916 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 13 00:35:34.139990 systemd[1]: Reloading finished in 333 ms.
Mar 13 00:35:34.158218 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 13 00:35:34.159014 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 13 00:35:34.159746 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 13 00:35:34.168426 systemd[1]: Starting ensure-sysext.service...
Mar 13 00:35:34.169726 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 13 00:35:34.173803 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 13 00:35:34.193386 systemd[1]: Reload requested from client PID 1392 ('systemctl') (unit ensure-sysext.service)...
Mar 13 00:35:34.193400 systemd[1]: Reloading...
Mar 13 00:35:34.198774 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 13 00:35:34.198797 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 13 00:35:34.198993 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 13 00:35:34.199188 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 13 00:35:34.204990 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 13 00:35:34.205201 systemd-tmpfiles[1393]: ACLs are not supported, ignoring.
Mar 13 00:35:34.205248 systemd-tmpfiles[1393]: ACLs are not supported, ignoring.
Mar 13 00:35:34.209394 systemd-tmpfiles[1393]: Detected autofs mount point /boot during canonicalization of boot.
Mar 13 00:35:34.209405 systemd-tmpfiles[1393]: Skipping /boot
Mar 13 00:35:34.210107 systemd-udevd[1394]: Using default interface naming scheme 'v255'.
Mar 13 00:35:34.218389 systemd-tmpfiles[1393]: Detected autofs mount point /boot during canonicalization of boot.
Mar 13 00:35:34.218400 systemd-tmpfiles[1393]: Skipping /boot
Mar 13 00:35:34.234603 zram_generator::config[1418]: No configuration found.
Mar 13 00:35:34.451587 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Mar 13 00:35:34.460587 kernel: mousedev: PS/2 mouse device common for all mice
Mar 13 00:35:34.480614 kernel: ACPI: button: Power Button [PWRF]
Mar 13 00:35:34.525723 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 13 00:35:34.526274 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 13 00:35:34.526319 systemd[1]: Reloading finished in 332 ms.
Mar 13 00:35:34.534005 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 13 00:35:34.534766 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 13 00:35:34.567609 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Mar 13 00:35:34.567857 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 13 00:35:34.571625 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 13 00:35:34.597431 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:35:34.600088 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 13 00:35:34.604826 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 13 00:35:34.606751 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:35:34.607774 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:35:34.613142 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:35:34.621775 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:35:34.622726 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:35:34.626935 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 13 00:35:34.627375 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:35:34.632946 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 13 00:35:34.636214 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 13 00:35:34.641962 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 13 00:35:34.646788 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 13 00:35:34.647628 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:35:34.649739 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:35:34.649892 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:35:34.651066 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:35:34.651217 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:35:34.651927 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:35:34.652103 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:35:34.656841 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:35:34.657012 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:35:34.660762 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:35:34.670642 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:35:34.680313 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:35:34.680850 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:35:34.680958 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:35:34.681044 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:35:34.688211 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 13 00:35:34.692561 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:35:34.692748 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:35:34.696794 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:35:34.697105 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:35:34.703430 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 13 00:35:34.704769 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm...
Mar 13 00:35:34.705657 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:35:34.705794 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:35:34.705894 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 13 00:35:34.706012 systemd[1]: Reached target time-set.target - System Time Set.
Mar 13 00:35:34.708723 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 13 00:35:34.709138 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:35:34.710617 systemd[1]: Finished ensure-sysext.service.
Mar 13 00:35:34.712090 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 13 00:35:34.722462 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:35:34.728943 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:35:34.730059 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 13 00:35:34.735456 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 13 00:35:34.739913 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:35:34.741514 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:35:34.742872 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:35:34.745424 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 13 00:35:34.754374 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:35:34.755729 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:35:34.756556 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 13 00:35:34.756732 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 13 00:35:34.763155 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:35:34.764589 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 13 00:35:34.764628 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Mar 13 00:35:34.771919 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 13 00:35:34.773625 kernel: PTP clock support registered
Mar 13 00:35:34.781212 augenrules[1567]: No rules
Mar 13 00:35:34.781285 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully.
Mar 13 00:35:34.782069 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm.
Mar 13 00:35:34.783205 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 13 00:35:34.783921 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 13 00:35:34.809589 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Mar 13 00:35:34.814488 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 13 00:35:34.815070 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 13 00:35:34.818731 kernel: Console: switching to colour dummy device 80x25
Mar 13 00:35:34.820728 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Mar 13 00:35:34.820922 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 13 00:35:34.820938 kernel: [drm] features: -context_init
Mar 13 00:35:34.827839 kernel: [drm] number of scanouts: 1
Mar 13 00:35:34.827897 kernel: [drm] number of cap sets: 0
Mar 13 00:35:34.833586 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Mar 13 00:35:34.856352 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Mar 13 00:35:34.856425 kernel: Console: switching to colour frame buffer device 160x50
Mar 13 00:35:34.875736 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 13 00:35:34.881505 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 13 00:35:34.889637 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:35:34.889930 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:35:34.892036 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 13 00:35:34.895447 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:35:34.980724 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:35:34.988655 systemd-networkd[1523]: lo: Link UP
Mar 13 00:35:34.988875 systemd-networkd[1523]: lo: Gained carrier
Mar 13 00:35:34.989966 systemd-networkd[1523]: Enumeration completed
Mar 13 00:35:34.990101 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 13 00:35:34.991489 systemd-networkd[1523]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:35:34.991540 systemd-networkd[1523]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:35:34.991858 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 13 00:35:34.993546 systemd-resolved[1524]: Positive Trust Anchors:
Mar 13 00:35:34.993553 systemd-resolved[1524]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 13 00:35:34.993603 systemd-resolved[1524]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 13 00:35:34.994590 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 13 00:35:34.997710 systemd-networkd[1523]: eth0: Link UP
Mar 13 00:35:34.999862 systemd-networkd[1523]: eth0: Gained carrier
Mar 13 00:35:34.999882 systemd-networkd[1523]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:35:35.000908 systemd-resolved[1524]: Using system hostname 'ci-4459-2-4-n-23cf6448d4'.
Mar 13 00:35:35.002703 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 13 00:35:35.002818 systemd[1]: Reached target network.target - Network.
Mar 13 00:35:35.002871 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 13 00:35:35.002923 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 13 00:35:35.003042 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 13 00:35:35.003112 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 13 00:35:35.003182 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Mar 13 00:35:35.003350 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 13 00:35:35.003461 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 13 00:35:35.003518 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 13 00:35:35.003582 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 13 00:35:35.003600 systemd[1]: Reached target paths.target - Path Units.
Mar 13 00:35:35.003646 systemd[1]: Reached target timers.target - Timer Units.
Mar 13 00:35:35.004908 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 13 00:35:35.008954 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 13 00:35:35.011833 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 13 00:35:35.014086 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 13 00:35:35.014667 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 13 00:35:35.017646 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 13 00:35:35.020143 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 13 00:35:35.020406 systemd-networkd[1523]: eth0: DHCPv4 address 10.0.1.99/25, gateway 10.0.1.1 acquired from 10.0.1.1
Mar 13 00:35:35.022206 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 13 00:35:35.024420 systemd[1]: Reached target sockets.target - Socket Units.
Mar 13 00:35:35.025776 systemd[1]: Reached target basic.target - Basic System.
Mar 13 00:35:35.027022 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 13 00:35:35.027047 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 13 00:35:35.034051 systemd[1]: Starting chronyd.service - NTP client/server...
Mar 13 00:35:35.036793 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 13 00:35:35.039663 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 13 00:35:35.046735 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 13 00:35:35.050194 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 13 00:35:35.056278 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 13 00:35:35.058621 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 13 00:35:35.060737 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 13 00:35:35.062289 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 13 00:35:35.069181 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Mar 13 00:35:35.071737 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 13 00:35:35.074871 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 13 00:35:35.079738 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 13 00:35:35.082206 jq[1603]: false
Mar 13 00:35:35.087303 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 13 00:35:35.090167 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 13 00:35:35.094120 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 13 00:35:35.095656 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 13 00:35:35.098626 google_oslogin_nss_cache[1607]: oslogin_cache_refresh[1607]: Refreshing passwd entry cache
Mar 13 00:35:35.100258 systemd[1]: Starting update-engine.service - Update Engine...
Mar 13 00:35:35.100598 oslogin_cache_refresh[1607]: Refreshing passwd entry cache
Mar 13 00:35:35.108653 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 13 00:35:35.110095 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 13 00:35:35.114074 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 13 00:35:35.114306 google_oslogin_nss_cache[1607]: oslogin_cache_refresh[1607]: Failure getting users, quitting
Mar 13 00:35:35.114356 oslogin_cache_refresh[1607]: Failure getting users, quitting
Mar 13 00:35:35.114409 google_oslogin_nss_cache[1607]: oslogin_cache_refresh[1607]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 13 00:35:35.115393 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 13 00:35:35.115472 oslogin_cache_refresh[1607]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 13 00:35:35.115624 google_oslogin_nss_cache[1607]: oslogin_cache_refresh[1607]: Refreshing group entry cache
Mar 13 00:35:35.115515 oslogin_cache_refresh[1607]: Refreshing group entry cache
Mar 13 00:35:35.116614 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 13 00:35:35.121956 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 13 00:35:35.122123 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 13 00:35:35.125351 google_oslogin_nss_cache[1607]: oslogin_cache_refresh[1607]: Failure getting groups, quitting
Mar 13 00:35:35.125351 google_oslogin_nss_cache[1607]: oslogin_cache_refresh[1607]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Mar 13 00:35:35.124702 oslogin_cache_refresh[1607]: Failure getting groups, quitting
Mar 13 00:35:35.124711 oslogin_cache_refresh[1607]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Mar 13 00:35:35.126802 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Mar 13 00:35:35.126977 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Mar 13 00:35:35.133690 systemd[1]: motdgen.service: Deactivated successfully.
Mar 13 00:35:35.135650 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 13 00:35:35.143812 extend-filesystems[1604]: Found /dev/vda6
Mar 13 00:35:35.152927 jq[1615]: true
Mar 13 00:35:35.163978 extend-filesystems[1604]: Found /dev/vda9
Mar 13 00:35:35.166142 update_engine[1613]: I20260313 00:35:35.163832 1613 main.cc:92] Flatcar Update Engine starting
Mar 13 00:35:35.168584 extend-filesystems[1604]: Checking size of /dev/vda9
Mar 13 00:35:35.173449 chronyd[1598]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Mar 13 00:35:35.176614 chronyd[1598]: Loaded seccomp filter (level 2)
Mar 13 00:35:35.176767 systemd[1]: Started chronyd.service - NTP client/server.
Mar 13 00:35:35.180275 (ntainerd)[1639]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 13 00:35:35.194524 dbus-daemon[1601]: [system] SELinux support is enabled
Mar 13 00:35:35.194676 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 13 00:35:35.197289 extend-filesystems[1604]: Resized partition /dev/vda9
Mar 13 00:35:35.197745 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 13 00:35:35.197767 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 13 00:35:35.200586 jq[1642]: true
Mar 13 00:35:35.201434 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 13 00:35:35.201455 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 13 00:35:35.206702 tar[1625]: linux-amd64/LICENSE
Mar 13 00:35:35.206702 tar[1625]: linux-amd64/helm
Mar 13 00:35:35.208407 update_engine[1613]: I20260313 00:35:35.207912 1613 update_check_scheduler.cc:74] Next update check in 4m42s
Mar 13 00:35:35.208715 systemd[1]: Started update-engine.service - Update Engine.
Mar 13 00:35:35.217473 extend-filesystems[1649]: resize2fs 1.47.3 (8-Jul-2025)
Mar 13 00:35:35.231766 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks
Mar 13 00:35:35.236191 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 13 00:35:35.262486 systemd-logind[1612]: New seat seat0.
Mar 13 00:35:35.264180 systemd-logind[1612]: Watching system buttons on /dev/input/event3 (Power Button)
Mar 13 00:35:35.266433 systemd-logind[1612]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 13 00:35:35.266627 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 13 00:35:35.410851 locksmithd[1652]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 13 00:35:35.449518 containerd[1639]: time="2026-03-13T00:35:35Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 13 00:35:35.459739 containerd[1639]: time="2026-03-13T00:35:35.458612090Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Mar 13 00:35:35.468145 containerd[1639]: time="2026-03-13T00:35:35.468100778Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.169µs"
Mar 13 00:35:35.468145 containerd[1639]: time="2026-03-13T00:35:35.468140260Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 13 00:35:35.479903 containerd[1639]: time="2026-03-13T00:35:35.468165722Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483136509Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483187796Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483215227Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483264783Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483274831Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483478946Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483491571Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483500808Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483508318Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 13 00:35:35.483580 containerd[1639]: time="2026-03-13T00:35:35.483582472Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 13 00:35:35.501554 containerd[1639]: time="2026-03-13T00:35:35.501422966Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 13 00:35:35.501554 containerd[1639]: time="2026-03-13T00:35:35.501481818Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 13 00:35:35.501554 containerd[1639]: time="2026-03-13T00:35:35.501493280Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 13 00:35:35.501554 containerd[1639]: time="2026-03-13T00:35:35.501521308Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 13 00:35:35.502551 bash[1667]: Updated "/home/core/.ssh/authorized_keys"
Mar 13 00:35:35.502864 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 13 00:35:35.505816 containerd[1639]: time="2026-03-13T00:35:35.502771579Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 13 00:35:35.505816 containerd[1639]: time="2026-03-13T00:35:35.503051469Z" level=info msg="metadata content store policy set" policy=shared
Mar 13 00:35:35.510849 systemd[1]: Starting sshkeys.service...
Mar 13 00:35:35.532280 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 13 00:35:35.537829 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 13 00:35:35.554580 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 13 00:35:35.566764 containerd[1639]: time="2026-03-13T00:35:35.566726236Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 13 00:35:35.566894 containerd[1639]: time="2026-03-13T00:35:35.566794244Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 13 00:35:35.566894 containerd[1639]: time="2026-03-13T00:35:35.566812109Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 13 00:35:35.566894 containerd[1639]: time="2026-03-13T00:35:35.566840155Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 13 00:35:35.566894 containerd[1639]: time="2026-03-13T00:35:35.566851395Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 13 00:35:35.566894 containerd[1639]: time="2026-03-13T00:35:35.566863030Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 13 00:35:35.566894 containerd[1639]: time="2026-03-13T00:35:35.566873424Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 13 00:35:35.567050 containerd[1639]: time="2026-03-13T00:35:35.566897148Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 13 00:35:35.567050 containerd[1639]: time="2026-03-13T00:35:35.566913761Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 13 00:35:35.567050 containerd[1639]: time="2026-03-13T00:35:35.566924688Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 13 00:35:35.567050 containerd[1639]: time="2026-03-13T00:35:35.566933776Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 13 00:35:35.567050 containerd[1639]: time="2026-03-13T00:35:35.566944909Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567059529Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567077125Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567089609Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567098977Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567114709Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567125988Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567135766Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567144278Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567154401Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567162790Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567171329Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 13 00:35:35.567215 containerd[1639]: time="2026-03-13T00:35:35.567210715Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 13 00:35:35.567677 containerd[1639]: time="2026-03-13T00:35:35.567222001Z" level=info msg="Start snapshots syncer" Mar 13 00:35:35.567677 containerd[1639]: time="2026-03-13T00:35:35.567241775Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 13 00:35:35.567677 containerd[1639]: time="2026-03-13T00:35:35.567465443Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefine
dVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 13 00:35:35.568052 containerd[1639]: time="2026-03-13T00:35:35.567510337Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 13 00:35:35.570944 containerd[1639]: time="2026-03-13T00:35:35.570922380Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 13 00:35:35.571050 containerd[1639]: time="2026-03-13T00:35:35.571036874Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571061552Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571071613Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571080080Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571090125Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571099452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 13 00:35:35.571193 containerd[1639]: 
time="2026-03-13T00:35:35.571109914Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571132133Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571141742Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571151227Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 13 00:35:35.571193 containerd[1639]: time="2026-03-13T00:35:35.571184028Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571198065Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571206204Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571214144Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571221779Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571230368Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571243716Z" level=info msg="loading plugin" 
id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571257065Z" level=info msg="runtime interface created" Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571261550Z" level=info msg="created NRI interface" Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571269027Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571279766Z" level=info msg="Connect containerd service" Mar 13 00:35:35.571613 containerd[1639]: time="2026-03-13T00:35:35.571295700Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 13 00:35:35.572497 containerd[1639]: time="2026-03-13T00:35:35.571930642Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682059172Z" level=info msg="Start subscribing containerd event" Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682101221Z" level=info msg="Start recovering state" Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682182087Z" level=info msg="Start event monitor" Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682192104Z" level=info msg="Start cni network conf syncer for default" Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682198171Z" level=info msg="Start streaming server" Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682210147Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682216295Z" level=info msg="runtime interface starting up..." 
Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682221201Z" level=info msg="starting plugins..." Mar 13 00:35:35.682454 containerd[1639]: time="2026-03-13T00:35:35.682231983Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 13 00:35:35.683087 containerd[1639]: time="2026-03-13T00:35:35.683045231Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 13 00:35:35.683087 containerd[1639]: time="2026-03-13T00:35:35.683085507Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 13 00:35:35.683210 systemd[1]: Started containerd.service - containerd container runtime. Mar 13 00:35:35.684804 containerd[1639]: time="2026-03-13T00:35:35.684294348Z" level=info msg="containerd successfully booted in 0.235083s" Mar 13 00:35:35.773586 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Mar 13 00:35:35.802340 extend-filesystems[1649]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 13 00:35:35.802340 extend-filesystems[1649]: old_desc_blocks = 1, new_desc_blocks = 6 Mar 13 00:35:35.802340 extend-filesystems[1649]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Mar 13 00:35:35.807471 extend-filesystems[1604]: Resized filesystem in /dev/vda9 Mar 13 00:35:35.803156 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 13 00:35:35.803376 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 13 00:35:35.856553 sshd_keygen[1633]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 13 00:35:35.877950 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 13 00:35:35.882658 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 13 00:35:35.897735 systemd[1]: issuegen.service: Deactivated successfully. Mar 13 00:35:35.898039 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 13 00:35:35.902844 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Mar 13 00:35:35.913142 tar[1625]: linux-amd64/README.md Mar 13 00:35:35.922717 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 13 00:35:35.926473 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 13 00:35:35.929841 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 13 00:35:35.936751 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 13 00:35:35.937413 systemd[1]: Reached target getty.target - Login Prompts. Mar 13 00:35:36.106606 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 13 00:35:36.565596 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 13 00:35:36.697771 systemd-networkd[1523]: eth0: Gained IPv6LL Mar 13 00:35:36.700424 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 13 00:35:36.703615 systemd[1]: Reached target network-online.target - Network is Online. Mar 13 00:35:36.706648 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:35:36.710539 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 13 00:35:36.750099 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 13 00:35:37.986346 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:35:37.994057 (kubelet)[1738]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:35:38.119948 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 13 00:35:38.398061 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 13 00:35:38.402901 systemd[1]: Started sshd@0-10.0.1.99:22-4.153.228.146:58486.service - OpenSSH per-connection server daemon (4.153.228.146:58486). 
Mar 13 00:35:38.574619 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 13 00:35:38.797859 kubelet[1738]: E0313 00:35:38.797769 1738 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:35:38.800128 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:35:38.800252 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:35:38.800546 systemd[1]: kubelet.service: Consumed 1.056s CPU time, 266.5M memory peak. Mar 13 00:35:38.939466 sshd[1745]: Accepted publickey for core from 4.153.228.146 port 58486 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:38.944547 sshd-session[1745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:38.953591 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 13 00:35:38.956617 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 13 00:35:38.975081 systemd-logind[1612]: New session 1 of user core. Mar 13 00:35:39.000146 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 13 00:35:39.009234 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 13 00:35:39.016895 (systemd)[1753]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 13 00:35:39.019036 systemd-logind[1612]: New session c1 of user core. Mar 13 00:35:39.146489 systemd[1753]: Queued start job for default target default.target. Mar 13 00:35:39.153469 systemd[1753]: Created slice app.slice - User Application Slice. Mar 13 00:35:39.153496 systemd[1753]: Reached target paths.target - Paths. 
Mar 13 00:35:39.153533 systemd[1753]: Reached target timers.target - Timers. Mar 13 00:35:39.154700 systemd[1753]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 13 00:35:39.172254 systemd[1753]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 13 00:35:39.172352 systemd[1753]: Reached target sockets.target - Sockets. Mar 13 00:35:39.172388 systemd[1753]: Reached target basic.target - Basic System. Mar 13 00:35:39.172418 systemd[1753]: Reached target default.target - Main User Target. Mar 13 00:35:39.172442 systemd[1753]: Startup finished in 148ms. Mar 13 00:35:39.173204 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 13 00:35:39.182726 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 13 00:35:39.491993 systemd[1]: Started sshd@1-10.0.1.99:22-4.153.228.146:58494.service - OpenSSH per-connection server daemon (4.153.228.146:58494). Mar 13 00:35:40.049855 sshd[1764]: Accepted publickey for core from 4.153.228.146 port 58494 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:40.052919 sshd-session[1764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:40.060083 systemd-logind[1612]: New session 2 of user core. Mar 13 00:35:40.071807 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 13 00:35:40.341671 sshd[1767]: Connection closed by 4.153.228.146 port 58494 Mar 13 00:35:40.341405 sshd-session[1764]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:40.349586 systemd[1]: sshd@1-10.0.1.99:22-4.153.228.146:58494.service: Deactivated successfully. Mar 13 00:35:40.352932 systemd[1]: session-2.scope: Deactivated successfully. Mar 13 00:35:40.356210 systemd-logind[1612]: Session 2 logged out. Waiting for processes to exit. Mar 13 00:35:40.357778 systemd-logind[1612]: Removed session 2. 
Mar 13 00:35:40.449376 systemd[1]: Started sshd@2-10.0.1.99:22-4.153.228.146:58506.service - OpenSSH per-connection server daemon (4.153.228.146:58506). Mar 13 00:35:40.984900 sshd[1773]: Accepted publickey for core from 4.153.228.146 port 58506 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:40.985998 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:40.991192 systemd-logind[1612]: New session 3 of user core. Mar 13 00:35:41.001811 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 13 00:35:41.281537 sshd[1780]: Connection closed by 4.153.228.146 port 58506 Mar 13 00:35:41.282270 sshd-session[1773]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:41.289457 systemd-logind[1612]: Session 3 logged out. Waiting for processes to exit. Mar 13 00:35:41.289958 systemd[1]: sshd@2-10.0.1.99:22-4.153.228.146:58506.service: Deactivated successfully. Mar 13 00:35:41.292222 systemd[1]: session-3.scope: Deactivated successfully. Mar 13 00:35:41.294869 systemd-logind[1612]: Removed session 3. 
Mar 13 00:35:42.134650 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 13 00:35:42.155193 coreos-metadata[1600]: Mar 13 00:35:42.154 WARN failed to locate config-drive, using the metadata service API instead Mar 13 00:35:42.203940 coreos-metadata[1600]: Mar 13 00:35:42.201 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 13 00:35:42.593623 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 13 00:35:42.609984 coreos-metadata[1682]: Mar 13 00:35:42.609 WARN failed to locate config-drive, using the metadata service API instead Mar 13 00:35:42.628002 coreos-metadata[1682]: Mar 13 00:35:42.627 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 13 00:35:44.035798 coreos-metadata[1600]: Mar 13 00:35:44.035 INFO Fetch successful Mar 13 00:35:44.035798 coreos-metadata[1600]: Mar 13 00:35:44.035 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 13 00:35:44.707521 coreos-metadata[1682]: Mar 13 00:35:44.707 INFO Fetch successful Mar 13 00:35:44.707521 coreos-metadata[1682]: Mar 13 00:35:44.707 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 13 00:35:45.399338 coreos-metadata[1600]: Mar 13 00:35:45.399 INFO Fetch successful Mar 13 00:35:45.399338 coreos-metadata[1600]: Mar 13 00:35:45.399 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 13 00:35:46.105022 coreos-metadata[1682]: Mar 13 00:35:46.104 INFO Fetch successful Mar 13 00:35:46.107744 unknown[1682]: wrote ssh authorized keys file for user: core Mar 13 00:35:46.137935 update-ssh-keys[1789]: Updated "/home/core/.ssh/authorized_keys" Mar 13 00:35:46.139467 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Mar 13 00:35:46.140658 coreos-metadata[1600]: Mar 13 00:35:46.140 INFO Fetch successful Mar 13 00:35:46.140658 coreos-metadata[1600]: Mar 13 00:35:46.140 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 13 00:35:46.141887 systemd[1]: Finished sshkeys.service. Mar 13 00:35:46.822686 coreos-metadata[1600]: Mar 13 00:35:46.822 INFO Fetch successful Mar 13 00:35:46.822686 coreos-metadata[1600]: Mar 13 00:35:46.822 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 13 00:35:47.641429 coreos-metadata[1600]: Mar 13 00:35:47.641 INFO Fetch successful Mar 13 00:35:47.641429 coreos-metadata[1600]: Mar 13 00:35:47.641 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 13 00:35:48.333215 coreos-metadata[1600]: Mar 13 00:35:48.333 INFO Fetch successful Mar 13 00:35:48.360091 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 13 00:35:48.362016 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 13 00:35:48.362424 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 13 00:35:48.362710 systemd[1]: Startup finished in 4.075s (kernel) + 14.825s (initrd) + 15.957s (userspace) = 34.859s. Mar 13 00:35:49.052267 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 13 00:35:49.055809 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:35:49.185404 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 13 00:35:49.194995 (kubelet)[1806]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:35:49.233392 kubelet[1806]: E0313 00:35:49.233325 1806 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:35:49.238204 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:35:49.238408 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:35:49.239059 systemd[1]: kubelet.service: Consumed 138ms CPU time, 108.4M memory peak. Mar 13 00:35:51.395818 systemd[1]: Started sshd@3-10.0.1.99:22-4.153.228.146:46992.service - OpenSSH per-connection server daemon (4.153.228.146:46992). Mar 13 00:35:51.954821 sshd[1815]: Accepted publickey for core from 4.153.228.146 port 46992 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:51.956219 sshd-session[1815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:51.963437 systemd-logind[1612]: New session 4 of user core. Mar 13 00:35:51.968728 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 13 00:35:52.245844 sshd[1818]: Connection closed by 4.153.228.146 port 46992 Mar 13 00:35:52.248069 sshd-session[1815]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:52.257172 systemd[1]: sshd@3-10.0.1.99:22-4.153.228.146:46992.service: Deactivated successfully. Mar 13 00:35:52.261501 systemd[1]: session-4.scope: Deactivated successfully. Mar 13 00:35:52.263633 systemd-logind[1612]: Session 4 logged out. Waiting for processes to exit. Mar 13 00:35:52.265910 systemd-logind[1612]: Removed session 4. 
Mar 13 00:35:52.353280 systemd[1]: Started sshd@4-10.0.1.99:22-4.153.228.146:47006.service - OpenSSH per-connection server daemon (4.153.228.146:47006). Mar 13 00:35:52.889669 sshd[1824]: Accepted publickey for core from 4.153.228.146 port 47006 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:52.891167 sshd-session[1824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:52.897521 systemd-logind[1612]: New session 5 of user core. Mar 13 00:35:52.909156 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 13 00:35:53.174303 sshd[1827]: Connection closed by 4.153.228.146 port 47006 Mar 13 00:35:53.174900 sshd-session[1824]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:53.178786 systemd-logind[1612]: Session 5 logged out. Waiting for processes to exit. Mar 13 00:35:53.179643 systemd[1]: sshd@4-10.0.1.99:22-4.153.228.146:47006.service: Deactivated successfully. Mar 13 00:35:53.181711 systemd[1]: session-5.scope: Deactivated successfully. Mar 13 00:35:53.183324 systemd-logind[1612]: Removed session 5. Mar 13 00:35:53.283488 systemd[1]: Started sshd@5-10.0.1.99:22-4.153.228.146:47008.service - OpenSSH per-connection server daemon (4.153.228.146:47008). Mar 13 00:35:53.797663 sshd[1833]: Accepted publickey for core from 4.153.228.146 port 47008 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:53.798983 sshd-session[1833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:53.805229 systemd-logind[1612]: New session 6 of user core. Mar 13 00:35:53.816047 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 13 00:35:54.083044 sshd[1836]: Connection closed by 4.153.228.146 port 47008 Mar 13 00:35:54.083766 sshd-session[1833]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:54.087672 systemd-logind[1612]: Session 6 logged out. Waiting for processes to exit. 
Mar 13 00:35:54.087936 systemd[1]: sshd@5-10.0.1.99:22-4.153.228.146:47008.service: Deactivated successfully. Mar 13 00:35:54.089964 systemd[1]: session-6.scope: Deactivated successfully. Mar 13 00:35:54.091481 systemd-logind[1612]: Removed session 6. Mar 13 00:35:54.202988 systemd[1]: Started sshd@6-10.0.1.99:22-4.153.228.146:47016.service - OpenSSH per-connection server daemon (4.153.228.146:47016). Mar 13 00:35:54.744319 sshd[1842]: Accepted publickey for core from 4.153.228.146 port 47016 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:54.745934 sshd-session[1842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:54.755052 systemd-logind[1612]: New session 7 of user core. Mar 13 00:35:54.762224 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 13 00:35:54.962071 sudo[1846]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 13 00:35:54.962877 sudo[1846]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:35:54.976086 sudo[1846]: pam_unix(sudo:session): session closed for user root Mar 13 00:35:55.071790 sshd[1845]: Connection closed by 4.153.228.146 port 47016 Mar 13 00:35:55.073385 sshd-session[1842]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:55.085010 systemd[1]: sshd@6-10.0.1.99:22-4.153.228.146:47016.service: Deactivated successfully. Mar 13 00:35:55.089970 systemd[1]: session-7.scope: Deactivated successfully. Mar 13 00:35:55.092761 systemd-logind[1612]: Session 7 logged out. Waiting for processes to exit. Mar 13 00:35:55.096558 systemd-logind[1612]: Removed session 7. Mar 13 00:35:55.181278 systemd[1]: Started sshd@7-10.0.1.99:22-4.153.228.146:47018.service - OpenSSH per-connection server daemon (4.153.228.146:47018). 
Mar 13 00:35:55.687102 sshd[1852]: Accepted publickey for core from 4.153.228.146 port 47018 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:55.688259 sshd-session[1852]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:55.692193 systemd-logind[1612]: New session 8 of user core. Mar 13 00:35:55.698737 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 13 00:35:55.890976 sudo[1857]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 13 00:35:55.891703 sudo[1857]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:35:55.902370 sudo[1857]: pam_unix(sudo:session): session closed for user root Mar 13 00:35:55.914074 sudo[1856]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 13 00:35:55.914543 sudo[1856]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:35:55.936801 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 13 00:35:56.003774 augenrules[1879]: No rules Mar 13 00:35:56.005753 systemd[1]: audit-rules.service: Deactivated successfully. Mar 13 00:35:56.006285 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 13 00:35:56.007877 sudo[1856]: pam_unix(sudo:session): session closed for user root Mar 13 00:35:56.102669 sshd[1855]: Connection closed by 4.153.228.146 port 47018 Mar 13 00:35:56.103219 sshd-session[1852]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:56.107278 systemd[1]: sshd@7-10.0.1.99:22-4.153.228.146:47018.service: Deactivated successfully. Mar 13 00:35:56.109044 systemd[1]: session-8.scope: Deactivated successfully. Mar 13 00:35:56.109976 systemd-logind[1612]: Session 8 logged out. Waiting for processes to exit. Mar 13 00:35:56.111282 systemd-logind[1612]: Removed session 8. 
Mar 13 00:35:56.213871 systemd[1]: Started sshd@8-10.0.1.99:22-4.153.228.146:47022.service - OpenSSH per-connection server daemon (4.153.228.146:47022). Mar 13 00:35:56.740026 sshd[1888]: Accepted publickey for core from 4.153.228.146 port 47022 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:35:56.741918 sshd-session[1888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:56.746513 systemd-logind[1612]: New session 9 of user core. Mar 13 00:35:56.756801 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 13 00:35:56.934356 sudo[1892]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 13 00:35:56.935036 sudo[1892]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:35:57.314814 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 13 00:35:57.337911 (dockerd)[1910]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 13 00:35:57.642112 dockerd[1910]: time="2026-03-13T00:35:57.642050467Z" level=info msg="Starting up" Mar 13 00:35:57.642957 dockerd[1910]: time="2026-03-13T00:35:57.642916233Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 13 00:35:57.658442 dockerd[1910]: time="2026-03-13T00:35:57.657780648Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 13 00:35:57.690931 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2525964207-merged.mount: Deactivated successfully. Mar 13 00:35:57.701745 systemd[1]: var-lib-docker-metacopy\x2dcheck2807211729-merged.mount: Deactivated successfully. Mar 13 00:35:57.723132 dockerd[1910]: time="2026-03-13T00:35:57.723070604Z" level=info msg="Loading containers: start." 
Mar 13 00:35:57.737594 kernel: Initializing XFRM netlink socket Mar 13 00:35:58.020198 systemd-networkd[1523]: docker0: Link UP Mar 13 00:35:58.026661 dockerd[1910]: time="2026-03-13T00:35:58.026628551Z" level=info msg="Loading containers: done." Mar 13 00:35:58.041268 dockerd[1910]: time="2026-03-13T00:35:58.041004882Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 13 00:35:58.041268 dockerd[1910]: time="2026-03-13T00:35:58.041084851Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 13 00:35:58.041268 dockerd[1910]: time="2026-03-13T00:35:58.041148545Z" level=info msg="Initializing buildkit" Mar 13 00:35:58.066681 dockerd[1910]: time="2026-03-13T00:35:58.066644161Z" level=info msg="Completed buildkit initialization" Mar 13 00:35:58.072970 dockerd[1910]: time="2026-03-13T00:35:58.072936420Z" level=info msg="Daemon has completed initialization" Mar 13 00:35:58.073049 dockerd[1910]: time="2026-03-13T00:35:58.073000452Z" level=info msg="API listen on /run/docker.sock" Mar 13 00:35:58.073727 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 13 00:35:58.685246 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck780506978-merged.mount: Deactivated successfully. Mar 13 00:35:58.963015 chronyd[1598]: Selected source PHC0 Mar 13 00:35:59.055901 containerd[1639]: time="2026-03-13T00:35:59.055827031Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\"" Mar 13 00:35:59.374569 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 13 00:35:59.376541 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:35:59.549299 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 13 00:35:59.558061 (kubelet)[2128]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:35:59.608944 kubelet[2128]: E0313 00:35:59.608902 2128 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:35:59.612932 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:35:59.613184 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:35:59.613527 systemd[1]: kubelet.service: Consumed 151ms CPU time, 108.6M memory peak. Mar 13 00:35:59.660300 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1662082725.mount: Deactivated successfully. Mar 13 00:36:01.045633 containerd[1639]: time="2026-03-13T00:36:01.045214549Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:01.047218 containerd[1639]: time="2026-03-13T00:36:01.047197389Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=30116284" Mar 13 00:36:01.048925 containerd[1639]: time="2026-03-13T00:36:01.048906350Z" level=info msg="ImageCreate event name:\"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:01.052799 containerd[1639]: time="2026-03-13T00:36:01.052760565Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:01.053288 containerd[1639]: time="2026-03-13T00:36:01.053265315Z" 
level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"30112785\" in 1.997402244s" Mar 13 00:36:01.053362 containerd[1639]: time="2026-03-13T00:36:01.053350794Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\"" Mar 13 00:36:01.054191 containerd[1639]: time="2026-03-13T00:36:01.054168602Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\"" Mar 13 00:36:02.536171 containerd[1639]: time="2026-03-13T00:36:02.536111908Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:02.537540 containerd[1639]: time="2026-03-13T00:36:02.537303398Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=26021830" Mar 13 00:36:02.538943 containerd[1639]: time="2026-03-13T00:36:02.538920716Z" level=info msg="ImageCreate event name:\"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:02.541597 containerd[1639]: time="2026-03-13T00:36:02.541562834Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:02.542358 containerd[1639]: time="2026-03-13T00:36:02.542334806Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id 
\"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"27678758\" in 1.488138383s" Mar 13 00:36:02.542419 containerd[1639]: time="2026-03-13T00:36:02.542409439Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\"" Mar 13 00:36:02.542850 containerd[1639]: time="2026-03-13T00:36:02.542836671Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\"" Mar 13 00:36:03.739640 containerd[1639]: time="2026-03-13T00:36:03.739014707Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:03.740584 containerd[1639]: time="2026-03-13T00:36:03.740552912Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=20162766" Mar 13 00:36:03.743028 containerd[1639]: time="2026-03-13T00:36:03.743008958Z" level=info msg="ImageCreate event name:\"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:03.746935 containerd[1639]: time="2026-03-13T00:36:03.746908200Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:03.749349 containerd[1639]: time="2026-03-13T00:36:03.749308933Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest 
\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"21819712\" in 1.206381127s" Mar 13 00:36:03.749420 containerd[1639]: time="2026-03-13T00:36:03.749355695Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\"" Mar 13 00:36:03.751126 containerd[1639]: time="2026-03-13T00:36:03.751107752Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\"" Mar 13 00:36:05.502928 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1545994764.mount: Deactivated successfully. Mar 13 00:36:05.879170 containerd[1639]: time="2026-03-13T00:36:05.878547791Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:05.880512 containerd[1639]: time="2026-03-13T00:36:05.880479162Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=31828673" Mar 13 00:36:05.882726 containerd[1639]: time="2026-03-13T00:36:05.882690964Z" level=info msg="ImageCreate event name:\"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:05.886382 containerd[1639]: time="2026-03-13T00:36:05.886353403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:05.886916 containerd[1639]: time="2026-03-13T00:36:05.886892168Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest 
\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"31827666\" in 2.135696634s" Mar 13 00:36:05.886978 containerd[1639]: time="2026-03-13T00:36:05.886967379Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\"" Mar 13 00:36:05.887507 containerd[1639]: time="2026-03-13T00:36:05.887482877Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Mar 13 00:36:06.387395 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3863241538.mount: Deactivated successfully. Mar 13 00:36:07.218667 containerd[1639]: time="2026-03-13T00:36:07.218602747Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:07.220121 containerd[1639]: time="2026-03-13T00:36:07.219925922Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942330" Mar 13 00:36:07.221724 containerd[1639]: time="2026-03-13T00:36:07.221703671Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:07.226372 containerd[1639]: time="2026-03-13T00:36:07.226344810Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:07.227282 containerd[1639]: time="2026-03-13T00:36:07.227261023Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.33974946s" Mar 13 00:36:07.227335 containerd[1639]: time="2026-03-13T00:36:07.227289188Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Mar 13 00:36:07.228026 containerd[1639]: time="2026-03-13T00:36:07.228003624Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 13 00:36:07.733724 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3553422783.mount: Deactivated successfully. Mar 13 00:36:07.744517 containerd[1639]: time="2026-03-13T00:36:07.744403384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:36:07.747111 containerd[1639]: time="2026-03-13T00:36:07.747030794Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321158" Mar 13 00:36:07.748749 containerd[1639]: time="2026-03-13T00:36:07.748675896Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:36:07.752737 containerd[1639]: time="2026-03-13T00:36:07.752563341Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:36:07.754444 containerd[1639]: time="2026-03-13T00:36:07.754162852Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag 
\"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 526.120929ms" Mar 13 00:36:07.754444 containerd[1639]: time="2026-03-13T00:36:07.754222036Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Mar 13 00:36:07.755336 containerd[1639]: time="2026-03-13T00:36:07.755290651Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\"" Mar 13 00:36:08.318183 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4098189010.mount: Deactivated successfully. Mar 13 00:36:09.625110 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 13 00:36:09.627533 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:36:10.389774 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:36:10.393530 (kubelet)[2322]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:36:10.430837 kubelet[2322]: E0313 00:36:10.430772 2322 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:36:10.433171 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:36:10.433299 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:36:10.433639 systemd[1]: kubelet.service: Consumed 145ms CPU time, 108.5M memory peak. 
Mar 13 00:36:10.786916 containerd[1639]: time="2026-03-13T00:36:10.786255027Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:10.787752 containerd[1639]: time="2026-03-13T00:36:10.787729959Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=23718938" Mar 13 00:36:10.789211 containerd[1639]: time="2026-03-13T00:36:10.789189585Z" level=info msg="ImageCreate event name:\"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:10.792638 containerd[1639]: time="2026-03-13T00:36:10.792615128Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:10.794031 containerd[1639]: time="2026-03-13T00:36:10.794010148Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"23716032\" in 3.038675646s" Mar 13 00:36:10.794065 containerd[1639]: time="2026-03-13T00:36:10.794035931Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\"" Mar 13 00:36:13.322251 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:36:13.322386 systemd[1]: kubelet.service: Consumed 145ms CPU time, 108.5M memory peak. Mar 13 00:36:13.330748 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:36:13.362868 systemd[1]: Reload requested from client PID 2372 ('systemctl') (unit session-9.scope)... 
Mar 13 00:36:13.362882 systemd[1]: Reloading... Mar 13 00:36:13.455582 zram_generator::config[2415]: No configuration found. Mar 13 00:36:13.641198 systemd[1]: Reloading finished in 278 ms. Mar 13 00:36:13.737086 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:36:13.745808 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:36:13.747263 systemd[1]: kubelet.service: Deactivated successfully. Mar 13 00:36:13.747742 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:36:13.747852 systemd[1]: kubelet.service: Consumed 93ms CPU time, 98.3M memory peak. Mar 13 00:36:13.750651 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:36:14.963380 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:36:14.969883 (kubelet)[2471]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 13 00:36:15.006507 kubelet[2471]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 00:36:15.006507 kubelet[2471]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 13 00:36:15.006507 kubelet[2471]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 00:36:15.006945 kubelet[2471]: I0313 00:36:15.006546 2471 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 00:36:15.766618 kubelet[2471]: I0313 00:36:15.765872 2471 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 13 00:36:15.766618 kubelet[2471]: I0313 00:36:15.765947 2471 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 00:36:15.767208 kubelet[2471]: I0313 00:36:15.767175 2471 server.go:956] "Client rotation is on, will bootstrap in background" Mar 13 00:36:16.562189 kubelet[2471]: E0313 00:36:16.562154 2471 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.1.99:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.1.99:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 13 00:36:16.571465 kubelet[2471]: I0313 00:36:16.570887 2471 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 13 00:36:16.602935 kubelet[2471]: I0313 00:36:16.602900 2471 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 00:36:16.606355 kubelet[2471]: I0313 00:36:16.606282 2471 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 13 00:36:16.607744 kubelet[2471]: I0313 00:36:16.607696 2471 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 00:36:16.607915 kubelet[2471]: I0313 00:36:16.607736 2471 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-23cf6448d4","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 00:36:16.607915 kubelet[2471]: I0313 00:36:16.607910 2471 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 
00:36:16.608098 kubelet[2471]: I0313 00:36:16.607919 2471 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 00:36:16.608098 kubelet[2471]: I0313 00:36:16.608038 2471 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:36:16.846894 kubelet[2471]: I0313 00:36:16.846740 2471 kubelet.go:480] "Attempting to sync node with API server" Mar 13 00:36:16.846894 kubelet[2471]: I0313 00:36:16.846805 2471 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 00:36:16.846894 kubelet[2471]: I0313 00:36:16.846860 2471 kubelet.go:386] "Adding apiserver pod source" Mar 13 00:36:16.855599 kubelet[2471]: I0313 00:36:16.854764 2471 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 00:36:16.920386 kubelet[2471]: E0313 00:36:16.920346 2471 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.1.99:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-23cf6448d4&limit=500&resourceVersion=0\": dial tcp 10.0.1.99:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 13 00:36:16.921166 kubelet[2471]: E0313 00:36:16.921127 2471 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.1.99:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.1.99:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 13 00:36:16.928466 kubelet[2471]: I0313 00:36:16.927587 2471 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 13 00:36:16.928466 kubelet[2471]: I0313 00:36:16.928341 2471 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 13 00:36:16.930110 
kubelet[2471]: W0313 00:36:16.930096 2471 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 13 00:36:16.935364 kubelet[2471]: I0313 00:36:16.935348 2471 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 13 00:36:16.935485 kubelet[2471]: I0313 00:36:16.935476 2471 server.go:1289] "Started kubelet" Mar 13 00:36:16.945366 kubelet[2471]: I0313 00:36:16.945345 2471 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 00:36:16.946862 kubelet[2471]: E0313 00:36:16.944528 2471 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.1.99:6443/api/v1/namespaces/default/events\": dial tcp 10.0.1.99:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-n-23cf6448d4.189c3f811659ba0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-n-23cf6448d4,UID:ci-4459-2-4-n-23cf6448d4,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-23cf6448d4,},FirstTimestamp:2026-03-13 00:36:16.935442958 +0000 UTC m=+1.960957577,LastTimestamp:2026-03-13 00:36:16.935442958 +0000 UTC m=+1.960957577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-23cf6448d4,}" Mar 13 00:36:16.953152 kubelet[2471]: I0313 00:36:16.952903 2471 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 00:36:16.954206 kubelet[2471]: I0313 00:36:16.954183 2471 server.go:317] "Adding debug handlers to kubelet server" Mar 13 00:36:16.955151 kubelet[2471]: I0313 00:36:16.955138 2471 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 13 00:36:16.955430 kubelet[2471]: E0313 00:36:16.955416 2471 kubelet_node_status.go:466] "Error getting the current node from lister" err="node 
\"ci-4459-2-4-n-23cf6448d4\" not found" Mar 13 00:36:16.957777 kubelet[2471]: I0313 00:36:16.957651 2471 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 13 00:36:16.957777 kubelet[2471]: I0313 00:36:16.957694 2471 reconciler.go:26] "Reconciler: start to sync state" Mar 13 00:36:16.958951 kubelet[2471]: I0313 00:36:16.958888 2471 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 00:36:16.959185 kubelet[2471]: I0313 00:36:16.959168 2471 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 00:36:16.959472 kubelet[2471]: I0313 00:36:16.959430 2471 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 13 00:36:16.960469 kubelet[2471]: E0313 00:36:16.960428 2471 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.99:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-23cf6448d4?timeout=10s\": dial tcp 10.0.1.99:6443: connect: connection refused" interval="200ms" Mar 13 00:36:16.961043 kubelet[2471]: E0313 00:36:16.960945 2471 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.1.99:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.1.99:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 13 00:36:16.961379 kubelet[2471]: I0313 00:36:16.961350 2471 factory.go:223] Registration of the systemd container factory successfully Mar 13 00:36:16.961483 kubelet[2471]: I0313 00:36:16.961460 2471 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 13 
00:36:16.962669 kubelet[2471]: E0313 00:36:16.962541 2471 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 13 00:36:16.963743 kubelet[2471]: I0313 00:36:16.962877 2471 factory.go:223] Registration of the containerd container factory successfully Mar 13 00:36:16.979734 kubelet[2471]: I0313 00:36:16.979700 2471 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 13 00:36:16.980966 kubelet[2471]: I0313 00:36:16.980956 2471 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 13 00:36:16.981050 kubelet[2471]: I0313 00:36:16.981043 2471 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 13 00:36:16.981091 kubelet[2471]: I0313 00:36:16.981086 2471 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:36:16.982329 kubelet[2471]: I0313 00:36:16.982302 2471 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 13 00:36:16.982329 kubelet[2471]: I0313 00:36:16.982329 2471 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 13 00:36:16.982407 kubelet[2471]: I0313 00:36:16.982352 2471 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 13 00:36:16.982407 kubelet[2471]: I0313 00:36:16.982360 2471 kubelet.go:2436] "Starting kubelet main sync loop" Mar 13 00:36:16.982486 kubelet[2471]: E0313 00:36:16.982404 2471 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 00:36:16.983256 kubelet[2471]: E0313 00:36:16.983229 2471 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.1.99:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.1.99:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 13 00:36:16.987910 kubelet[2471]: I0313 00:36:16.987895 2471 policy_none.go:49] "None policy: Start" Mar 13 00:36:16.987986 kubelet[2471]: I0313 00:36:16.987979 2471 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 13 00:36:16.988023 kubelet[2471]: I0313 00:36:16.988019 2471 state_mem.go:35] "Initializing new in-memory state store" Mar 13 00:36:16.995034 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 13 00:36:17.007179 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 13 00:36:17.010634 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 13 00:36:17.018263 kubelet[2471]: E0313 00:36:17.018237 2471 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 13 00:36:17.018408 kubelet[2471]: I0313 00:36:17.018396 2471 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 00:36:17.018434 kubelet[2471]: I0313 00:36:17.018409 2471 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 00:36:17.019208 kubelet[2471]: I0313 00:36:17.019151 2471 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 00:36:17.020065 kubelet[2471]: E0313 00:36:17.020051 2471 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 13 00:36:17.020321 kubelet[2471]: E0313 00:36:17.020311 2471 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-n-23cf6448d4\" not found" Mar 13 00:36:17.098074 systemd[1]: Created slice kubepods-burstable-podc6aa30fa97d853bf410d30775777110e.slice - libcontainer container kubepods-burstable-podc6aa30fa97d853bf410d30775777110e.slice. Mar 13 00:36:17.113202 kubelet[2471]: E0313 00:36:17.113170 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.117331 systemd[1]: Created slice kubepods-burstable-pode121a5d3ce0b74b3dd58765f391582ab.slice - libcontainer container kubepods-burstable-pode121a5d3ce0b74b3dd58765f391582ab.slice. 
Mar 13 00:36:17.119806 kubelet[2471]: I0313 00:36:17.119785 2471 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.120349 kubelet[2471]: E0313 00:36:17.120321 2471 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.99:6443/api/v1/nodes\": dial tcp 10.0.1.99:6443: connect: connection refused" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.129944 kubelet[2471]: E0313 00:36:17.129912 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.133463 systemd[1]: Created slice kubepods-burstable-pod53388ea298f278325c4f4e63339e330a.slice - libcontainer container kubepods-burstable-pod53388ea298f278325c4f4e63339e330a.slice. Mar 13 00:36:17.136366 kubelet[2471]: E0313 00:36:17.136329 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.158840 kubelet[2471]: I0313 00:36:17.158790 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.158840 kubelet[2471]: I0313 00:36:17.158825 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 
00:36:17.158840 kubelet[2471]: I0313 00:36:17.158844 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.158840 kubelet[2471]: I0313 00:36:17.158859 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.159271 kubelet[2471]: I0313 00:36:17.158888 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/53388ea298f278325c4f4e63339e330a-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-23cf6448d4\" (UID: \"53388ea298f278325c4f4e63339e330a\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.159271 kubelet[2471]: I0313 00:36:17.158905 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/53388ea298f278325c4f4e63339e330a-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-23cf6448d4\" (UID: \"53388ea298f278325c4f4e63339e330a\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.159271 kubelet[2471]: I0313 00:36:17.158919 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-usr-share-ca-certificates\") pod 
\"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.159271 kubelet[2471]: I0313 00:36:17.158933 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e121a5d3ce0b74b3dd58765f391582ab-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-23cf6448d4\" (UID: \"e121a5d3ce0b74b3dd58765f391582ab\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.159271 kubelet[2471]: I0313 00:36:17.158957 2471 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/53388ea298f278325c4f4e63339e330a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-23cf6448d4\" (UID: \"53388ea298f278325c4f4e63339e330a\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.161051 kubelet[2471]: E0313 00:36:17.161012 2471 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.99:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-23cf6448d4?timeout=10s\": dial tcp 10.0.1.99:6443: connect: connection refused" interval="400ms" Mar 13 00:36:17.324875 kubelet[2471]: I0313 00:36:17.324803 2471 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.325504 kubelet[2471]: E0313 00:36:17.325448 2471 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.99:6443/api/v1/nodes\": dial tcp 10.0.1.99:6443: connect: connection refused" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.416236 containerd[1639]: time="2026-03-13T00:36:17.416139285Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-23cf6448d4,Uid:c6aa30fa97d853bf410d30775777110e,Namespace:kube-system,Attempt:0,}" Mar 13 00:36:17.431595 containerd[1639]: time="2026-03-13T00:36:17.431523216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-23cf6448d4,Uid:e121a5d3ce0b74b3dd58765f391582ab,Namespace:kube-system,Attempt:0,}" Mar 13 00:36:17.437780 containerd[1639]: time="2026-03-13T00:36:17.437705420Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-23cf6448d4,Uid:53388ea298f278325c4f4e63339e330a,Namespace:kube-system,Attempt:0,}" Mar 13 00:36:17.466171 containerd[1639]: time="2026-03-13T00:36:17.466091659Z" level=info msg="connecting to shim f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669" address="unix:///run/containerd/s/4b01ad3fffc8d595d0fd7ef53a333aaabb5e49fb17ad697536bce1b9869c178c" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:36:17.511603 containerd[1639]: time="2026-03-13T00:36:17.511484012Z" level=info msg="connecting to shim b0cea271265674788d782ba8146e6d9698855357895234de2f2c4b7c17c20b0b" address="unix:///run/containerd/s/a138d033dce33b6bf1a28abff3a5c8c3373d78ba768d2a12c635c55846bd0199" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:36:17.518085 containerd[1639]: time="2026-03-13T00:36:17.517457863Z" level=info msg="connecting to shim ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3" address="unix:///run/containerd/s/500622851d357e0bd3eec6092a7919d787258ecf12b60961c23ae12937075a73" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:36:17.519951 systemd[1]: Started cri-containerd-f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669.scope - libcontainer container f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669. 
Mar 13 00:36:17.557775 systemd[1]: Started cri-containerd-b0cea271265674788d782ba8146e6d9698855357895234de2f2c4b7c17c20b0b.scope - libcontainer container b0cea271265674788d782ba8146e6d9698855357895234de2f2c4b7c17c20b0b. Mar 13 00:36:17.561909 kubelet[2471]: E0313 00:36:17.561862 2471 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.99:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-23cf6448d4?timeout=10s\": dial tcp 10.0.1.99:6443: connect: connection refused" interval="800ms" Mar 13 00:36:17.563538 systemd[1]: Started cri-containerd-ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3.scope - libcontainer container ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3. Mar 13 00:36:17.599098 containerd[1639]: time="2026-03-13T00:36:17.599054637Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-23cf6448d4,Uid:c6aa30fa97d853bf410d30775777110e,Namespace:kube-system,Attempt:0,} returns sandbox id \"f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669\"" Mar 13 00:36:17.607011 containerd[1639]: time="2026-03-13T00:36:17.606974954Z" level=info msg="CreateContainer within sandbox \"f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 13 00:36:17.620736 containerd[1639]: time="2026-03-13T00:36:17.620665616Z" level=info msg="Container a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:17.633105 containerd[1639]: time="2026-03-13T00:36:17.633066137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-23cf6448d4,Uid:53388ea298f278325c4f4e63339e330a,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0cea271265674788d782ba8146e6d9698855357895234de2f2c4b7c17c20b0b\"" Mar 13 00:36:17.639216 containerd[1639]: 
time="2026-03-13T00:36:17.639151944Z" level=info msg="CreateContainer within sandbox \"b0cea271265674788d782ba8146e6d9698855357895234de2f2c4b7c17c20b0b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 13 00:36:17.640018 containerd[1639]: time="2026-03-13T00:36:17.639972863Z" level=info msg="CreateContainer within sandbox \"f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21\"" Mar 13 00:36:17.640863 containerd[1639]: time="2026-03-13T00:36:17.640799033Z" level=info msg="StartContainer for \"a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21\"" Mar 13 00:36:17.641998 containerd[1639]: time="2026-03-13T00:36:17.641978624Z" level=info msg="connecting to shim a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21" address="unix:///run/containerd/s/4b01ad3fffc8d595d0fd7ef53a333aaabb5e49fb17ad697536bce1b9869c178c" protocol=ttrpc version=3 Mar 13 00:36:17.651641 containerd[1639]: time="2026-03-13T00:36:17.651340361Z" level=info msg="Container 99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:17.663800 containerd[1639]: time="2026-03-13T00:36:17.663776744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-23cf6448d4,Uid:e121a5d3ce0b74b3dd58765f391582ab,Namespace:kube-system,Attempt:0,} returns sandbox id \"ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3\"" Mar 13 00:36:17.664450 containerd[1639]: time="2026-03-13T00:36:17.664434342Z" level=info msg="CreateContainer within sandbox \"b0cea271265674788d782ba8146e6d9698855357895234de2f2c4b7c17c20b0b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753\"" Mar 13 00:36:17.665117 containerd[1639]: 
time="2026-03-13T00:36:17.665102393Z" level=info msg="StartContainer for \"99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753\"" Mar 13 00:36:17.666049 containerd[1639]: time="2026-03-13T00:36:17.666031940Z" level=info msg="connecting to shim 99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753" address="unix:///run/containerd/s/a138d033dce33b6bf1a28abff3a5c8c3373d78ba768d2a12c635c55846bd0199" protocol=ttrpc version=3 Mar 13 00:36:17.666750 systemd[1]: Started cri-containerd-a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21.scope - libcontainer container a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21. Mar 13 00:36:17.676332 containerd[1639]: time="2026-03-13T00:36:17.675758327Z" level=info msg="CreateContainer within sandbox \"ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 13 00:36:17.689728 containerd[1639]: time="2026-03-13T00:36:17.689705013Z" level=info msg="Container 4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:17.691722 systemd[1]: Started cri-containerd-99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753.scope - libcontainer container 99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753. 
Mar 13 00:36:17.698709 containerd[1639]: time="2026-03-13T00:36:17.698393604Z" level=info msg="CreateContainer within sandbox \"ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c\"" Mar 13 00:36:17.699363 containerd[1639]: time="2026-03-13T00:36:17.699347427Z" level=info msg="StartContainer for \"4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c\"" Mar 13 00:36:17.700956 containerd[1639]: time="2026-03-13T00:36:17.700914046Z" level=info msg="connecting to shim 4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c" address="unix:///run/containerd/s/500622851d357e0bd3eec6092a7919d787258ecf12b60961c23ae12937075a73" protocol=ttrpc version=3 Mar 13 00:36:17.725766 systemd[1]: Started cri-containerd-4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c.scope - libcontainer container 4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c. 
Mar 13 00:36:17.730307 kubelet[2471]: I0313 00:36:17.730167 2471 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.732948 kubelet[2471]: E0313 00:36:17.732776 2471 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.99:6443/api/v1/nodes\": dial tcp 10.0.1.99:6443: connect: connection refused" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.749081 containerd[1639]: time="2026-03-13T00:36:17.748966941Z" level=info msg="StartContainer for \"a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21\" returns successfully" Mar 13 00:36:17.769116 containerd[1639]: time="2026-03-13T00:36:17.769088230Z" level=info msg="StartContainer for \"99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753\" returns successfully" Mar 13 00:36:17.796935 kubelet[2471]: E0313 00:36:17.796898 2471 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.1.99:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.1.99:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 13 00:36:17.807479 kubelet[2471]: E0313 00:36:17.807449 2471 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.1.99:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.1.99:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 13 00:36:17.835257 containerd[1639]: time="2026-03-13T00:36:17.834611000Z" level=info msg="StartContainer for \"4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c\" returns successfully" Mar 13 00:36:17.990521 kubelet[2471]: E0313 00:36:17.990258 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" 
err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.994535 kubelet[2471]: E0313 00:36:17.994340 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:17.995679 kubelet[2471]: E0313 00:36:17.995667 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:18.537202 kubelet[2471]: I0313 00:36:18.536963 2471 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.006138 kubelet[2471]: E0313 00:36:19.005922 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.006758 kubelet[2471]: E0313 00:36:19.006741 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.009325 kubelet[2471]: E0313 00:36:19.006945 2471 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.500738 kubelet[2471]: E0313 00:36:19.500689 2471 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-n-23cf6448d4\" not found" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.587103 kubelet[2471]: I0313 00:36:19.587059 2471 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.587369 kubelet[2471]: E0313 00:36:19.587232 2471 kubelet_node_status.go:548] "Error updating node status, will retry" err="error 
getting node \"ci-4459-2-4-n-23cf6448d4\": node \"ci-4459-2-4-n-23cf6448d4\" not found" Mar 13 00:36:19.656155 kubelet[2471]: I0313 00:36:19.656110 2471 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.662989 kubelet[2471]: E0313 00:36:19.662896 2471 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.662989 kubelet[2471]: I0313 00:36:19.662957 2471 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.664503 kubelet[2471]: E0313 00:36:19.664473 2471 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-23cf6448d4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.664503 kubelet[2471]: I0313 00:36:19.664491 2471 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.665674 kubelet[2471]: E0313 00:36:19.665659 2471 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-23cf6448d4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:19.923411 kubelet[2471]: I0313 00:36:19.923035 2471 apiserver.go:52] "Watching apiserver" Mar 13 00:36:19.958507 kubelet[2471]: I0313 00:36:19.958471 2471 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 13 00:36:19.998764 kubelet[2471]: I0313 00:36:19.998209 2471 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 
00:36:20.001426 kubelet[2471]: E0313 00:36:20.001311 2471 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-23cf6448d4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:20.506840 update_engine[1613]: I20260313 00:36:20.505848 1613 update_attempter.cc:509] Updating boot flags... Mar 13 00:36:21.564733 systemd[1]: Reload requested from client PID 2762 ('systemctl') (unit session-9.scope)... Mar 13 00:36:21.564769 systemd[1]: Reloading... Mar 13 00:36:21.682605 zram_generator::config[2808]: No configuration found. Mar 13 00:36:21.882599 systemd[1]: Reloading finished in 316 ms. Mar 13 00:36:21.925032 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:36:21.947507 systemd[1]: kubelet.service: Deactivated successfully. Mar 13 00:36:21.948268 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:36:21.948509 systemd[1]: kubelet.service: Consumed 1.220s CPU time, 130.9M memory peak. Mar 13 00:36:21.952362 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:36:22.098904 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:36:22.111097 (kubelet)[2856]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 13 00:36:22.339300 kubelet[2856]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 00:36:22.339300 kubelet[2856]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Mar 13 00:36:22.339300 kubelet[2856]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 00:36:22.339660 kubelet[2856]: I0313 00:36:22.339617 2856 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 00:36:22.350646 kubelet[2856]: I0313 00:36:22.349561 2856 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 13 00:36:22.350646 kubelet[2856]: I0313 00:36:22.349631 2856 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 00:36:22.350646 kubelet[2856]: I0313 00:36:22.350190 2856 server.go:956] "Client rotation is on, will bootstrap in background" Mar 13 00:36:22.353530 kubelet[2856]: I0313 00:36:22.353495 2856 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 13 00:36:22.358058 kubelet[2856]: I0313 00:36:22.358021 2856 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 13 00:36:22.362802 kubelet[2856]: I0313 00:36:22.362774 2856 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 00:36:22.367608 kubelet[2856]: I0313 00:36:22.367052 2856 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 13 00:36:22.367608 kubelet[2856]: I0313 00:36:22.367441 2856 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 00:36:22.368223 kubelet[2856]: I0313 00:36:22.367484 2856 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-23cf6448d4","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 00:36:22.368439 kubelet[2856]: I0313 00:36:22.368422 2856 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 
00:36:22.368538 kubelet[2856]: I0313 00:36:22.368527 2856 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 00:36:22.368718 kubelet[2856]: I0313 00:36:22.368706 2856 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:36:22.369137 kubelet[2856]: I0313 00:36:22.369119 2856 kubelet.go:480] "Attempting to sync node with API server" Mar 13 00:36:22.369278 kubelet[2856]: I0313 00:36:22.369264 2856 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 00:36:22.369400 kubelet[2856]: I0313 00:36:22.369388 2856 kubelet.go:386] "Adding apiserver pod source" Mar 13 00:36:22.369510 kubelet[2856]: I0313 00:36:22.369498 2856 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 00:36:22.374045 kubelet[2856]: I0313 00:36:22.374014 2856 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 13 00:36:22.374954 kubelet[2856]: I0313 00:36:22.374933 2856 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 13 00:36:22.381631 kubelet[2856]: I0313 00:36:22.381154 2856 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 13 00:36:22.381631 kubelet[2856]: I0313 00:36:22.381214 2856 server.go:1289] "Started kubelet" Mar 13 00:36:22.384706 kubelet[2856]: I0313 00:36:22.384692 2856 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 00:36:22.395476 kubelet[2856]: I0313 00:36:22.395437 2856 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 00:36:22.397274 kubelet[2856]: I0313 00:36:22.397253 2856 server.go:317] "Adding debug handlers to kubelet server" Mar 13 00:36:22.402583 kubelet[2856]: I0313 00:36:22.401300 2856 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 13 00:36:22.402583 kubelet[2856]: I0313 00:36:22.401654 2856 ratelimit.go:55] "Setting rate limiting for 
endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 00:36:22.402583 kubelet[2856]: I0313 00:36:22.401834 2856 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 00:36:22.402583 kubelet[2856]: I0313 00:36:22.402029 2856 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 13 00:36:22.408219 kubelet[2856]: I0313 00:36:22.407369 2856 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 13 00:36:22.408219 kubelet[2856]: I0313 00:36:22.407603 2856 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 13 00:36:22.408219 kubelet[2856]: I0313 00:36:22.407957 2856 reconciler.go:26] "Reconciler: start to sync state" Mar 13 00:36:22.408515 kubelet[2856]: I0313 00:36:22.408488 2856 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 13 00:36:22.408515 kubelet[2856]: I0313 00:36:22.408507 2856 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 13 00:36:22.408590 kubelet[2856]: I0313 00:36:22.408526 2856 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 13 00:36:22.408590 kubelet[2856]: I0313 00:36:22.408534 2856 kubelet.go:2436] "Starting kubelet main sync loop" Mar 13 00:36:22.408646 kubelet[2856]: E0313 00:36:22.408620 2856 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 00:36:22.411403 kubelet[2856]: I0313 00:36:22.411135 2856 factory.go:223] Registration of the systemd container factory successfully Mar 13 00:36:22.411636 kubelet[2856]: I0313 00:36:22.411616 2856 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 13 00:36:22.416211 kubelet[2856]: E0313 00:36:22.416184 2856 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 13 00:36:22.416384 kubelet[2856]: I0313 00:36:22.416375 2856 factory.go:223] Registration of the containerd container factory successfully Mar 13 00:36:22.495341 kubelet[2856]: I0313 00:36:22.495307 2856 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 13 00:36:22.495341 kubelet[2856]: I0313 00:36:22.495324 2856 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 13 00:36:22.495341 kubelet[2856]: I0313 00:36:22.495342 2856 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:36:22.495496 kubelet[2856]: I0313 00:36:22.495463 2856 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 13 00:36:22.495496 kubelet[2856]: I0313 00:36:22.495471 2856 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 13 00:36:22.495496 kubelet[2856]: I0313 00:36:22.495486 2856 policy_none.go:49] "None policy: Start" Mar 13 00:36:22.495496 kubelet[2856]: I0313 00:36:22.495496 2856 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 13 00:36:22.495588 kubelet[2856]: I0313 00:36:22.495506 
2856 state_mem.go:35] "Initializing new in-memory state store" Mar 13 00:36:22.495613 kubelet[2856]: I0313 00:36:22.495603 2856 state_mem.go:75] "Updated machine memory state" Mar 13 00:36:22.499328 kubelet[2856]: E0313 00:36:22.499311 2856 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 13 00:36:22.499865 kubelet[2856]: I0313 00:36:22.499842 2856 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 00:36:22.499942 kubelet[2856]: I0313 00:36:22.499854 2856 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 00:36:22.500220 kubelet[2856]: I0313 00:36:22.500184 2856 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 00:36:22.501784 kubelet[2856]: E0313 00:36:22.501773 2856 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 13 00:36:22.510641 kubelet[2856]: I0313 00:36:22.510606 2856 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.510811 kubelet[2856]: I0313 00:36:22.510700 2856 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.511377 kubelet[2856]: I0313 00:36:22.511330 2856 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.606094 kubelet[2856]: I0313 00:36:22.606018 2856 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609005 kubelet[2856]: I0313 00:36:22.608757 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-flexvolume-dir\") pod 
\"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609005 kubelet[2856]: I0313 00:36:22.608797 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609005 kubelet[2856]: I0313 00:36:22.608832 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609005 kubelet[2856]: I0313 00:36:22.608861 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609005 kubelet[2856]: I0313 00:36:22.608891 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e121a5d3ce0b74b3dd58765f391582ab-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-23cf6448d4\" (UID: \"e121a5d3ce0b74b3dd58765f391582ab\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609211 kubelet[2856]: I0313 00:36:22.608916 2856 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c6aa30fa97d853bf410d30775777110e-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" (UID: \"c6aa30fa97d853bf410d30775777110e\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609211 kubelet[2856]: I0313 00:36:22.608935 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/53388ea298f278325c4f4e63339e330a-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-23cf6448d4\" (UID: \"53388ea298f278325c4f4e63339e330a\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609211 kubelet[2856]: I0313 00:36:22.608951 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/53388ea298f278325c4f4e63339e330a-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-23cf6448d4\" (UID: \"53388ea298f278325c4f4e63339e330a\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.609211 kubelet[2856]: I0313 00:36:22.608967 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/53388ea298f278325c4f4e63339e330a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-23cf6448d4\" (UID: \"53388ea298f278325c4f4e63339e330a\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.616309 kubelet[2856]: I0313 00:36:22.616259 2856 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:22.616406 kubelet[2856]: I0313 00:36:22.616332 2856 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:23.373116 kubelet[2856]: I0313 00:36:23.373075 2856 apiserver.go:52] 
"Watching apiserver" Mar 13 00:36:23.407810 kubelet[2856]: I0313 00:36:23.407745 2856 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 13 00:36:23.433285 kubelet[2856]: I0313 00:36:23.433145 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" podStartSLOduration=1.433129522 podStartE2EDuration="1.433129522s" podCreationTimestamp="2026-03-13 00:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:36:23.432435954 +0000 UTC m=+1.311542834" watchObservedRunningTime="2026-03-13 00:36:23.433129522 +0000 UTC m=+1.312236400" Mar 13 00:36:23.439522 kubelet[2856]: I0313 00:36:23.439338 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" podStartSLOduration=1.439239066 podStartE2EDuration="1.439239066s" podCreationTimestamp="2026-03-13 00:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:36:23.438749524 +0000 UTC m=+1.317856401" watchObservedRunningTime="2026-03-13 00:36:23.439239066 +0000 UTC m=+1.318345935" Mar 13 00:36:23.457506 kubelet[2856]: I0313 00:36:23.457440 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-n-23cf6448d4" podStartSLOduration=1.457418168 podStartE2EDuration="1.457418168s" podCreationTimestamp="2026-03-13 00:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:36:23.446909139 +0000 UTC m=+1.326016016" watchObservedRunningTime="2026-03-13 00:36:23.457418168 +0000 UTC m=+1.336525079" Mar 13 00:36:23.474255 kubelet[2856]: I0313 00:36:23.473640 2856 kubelet.go:3309] 
"Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:23.474367 kubelet[2856]: I0313 00:36:23.474332 2856 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:23.480773 kubelet[2856]: E0313 00:36:23.480666 2856 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-23cf6448d4\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:23.481544 kubelet[2856]: E0313 00:36:23.481443 2856 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-23cf6448d4\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-23cf6448d4" Mar 13 00:36:28.106793 kubelet[2856]: I0313 00:36:28.106713 2856 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 13 00:36:28.107876 containerd[1639]: time="2026-03-13T00:36:28.107794010Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 13 00:36:28.108341 kubelet[2856]: I0313 00:36:28.108303 2856 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 13 00:36:28.877836 systemd[1]: Created slice kubepods-besteffort-pod8bd85026_ebcb_41c2_bf48_d899b642a97d.slice - libcontainer container kubepods-besteffort-pod8bd85026_ebcb_41c2_bf48_d899b642a97d.slice. 
Mar 13 00:36:28.950593 kubelet[2856]: I0313 00:36:28.950534 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8bd85026-ebcb-41c2-bf48-d899b642a97d-kube-proxy\") pod \"kube-proxy-96k79\" (UID: \"8bd85026-ebcb-41c2-bf48-d899b642a97d\") " pod="kube-system/kube-proxy-96k79" Mar 13 00:36:28.950593 kubelet[2856]: I0313 00:36:28.950588 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8bd85026-ebcb-41c2-bf48-d899b642a97d-xtables-lock\") pod \"kube-proxy-96k79\" (UID: \"8bd85026-ebcb-41c2-bf48-d899b642a97d\") " pod="kube-system/kube-proxy-96k79" Mar 13 00:36:28.950793 kubelet[2856]: I0313 00:36:28.950610 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8bd85026-ebcb-41c2-bf48-d899b642a97d-lib-modules\") pod \"kube-proxy-96k79\" (UID: \"8bd85026-ebcb-41c2-bf48-d899b642a97d\") " pod="kube-system/kube-proxy-96k79" Mar 13 00:36:28.950793 kubelet[2856]: I0313 00:36:28.950637 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfmlh\" (UniqueName: \"kubernetes.io/projected/8bd85026-ebcb-41c2-bf48-d899b642a97d-kube-api-access-pfmlh\") pod \"kube-proxy-96k79\" (UID: \"8bd85026-ebcb-41c2-bf48-d899b642a97d\") " pod="kube-system/kube-proxy-96k79" Mar 13 00:36:29.106488 systemd[1]: Created slice kubepods-besteffort-pode8121ae7_2f38_420f_be40_022a2ccf3f12.slice - libcontainer container kubepods-besteffort-pode8121ae7_2f38_420f_be40_022a2ccf3f12.slice. 
Mar 13 00:36:29.152243 kubelet[2856]: I0313 00:36:29.152091 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e8121ae7-2f38-420f-be40-022a2ccf3f12-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-bgcdh\" (UID: \"e8121ae7-2f38-420f-be40-022a2ccf3f12\") " pod="tigera-operator/tigera-operator-6bf85f8dd-bgcdh" Mar 13 00:36:29.152243 kubelet[2856]: I0313 00:36:29.152127 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrbm4\" (UniqueName: \"kubernetes.io/projected/e8121ae7-2f38-420f-be40-022a2ccf3f12-kube-api-access-jrbm4\") pod \"tigera-operator-6bf85f8dd-bgcdh\" (UID: \"e8121ae7-2f38-420f-be40-022a2ccf3f12\") " pod="tigera-operator/tigera-operator-6bf85f8dd-bgcdh" Mar 13 00:36:29.185326 containerd[1639]: time="2026-03-13T00:36:29.185212002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-96k79,Uid:8bd85026-ebcb-41c2-bf48-d899b642a97d,Namespace:kube-system,Attempt:0,}" Mar 13 00:36:29.213977 containerd[1639]: time="2026-03-13T00:36:29.213934628Z" level=info msg="connecting to shim b10811c44810f380e2fbeac01a0cb632d279befed05689bafcc772bb2a8708a7" address="unix:///run/containerd/s/5c18a29078b26c0b8ba780c8c2bb083d875e0e0be1f2c8af28ffbb79badafa31" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:36:29.238870 systemd[1]: Started cri-containerd-b10811c44810f380e2fbeac01a0cb632d279befed05689bafcc772bb2a8708a7.scope - libcontainer container b10811c44810f380e2fbeac01a0cb632d279befed05689bafcc772bb2a8708a7. 
Mar 13 00:36:29.269163 containerd[1639]: time="2026-03-13T00:36:29.269117031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-96k79,Uid:8bd85026-ebcb-41c2-bf48-d899b642a97d,Namespace:kube-system,Attempt:0,} returns sandbox id \"b10811c44810f380e2fbeac01a0cb632d279befed05689bafcc772bb2a8708a7\"" Mar 13 00:36:29.276588 containerd[1639]: time="2026-03-13T00:36:29.276476446Z" level=info msg="CreateContainer within sandbox \"b10811c44810f380e2fbeac01a0cb632d279befed05689bafcc772bb2a8708a7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 13 00:36:29.289243 containerd[1639]: time="2026-03-13T00:36:29.289200907Z" level=info msg="Container ee9af8e66bdf414b4a514d936f120b69d984724e4362dd8c54e49601eb204799: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:29.300151 containerd[1639]: time="2026-03-13T00:36:29.300102628Z" level=info msg="CreateContainer within sandbox \"b10811c44810f380e2fbeac01a0cb632d279befed05689bafcc772bb2a8708a7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ee9af8e66bdf414b4a514d936f120b69d984724e4362dd8c54e49601eb204799\"" Mar 13 00:36:29.300771 containerd[1639]: time="2026-03-13T00:36:29.300750329Z" level=info msg="StartContainer for \"ee9af8e66bdf414b4a514d936f120b69d984724e4362dd8c54e49601eb204799\"" Mar 13 00:36:29.302898 containerd[1639]: time="2026-03-13T00:36:29.302825742Z" level=info msg="connecting to shim ee9af8e66bdf414b4a514d936f120b69d984724e4362dd8c54e49601eb204799" address="unix:///run/containerd/s/5c18a29078b26c0b8ba780c8c2bb083d875e0e0be1f2c8af28ffbb79badafa31" protocol=ttrpc version=3 Mar 13 00:36:29.321791 systemd[1]: Started cri-containerd-ee9af8e66bdf414b4a514d936f120b69d984724e4362dd8c54e49601eb204799.scope - libcontainer container ee9af8e66bdf414b4a514d936f120b69d984724e4362dd8c54e49601eb204799. 
Mar 13 00:36:29.393157 containerd[1639]: time="2026-03-13T00:36:29.393041604Z" level=info msg="StartContainer for \"ee9af8e66bdf414b4a514d936f120b69d984724e4362dd8c54e49601eb204799\" returns successfully" Mar 13 00:36:29.410301 containerd[1639]: time="2026-03-13T00:36:29.409974441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-bgcdh,Uid:e8121ae7-2f38-420f-be40-022a2ccf3f12,Namespace:tigera-operator,Attempt:0,}" Mar 13 00:36:29.433494 containerd[1639]: time="2026-03-13T00:36:29.433407515Z" level=info msg="connecting to shim b004ad50701d8f52c4649d3047e5c72dff7b78e4981c18e6f39dc0282bf653a3" address="unix:///run/containerd/s/903172a71d70268301427c8b1b9b53c411d47b480d4a7275627dc5289cf1cf73" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:36:29.460701 systemd[1]: Started cri-containerd-b004ad50701d8f52c4649d3047e5c72dff7b78e4981c18e6f39dc0282bf653a3.scope - libcontainer container b004ad50701d8f52c4649d3047e5c72dff7b78e4981c18e6f39dc0282bf653a3. Mar 13 00:36:29.534156 containerd[1639]: time="2026-03-13T00:36:29.534104008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-bgcdh,Uid:e8121ae7-2f38-420f-be40-022a2ccf3f12,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"b004ad50701d8f52c4649d3047e5c72dff7b78e4981c18e6f39dc0282bf653a3\"" Mar 13 00:36:29.536133 containerd[1639]: time="2026-03-13T00:36:29.536100640Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 13 00:36:30.074208 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4100315266.mount: Deactivated successfully. Mar 13 00:36:31.249134 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount772745680.mount: Deactivated successfully. 
Mar 13 00:36:31.965642 containerd[1639]: time="2026-03-13T00:36:31.965608006Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:31.966780 containerd[1639]: time="2026-03-13T00:36:31.966758662Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 13 00:36:31.968183 containerd[1639]: time="2026-03-13T00:36:31.968151070Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:31.970457 containerd[1639]: time="2026-03-13T00:36:31.970378306Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:31.971126 containerd[1639]: time="2026-03-13T00:36:31.970770574Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.434632073s" Mar 13 00:36:31.971126 containerd[1639]: time="2026-03-13T00:36:31.970796139Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 13 00:36:31.976459 containerd[1639]: time="2026-03-13T00:36:31.976189376Z" level=info msg="CreateContainer within sandbox \"b004ad50701d8f52c4649d3047e5c72dff7b78e4981c18e6f39dc0282bf653a3\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 13 00:36:31.984921 containerd[1639]: time="2026-03-13T00:36:31.984900704Z" level=info msg="Container 
f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:31.987711 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount125872077.mount: Deactivated successfully. Mar 13 00:36:31.996222 containerd[1639]: time="2026-03-13T00:36:31.996192502Z" level=info msg="CreateContainer within sandbox \"b004ad50701d8f52c4649d3047e5c72dff7b78e4981c18e6f39dc0282bf653a3\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8\"" Mar 13 00:36:31.996783 containerd[1639]: time="2026-03-13T00:36:31.996765382Z" level=info msg="StartContainer for \"f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8\"" Mar 13 00:36:31.997364 containerd[1639]: time="2026-03-13T00:36:31.997343641Z" level=info msg="connecting to shim f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8" address="unix:///run/containerd/s/903172a71d70268301427c8b1b9b53c411d47b480d4a7275627dc5289cf1cf73" protocol=ttrpc version=3 Mar 13 00:36:32.025747 systemd[1]: Started cri-containerd-f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8.scope - libcontainer container f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8. 
Mar 13 00:36:32.054136 containerd[1639]: time="2026-03-13T00:36:32.054107693Z" level=info msg="StartContainer for \"f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8\" returns successfully" Mar 13 00:36:32.524831 kubelet[2856]: I0313 00:36:32.524509 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-96k79" podStartSLOduration=4.52313456 podStartE2EDuration="4.52313456s" podCreationTimestamp="2026-03-13 00:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:36:29.50105835 +0000 UTC m=+7.380165225" watchObservedRunningTime="2026-03-13 00:36:32.52313456 +0000 UTC m=+10.402241480" Mar 13 00:36:32.526486 kubelet[2856]: I0313 00:36:32.525976 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-bgcdh" podStartSLOduration=1.089534251 podStartE2EDuration="3.525916505s" podCreationTimestamp="2026-03-13 00:36:29 +0000 UTC" firstStartedPulling="2026-03-13 00:36:29.535463951 +0000 UTC m=+7.414570807" lastFinishedPulling="2026-03-13 00:36:31.971846205 +0000 UTC m=+9.850953061" observedRunningTime="2026-03-13 00:36:32.518959863 +0000 UTC m=+10.398066768" watchObservedRunningTime="2026-03-13 00:36:32.525916505 +0000 UTC m=+10.405023489" Mar 13 00:36:37.633180 sudo[1892]: pam_unix(sudo:session): session closed for user root Mar 13 00:36:37.729329 sshd[1891]: Connection closed by 4.153.228.146 port 47022 Mar 13 00:36:37.731045 sshd-session[1888]: pam_unix(sshd:session): session closed for user core Mar 13 00:36:37.735891 systemd[1]: sshd@8-10.0.1.99:22-4.153.228.146:47022.service: Deactivated successfully. Mar 13 00:36:37.738474 systemd[1]: session-9.scope: Deactivated successfully. Mar 13 00:36:37.739628 systemd[1]: session-9.scope: Consumed 4.834s CPU time, 236.7M memory peak. Mar 13 00:36:37.742102 systemd-logind[1612]: Session 9 logged out. 
Waiting for processes to exit. Mar 13 00:36:37.744875 systemd-logind[1612]: Removed session 9. Mar 13 00:36:39.679734 systemd[1]: Created slice kubepods-besteffort-pod06c8cb07_8df1_4187_9829_d6be12016332.slice - libcontainer container kubepods-besteffort-pod06c8cb07_8df1_4187_9829_d6be12016332.slice. Mar 13 00:36:39.726448 kubelet[2856]: I0313 00:36:39.726260 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/06c8cb07-8df1-4187-9829-d6be12016332-typha-certs\") pod \"calico-typha-769cd78c74-4knsp\" (UID: \"06c8cb07-8df1-4187-9829-d6be12016332\") " pod="calico-system/calico-typha-769cd78c74-4knsp" Mar 13 00:36:39.726448 kubelet[2856]: I0313 00:36:39.726310 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06c8cb07-8df1-4187-9829-d6be12016332-tigera-ca-bundle\") pod \"calico-typha-769cd78c74-4knsp\" (UID: \"06c8cb07-8df1-4187-9829-d6be12016332\") " pod="calico-system/calico-typha-769cd78c74-4knsp" Mar 13 00:36:39.726448 kubelet[2856]: I0313 00:36:39.726328 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4s9\" (UniqueName: \"kubernetes.io/projected/06c8cb07-8df1-4187-9829-d6be12016332-kube-api-access-bx4s9\") pod \"calico-typha-769cd78c74-4knsp\" (UID: \"06c8cb07-8df1-4187-9829-d6be12016332\") " pod="calico-system/calico-typha-769cd78c74-4knsp" Mar 13 00:36:39.744444 systemd[1]: Created slice kubepods-besteffort-pod8f31e2ea_71f8_4cd4_a2ab_16f303fe34a4.slice - libcontainer container kubepods-besteffort-pod8f31e2ea_71f8_4cd4_a2ab_16f303fe34a4.slice. 
Mar 13 00:36:39.828774 kubelet[2856]: I0313 00:36:39.828032 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-nodeproc\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.829070 kubelet[2856]: I0313 00:36:39.828963 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-tigera-ca-bundle\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.829070 kubelet[2856]: I0313 00:36:39.829027 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g995\" (UniqueName: \"kubernetes.io/projected/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-kube-api-access-4g995\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831191 kubelet[2856]: I0313 00:36:39.829843 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-cni-log-dir\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831191 kubelet[2856]: I0313 00:36:39.829890 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-bpffs\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831191 kubelet[2856]: I0313 00:36:39.830594 2856 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-var-run-calico\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831191 kubelet[2856]: I0313 00:36:39.830620 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-xtables-lock\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831191 kubelet[2856]: I0313 00:36:39.830635 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-policysync\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831391 kubelet[2856]: I0313 00:36:39.830676 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-cni-bin-dir\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831391 kubelet[2856]: I0313 00:36:39.830694 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-cni-net-dir\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831391 kubelet[2856]: I0313 00:36:39.830716 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-sys-fs\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831391 kubelet[2856]: I0313 00:36:39.830743 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-flexvol-driver-host\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831391 kubelet[2856]: I0313 00:36:39.830760 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-node-certs\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831507 kubelet[2856]: I0313 00:36:39.830774 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-var-lib-calico\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.831507 kubelet[2856]: I0313 00:36:39.830790 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4-lib-modules\") pod \"calico-node-5n78g\" (UID: \"8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4\") " pod="calico-system/calico-node-5n78g" Mar 13 00:36:39.858739 kubelet[2856]: E0313 00:36:39.858340 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:39.931307 kubelet[2856]: I0313 00:36:39.931224 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/129d677b-52bb-4b01-a36f-4d698e8848e6-varrun\") pod \"csi-node-driver-7z8mv\" (UID: \"129d677b-52bb-4b01-a36f-4d698e8848e6\") " pod="calico-system/csi-node-driver-7z8mv" Mar 13 00:36:39.931953 kubelet[2856]: I0313 00:36:39.931455 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/129d677b-52bb-4b01-a36f-4d698e8848e6-kubelet-dir\") pod \"csi-node-driver-7z8mv\" (UID: \"129d677b-52bb-4b01-a36f-4d698e8848e6\") " pod="calico-system/csi-node-driver-7z8mv" Mar 13 00:36:39.931953 kubelet[2856]: I0313 00:36:39.931506 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/129d677b-52bb-4b01-a36f-4d698e8848e6-socket-dir\") pod \"csi-node-driver-7z8mv\" (UID: \"129d677b-52bb-4b01-a36f-4d698e8848e6\") " pod="calico-system/csi-node-driver-7z8mv" Mar 13 00:36:39.931953 kubelet[2856]: I0313 00:36:39.931522 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm8nj\" (UniqueName: \"kubernetes.io/projected/129d677b-52bb-4b01-a36f-4d698e8848e6-kube-api-access-nm8nj\") pod \"csi-node-driver-7z8mv\" (UID: \"129d677b-52bb-4b01-a36f-4d698e8848e6\") " pod="calico-system/csi-node-driver-7z8mv" Mar 13 00:36:39.932126 kubelet[2856]: I0313 00:36:39.932110 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/129d677b-52bb-4b01-a36f-4d698e8848e6-registration-dir\") pod 
\"csi-node-driver-7z8mv\" (UID: \"129d677b-52bb-4b01-a36f-4d698e8848e6\") " pod="calico-system/csi-node-driver-7z8mv" Mar 13 00:36:39.933657 kubelet[2856]: E0313 00:36:39.933609 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:39.933657 kubelet[2856]: W0313 00:36:39.933627 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:39.933797 kubelet[2856]: E0313 00:36:39.933644 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:39.933988 kubelet[2856]: E0313 00:36:39.933972 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:39.934059 kubelet[2856]: W0313 00:36:39.934023 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:39.934059 kubelet[2856]: E0313 00:36:39.934033 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:39.934262 kubelet[2856]: E0313 00:36:39.934225 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:39.934262 kubelet[2856]: W0313 00:36:39.934232 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:39.934262 kubelet[2856]: E0313 00:36:39.934238 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:39.934518 kubelet[2856]: E0313 00:36:39.934468 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:39.934518 kubelet[2856]: W0313 00:36:39.934475 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:39.934518 kubelet[2856]: E0313 00:36:39.934482 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:39.934756 kubelet[2856]: E0313 00:36:39.934717 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:39.934756 kubelet[2856]: W0313 00:36:39.934723 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:39.934756 kubelet[2856]: E0313 00:36:39.934730 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:39.934975 kubelet[2856]: E0313 00:36:39.934933 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:39.934975 kubelet[2856]: W0313 00:36:39.934940 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:39.934975 kubelet[2856]: E0313 00:36:39.934947 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:39.935721 kubelet[2856]: E0313 00:36:39.935713 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:39.935839 kubelet[2856]: W0313 00:36:39.935790 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:39.935839 kubelet[2856]: E0313 00:36:39.935801 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:39.936057 kubelet[2856]: E0313 00:36:39.936016 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:39.936057 kubelet[2856]: W0313 00:36:39.936022 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:39.936057 kubelet[2856]: E0313 00:36:39.936029 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 13 00:36:39.969785 kubelet[2856]: E0313 00:36:39.969719 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:36:39.969785 kubelet[2856]: W0313 00:36:39.969736 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:36:39.969785 kubelet[2856]: E0313 00:36:39.969753 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:36:39.983344 containerd[1639]: time="2026-03-13T00:36:39.983299306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-769cd78c74-4knsp,Uid:06c8cb07-8df1-4187-9829-d6be12016332,Namespace:calico-system,Attempt:0,}"
Mar 13 00:36:40.014792 containerd[1639]: time="2026-03-13T00:36:40.014707737Z" level=info msg="connecting to shim a88f013756efdd7c73a82f74bf9d4b0fbf0fce9d8cb9da6316d2b57053b834b4" address="unix:///run/containerd/s/8a865d850b29fe79fa500ebd85438a555be9051b1b6c8efa7115bbf1b19576fb" namespace=k8s.io protocol=ttrpc version=3
Mar 13 00:36:40.034608 kubelet[2856]: E0313 00:36:40.034585 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:36:40.034608 kubelet[2856]: W0313 00:36:40.034602 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:36:40.034808 kubelet[2856]: E0313 00:36:40.034620 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 13 00:36:40.036557 kubelet[2856]: E0313 00:36:40.036538 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:36:40.036557 kubelet[2856]: W0313 00:36:40.036544 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:36:40.036557 kubelet[2856]: E0313 00:36:40.036550 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:36:40.036759 systemd[1]: Started cri-containerd-a88f013756efdd7c73a82f74bf9d4b0fbf0fce9d8cb9da6316d2b57053b834b4.scope - libcontainer container a88f013756efdd7c73a82f74bf9d4b0fbf0fce9d8cb9da6316d2b57053b834b4.
Mar 13 00:36:40.037813 kubelet[2856]: E0313 00:36:40.037689 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:36:40.037813 kubelet[2856]: W0313 00:36:40.037699 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:36:40.037813 kubelet[2856]: E0313 00:36:40.037708 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 13 00:36:40.041230 kubelet[2856]: E0313 00:36:40.041224 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:40.041295 kubelet[2856]: W0313 00:36:40.041259 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:40.041295 kubelet[2856]: E0313 00:36:40.041268 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:40.041506 kubelet[2856]: E0313 00:36:40.041494 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:40.041531 kubelet[2856]: W0313 00:36:40.041506 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:40.041531 kubelet[2856]: E0313 00:36:40.041516 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:40.048115 containerd[1639]: time="2026-03-13T00:36:40.048077477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5n78g,Uid:8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4,Namespace:calico-system,Attempt:0,}" Mar 13 00:36:40.051551 kubelet[2856]: E0313 00:36:40.051464 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:40.051551 kubelet[2856]: W0313 00:36:40.051491 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:40.051551 kubelet[2856]: E0313 00:36:40.051504 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:40.073869 containerd[1639]: time="2026-03-13T00:36:40.073833444Z" level=info msg="connecting to shim cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54" address="unix:///run/containerd/s/11587104b787309f31d7f0f14f16784c0047e26f0c7fa0af637ad6acdc1a0c7b" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:36:40.094830 containerd[1639]: time="2026-03-13T00:36:40.094640296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-769cd78c74-4knsp,Uid:06c8cb07-8df1-4187-9829-d6be12016332,Namespace:calico-system,Attempt:0,} returns sandbox id \"a88f013756efdd7c73a82f74bf9d4b0fbf0fce9d8cb9da6316d2b57053b834b4\"" Mar 13 00:36:40.098146 containerd[1639]: time="2026-03-13T00:36:40.098087267Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 13 00:36:40.103723 systemd[1]: Started cri-containerd-cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54.scope - libcontainer container cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54. 
Mar 13 00:36:40.129222 containerd[1639]: time="2026-03-13T00:36:40.129189953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5n78g,Uid:8f31e2ea-71f8-4cd4-a2ab-16f303fe34a4,Namespace:calico-system,Attempt:0,} returns sandbox id \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\"" Mar 13 00:36:41.394520 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2982261719.mount: Deactivated successfully. Mar 13 00:36:41.409186 kubelet[2856]: E0313 00:36:41.409152 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:41.903600 containerd[1639]: time="2026-03-13T00:36:41.903532934Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:41.904782 containerd[1639]: time="2026-03-13T00:36:41.904702663Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 13 00:36:41.906442 containerd[1639]: time="2026-03-13T00:36:41.906422403Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:41.908807 containerd[1639]: time="2026-03-13T00:36:41.908784214Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:41.910322 containerd[1639]: time="2026-03-13T00:36:41.909941590Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id 
\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 1.81150162s" Mar 13 00:36:41.910322 containerd[1639]: time="2026-03-13T00:36:41.909965310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 13 00:36:41.911993 containerd[1639]: time="2026-03-13T00:36:41.911838277Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 13 00:36:41.925122 containerd[1639]: time="2026-03-13T00:36:41.925094533Z" level=info msg="CreateContainer within sandbox \"a88f013756efdd7c73a82f74bf9d4b0fbf0fce9d8cb9da6316d2b57053b834b4\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 13 00:36:41.952834 containerd[1639]: time="2026-03-13T00:36:41.952799602Z" level=info msg="Container dba798a8ac2ed04c209c4ca8eb816a05403165fa596e199df817049f671204cd: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:41.957712 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3741551791.mount: Deactivated successfully. 
Mar 13 00:36:42.199479 containerd[1639]: time="2026-03-13T00:36:42.199284302Z" level=info msg="CreateContainer within sandbox \"a88f013756efdd7c73a82f74bf9d4b0fbf0fce9d8cb9da6316d2b57053b834b4\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"dba798a8ac2ed04c209c4ca8eb816a05403165fa596e199df817049f671204cd\"" Mar 13 00:36:42.200611 containerd[1639]: time="2026-03-13T00:36:42.200413564Z" level=info msg="StartContainer for \"dba798a8ac2ed04c209c4ca8eb816a05403165fa596e199df817049f671204cd\"" Mar 13 00:36:42.202259 containerd[1639]: time="2026-03-13T00:36:42.202229551Z" level=info msg="connecting to shim dba798a8ac2ed04c209c4ca8eb816a05403165fa596e199df817049f671204cd" address="unix:///run/containerd/s/8a865d850b29fe79fa500ebd85438a555be9051b1b6c8efa7115bbf1b19576fb" protocol=ttrpc version=3 Mar 13 00:36:42.230829 systemd[1]: Started cri-containerd-dba798a8ac2ed04c209c4ca8eb816a05403165fa596e199df817049f671204cd.scope - libcontainer container dba798a8ac2ed04c209c4ca8eb816a05403165fa596e199df817049f671204cd. Mar 13 00:36:42.288974 containerd[1639]: time="2026-03-13T00:36:42.288895404Z" level=info msg="StartContainer for \"dba798a8ac2ed04c209c4ca8eb816a05403165fa596e199df817049f671204cd\" returns successfully" Mar 13 00:36:42.529521 kubelet[2856]: E0313 00:36:42.529428 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.529521 kubelet[2856]: W0313 00:36:42.529449 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.529521 kubelet[2856]: E0313 00:36:42.529465 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.530091 kubelet[2856]: E0313 00:36:42.529934 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.530091 kubelet[2856]: W0313 00:36:42.529943 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.530091 kubelet[2856]: E0313 00:36:42.529952 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.530163 kubelet[2856]: E0313 00:36:42.530127 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.530163 kubelet[2856]: W0313 00:36:42.530133 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.530163 kubelet[2856]: E0313 00:36:42.530140 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.530419 kubelet[2856]: E0313 00:36:42.530408 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.530419 kubelet[2856]: W0313 00:36:42.530417 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.530473 kubelet[2856]: E0313 00:36:42.530425 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.530735 kubelet[2856]: E0313 00:36:42.530725 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.530735 kubelet[2856]: W0313 00:36:42.530734 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.530792 kubelet[2856]: E0313 00:36:42.530742 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.530909 kubelet[2856]: E0313 00:36:42.530901 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.530935 kubelet[2856]: W0313 00:36:42.530909 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.530935 kubelet[2856]: E0313 00:36:42.530915 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.531154 kubelet[2856]: E0313 00:36:42.531146 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.531180 kubelet[2856]: W0313 00:36:42.531154 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.531180 kubelet[2856]: E0313 00:36:42.531161 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.531589 kubelet[2856]: E0313 00:36:42.531526 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.531589 kubelet[2856]: W0313 00:36:42.531534 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.531589 kubelet[2856]: E0313 00:36:42.531543 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.531941 kubelet[2856]: E0313 00:36:42.531881 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.531941 kubelet[2856]: W0313 00:36:42.531888 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.531941 kubelet[2856]: E0313 00:36:42.531895 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.532187 kubelet[2856]: E0313 00:36:42.532177 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.532213 kubelet[2856]: W0313 00:36:42.532186 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.532213 kubelet[2856]: E0313 00:36:42.532194 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.532440 kubelet[2856]: E0313 00:36:42.532430 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.532440 kubelet[2856]: W0313 00:36:42.532439 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.532491 kubelet[2856]: E0313 00:36:42.532446 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.532603 kubelet[2856]: E0313 00:36:42.532594 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.532603 kubelet[2856]: W0313 00:36:42.532602 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.532656 kubelet[2856]: E0313 00:36:42.532608 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.532728 kubelet[2856]: E0313 00:36:42.532723 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.532752 kubelet[2856]: W0313 00:36:42.532728 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.532752 kubelet[2856]: E0313 00:36:42.532734 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.532848 kubelet[2856]: E0313 00:36:42.532840 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.532848 kubelet[2856]: W0313 00:36:42.532847 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.532893 kubelet[2856]: E0313 00:36:42.532853 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.532960 kubelet[2856]: E0313 00:36:42.532957 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.532984 kubelet[2856]: W0313 00:36:42.532962 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.532984 kubelet[2856]: E0313 00:36:42.532968 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.537298 kubelet[2856]: I0313 00:36:42.536727 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-769cd78c74-4knsp" podStartSLOduration=1.723187145 podStartE2EDuration="3.536715435s" podCreationTimestamp="2026-03-13 00:36:39 +0000 UTC" firstStartedPulling="2026-03-13 00:36:40.097102727 +0000 UTC m=+17.976209583" lastFinishedPulling="2026-03-13 00:36:41.910631017 +0000 UTC m=+19.789737873" observedRunningTime="2026-03-13 00:36:42.535496571 +0000 UTC m=+20.414603428" watchObservedRunningTime="2026-03-13 00:36:42.536715435 +0000 UTC m=+20.415822312" Mar 13 00:36:42.556918 kubelet[2856]: E0313 00:36:42.556894 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.557056 kubelet[2856]: W0313 00:36:42.557043 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.557177 kubelet[2856]: E0313 00:36:42.557100 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.557284 kubelet[2856]: E0313 00:36:42.557278 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.557321 kubelet[2856]: W0313 00:36:42.557316 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.557414 kubelet[2856]: E0313 00:36:42.557351 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.557507 kubelet[2856]: E0313 00:36:42.557502 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.557548 kubelet[2856]: W0313 00:36:42.557542 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.557671 kubelet[2856]: E0313 00:36:42.557597 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.557762 kubelet[2856]: E0313 00:36:42.557757 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.557795 kubelet[2856]: W0313 00:36:42.557789 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.557869 kubelet[2856]: E0313 00:36:42.557821 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.557963 kubelet[2856]: E0313 00:36:42.557958 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.558046 kubelet[2856]: W0313 00:36:42.557995 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.558046 kubelet[2856]: E0313 00:36:42.558003 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.558232 kubelet[2856]: E0313 00:36:42.558150 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.558232 kubelet[2856]: W0313 00:36:42.558156 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.558232 kubelet[2856]: E0313 00:36:42.558163 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.558330 kubelet[2856]: E0313 00:36:42.558324 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.558371 kubelet[2856]: W0313 00:36:42.558365 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.558620 kubelet[2856]: E0313 00:36:42.558400 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.558802 kubelet[2856]: E0313 00:36:42.558793 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.558850 kubelet[2856]: W0313 00:36:42.558843 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.558886 kubelet[2856]: E0313 00:36:42.558880 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.559289 kubelet[2856]: E0313 00:36:42.559279 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.559415 kubelet[2856]: W0313 00:36:42.559340 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.559415 kubelet[2856]: E0313 00:36:42.559351 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.559995 kubelet[2856]: E0313 00:36:42.559904 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.559995 kubelet[2856]: W0313 00:36:42.559913 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.559995 kubelet[2856]: E0313 00:36:42.559922 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.560308 kubelet[2856]: E0313 00:36:42.560215 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.560308 kubelet[2856]: W0313 00:36:42.560222 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.560308 kubelet[2856]: E0313 00:36:42.560229 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.560997 kubelet[2856]: E0313 00:36:42.560745 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.560997 kubelet[2856]: W0313 00:36:42.560753 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.560997 kubelet[2856]: E0313 00:36:42.560761 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.561333 kubelet[2856]: E0313 00:36:42.561325 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.561854 kubelet[2856]: W0313 00:36:42.561392 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.561854 kubelet[2856]: E0313 00:36:42.561402 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.562134 kubelet[2856]: E0313 00:36:42.562050 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.562134 kubelet[2856]: W0313 00:36:42.562059 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.562134 kubelet[2856]: E0313 00:36:42.562067 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.562311 kubelet[2856]: E0313 00:36:42.562240 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.562311 kubelet[2856]: W0313 00:36:42.562246 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.562311 kubelet[2856]: E0313 00:36:42.562253 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.562452 kubelet[2856]: E0313 00:36:42.562447 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.562529 kubelet[2856]: W0313 00:36:42.562479 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.562529 kubelet[2856]: E0313 00:36:42.562487 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:42.563962 kubelet[2856]: E0313 00:36:42.563596 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.563962 kubelet[2856]: W0313 00:36:42.563607 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.563962 kubelet[2856]: E0313 00:36:42.563616 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:36:42.565739 kubelet[2856]: E0313 00:36:42.565686 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:36:42.565739 kubelet[2856]: W0313 00:36:42.565698 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:36:42.565739 kubelet[2856]: E0313 00:36:42.565710 2856 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:36:43.236988 containerd[1639]: time="2026-03-13T00:36:43.236920259Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:43.238844 containerd[1639]: time="2026-03-13T00:36:43.238717112Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 13 00:36:43.239975 containerd[1639]: time="2026-03-13T00:36:43.239959178Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:43.242268 containerd[1639]: time="2026-03-13T00:36:43.242246820Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:43.242805 containerd[1639]: time="2026-03-13T00:36:43.242782964Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.330916601s" Mar 13 00:36:43.242874 containerd[1639]: time="2026-03-13T00:36:43.242863505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 13 00:36:43.247830 containerd[1639]: time="2026-03-13T00:36:43.247803898Z" level=info msg="CreateContainer within sandbox \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 13 00:36:43.259591 containerd[1639]: time="2026-03-13T00:36:43.258522638Z" level=info msg="Container a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:43.274048 containerd[1639]: time="2026-03-13T00:36:43.274004696Z" level=info msg="CreateContainer within sandbox \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35\"" Mar 13 00:36:43.276109 containerd[1639]: time="2026-03-13T00:36:43.276089946Z" level=info msg="StartContainer for \"a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35\"" Mar 13 00:36:43.277390 containerd[1639]: time="2026-03-13T00:36:43.277368411Z" level=info msg="connecting to shim a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35" address="unix:///run/containerd/s/11587104b787309f31d7f0f14f16784c0047e26f0c7fa0af637ad6acdc1a0c7b" protocol=ttrpc version=3 Mar 13 00:36:43.300731 systemd[1]: Started cri-containerd-a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35.scope - libcontainer container a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35. Mar 13 00:36:43.365900 containerd[1639]: time="2026-03-13T00:36:43.365815626Z" level=info msg="StartContainer for \"a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35\" returns successfully" Mar 13 00:36:43.373000 systemd[1]: cri-containerd-a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35.scope: Deactivated successfully. 
Mar 13 00:36:43.376866 containerd[1639]: time="2026-03-13T00:36:43.376668742Z" level=info msg="received container exit event container_id:\"a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35\" id:\"a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35\" pid:3515 exited_at:{seconds:1773362203 nanos:376348018}" Mar 13 00:36:43.396058 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a2fb896d5f920aeb1f07e9c556fc4e30208a577c14f3798fefe57874ca719b35-rootfs.mount: Deactivated successfully. Mar 13 00:36:43.410093 kubelet[2856]: E0313 00:36:43.409107 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:43.523412 kubelet[2856]: I0313 00:36:43.523320 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:36:45.409107 kubelet[2856]: E0313 00:36:45.409067 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:45.540324 containerd[1639]: time="2026-03-13T00:36:45.539687359Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 13 00:36:47.409343 kubelet[2856]: E0313 00:36:47.409305 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:49.409711 kubelet[2856]: 
E0313 00:36:49.409661 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:49.581970 kubelet[2856]: I0313 00:36:49.581943 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:36:50.769127 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount312868238.mount: Deactivated successfully. Mar 13 00:36:50.799356 containerd[1639]: time="2026-03-13T00:36:50.799320135Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:50.800428 containerd[1639]: time="2026-03-13T00:36:50.800412762Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 13 00:36:50.801781 containerd[1639]: time="2026-03-13T00:36:50.801732267Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:50.803751 containerd[1639]: time="2026-03-13T00:36:50.803719337Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:50.804446 containerd[1639]: time="2026-03-13T00:36:50.804131220Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 5.264382535s" 
Mar 13 00:36:50.804446 containerd[1639]: time="2026-03-13T00:36:50.804158917Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 13 00:36:50.809084 containerd[1639]: time="2026-03-13T00:36:50.809053493Z" level=info msg="CreateContainer within sandbox \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 13 00:36:50.819686 containerd[1639]: time="2026-03-13T00:36:50.819663191Z" level=info msg="Container 5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:51.409592 kubelet[2856]: E0313 00:36:51.409494 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:51.462209 containerd[1639]: time="2026-03-13T00:36:51.462156782Z" level=info msg="CreateContainer within sandbox \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953\"" Mar 13 00:36:51.463184 containerd[1639]: time="2026-03-13T00:36:51.462932053Z" level=info msg="StartContainer for \"5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953\"" Mar 13 00:36:51.465849 containerd[1639]: time="2026-03-13T00:36:51.465826046Z" level=info msg="connecting to shim 5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953" address="unix:///run/containerd/s/11587104b787309f31d7f0f14f16784c0047e26f0c7fa0af637ad6acdc1a0c7b" protocol=ttrpc version=3 Mar 13 00:36:51.490731 systemd[1]: Started 
cri-containerd-5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953.scope - libcontainer container 5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953. Mar 13 00:36:51.554076 containerd[1639]: time="2026-03-13T00:36:51.554005586Z" level=info msg="StartContainer for \"5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953\" returns successfully" Mar 13 00:36:51.607829 systemd[1]: cri-containerd-5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953.scope: Deactivated successfully. Mar 13 00:36:51.609673 containerd[1639]: time="2026-03-13T00:36:51.609648498Z" level=info msg="received container exit event container_id:\"5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953\" id:\"5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953\" pid:3574 exited_at:{seconds:1773362211 nanos:608904860}" Mar 13 00:36:51.769668 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5219d63e8c29d2c200924701108c5977b67f1a4e92a86d4432aca0dc910ed953-rootfs.mount: Deactivated successfully. 
Mar 13 00:36:53.409372 kubelet[2856]: E0313 00:36:53.409288 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:53.568837 containerd[1639]: time="2026-03-13T00:36:53.568784505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 13 00:36:55.409871 kubelet[2856]: E0313 00:36:55.409562 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:56.197127 containerd[1639]: time="2026-03-13T00:36:56.197081580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:56.198345 containerd[1639]: time="2026-03-13T00:36:56.198218232Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 13 00:36:56.199701 containerd[1639]: time="2026-03-13T00:36:56.199681767Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:56.202934 containerd[1639]: time="2026-03-13T00:36:56.202909018Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:36:56.203515 containerd[1639]: time="2026-03-13T00:36:56.203493004Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" 
with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 2.63466595s" Mar 13 00:36:56.203596 containerd[1639]: time="2026-03-13T00:36:56.203585774Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 13 00:36:56.208903 containerd[1639]: time="2026-03-13T00:36:56.208858983Z" level=info msg="CreateContainer within sandbox \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 13 00:36:56.220708 containerd[1639]: time="2026-03-13T00:36:56.220680257Z" level=info msg="Container 063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:36:56.223009 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1348444298.mount: Deactivated successfully. 
Mar 13 00:36:56.235293 containerd[1639]: time="2026-03-13T00:36:56.235248974Z" level=info msg="CreateContainer within sandbox \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47\"" Mar 13 00:36:56.236700 containerd[1639]: time="2026-03-13T00:36:56.236678334Z" level=info msg="StartContainer for \"063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47\"" Mar 13 00:36:56.237968 containerd[1639]: time="2026-03-13T00:36:56.237948035Z" level=info msg="connecting to shim 063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47" address="unix:///run/containerd/s/11587104b787309f31d7f0f14f16784c0047e26f0c7fa0af637ad6acdc1a0c7b" protocol=ttrpc version=3 Mar 13 00:36:56.256701 systemd[1]: Started cri-containerd-063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47.scope - libcontainer container 063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47. 
Mar 13 00:36:56.324081 containerd[1639]: time="2026-03-13T00:36:56.324050848Z" level=info msg="StartContainer for \"063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47\" returns successfully" Mar 13 00:36:57.409810 kubelet[2856]: E0313 00:36:57.409709 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:58.931624 containerd[1639]: time="2026-03-13T00:36:58.931584874Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 13 00:36:58.933831 systemd[1]: cri-containerd-063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47.scope: Deactivated successfully. Mar 13 00:36:58.934056 systemd[1]: cri-containerd-063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47.scope: Consumed 489ms CPU time, 186.2M memory peak, 1.1M read from disk, 177M written to disk. Mar 13 00:36:58.935438 containerd[1639]: time="2026-03-13T00:36:58.935418686Z" level=info msg="received container exit event container_id:\"063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47\" id:\"063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47\" pid:3635 exited_at:{seconds:1773362218 nanos:935201035}" Mar 13 00:36:58.988089 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-063a31245143886432d361e2c8922017a0fb7eb57ce11b9ee12c7fc0ef91fb47-rootfs.mount: Deactivated successfully. 
Mar 13 00:36:59.016189 kubelet[2856]: I0313 00:36:59.016168 2856 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 13 00:36:59.217914 systemd[1]: Created slice kubepods-burstable-pode3354a6d_6ea0_4ffe_b3e8_3e508fd79239.slice - libcontainer container kubepods-burstable-pode3354a6d_6ea0_4ffe_b3e8_3e508fd79239.slice. Mar 13 00:36:59.242505 systemd[1]: Created slice kubepods-besteffort-podcc4c988f_bdbe_466c_886b_517505203e3b.slice - libcontainer container kubepods-besteffort-podcc4c988f_bdbe_466c_886b_517505203e3b.slice. Mar 13 00:36:59.255325 systemd[1]: Created slice kubepods-besteffort-pod95fefb4b_b753_41cf_8c25_ddb2b4541f4b.slice - libcontainer container kubepods-besteffort-pod95fefb4b_b753_41cf_8c25_ddb2b4541f4b.slice. Mar 13 00:36:59.262608 systemd[1]: Created slice kubepods-besteffort-pod3a859d32_64de_4fa7_bc30_293dcef00461.slice - libcontainer container kubepods-besteffort-pod3a859d32_64de_4fa7_bc30_293dcef00461.slice. Mar 13 00:36:59.271088 systemd[1]: Created slice kubepods-besteffort-pod7218a6dc_e47d_4c9f_9b07_d8349ede2479.slice - libcontainer container kubepods-besteffort-pod7218a6dc_e47d_4c9f_9b07_d8349ede2479.slice. Mar 13 00:36:59.278013 systemd[1]: Created slice kubepods-besteffort-podba1a71bc_658c_494d_92f9_f4db1f8ea894.slice - libcontainer container kubepods-besteffort-podba1a71bc_658c_494d_92f9_f4db1f8ea894.slice. 
Mar 13 00:36:59.283368 kubelet[2856]: I0313 00:36:59.283305 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/7218a6dc-e47d-4c9f-9b07-d8349ede2479-goldmane-key-pair\") pod \"goldmane-5b85766d88-w6hs4\" (UID: \"7218a6dc-e47d-4c9f-9b07-d8349ede2479\") " pod="calico-system/goldmane-5b85766d88-w6hs4" Mar 13 00:36:59.284116 kubelet[2856]: I0313 00:36:59.283469 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218a6dc-e47d-4c9f-9b07-d8349ede2479-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-w6hs4\" (UID: \"7218a6dc-e47d-4c9f-9b07-d8349ede2479\") " pod="calico-system/goldmane-5b85766d88-w6hs4" Mar 13 00:36:59.284838 kubelet[2856]: I0313 00:36:59.284794 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a859d32-64de-4fa7-bc30-293dcef00461-tigera-ca-bundle\") pod \"calico-kube-controllers-697dc6db9-lsdmd\" (UID: \"3a859d32-64de-4fa7-bc30-293dcef00461\") " pod="calico-system/calico-kube-controllers-697dc6db9-lsdmd" Mar 13 00:36:59.286010 kubelet[2856]: I0313 00:36:59.284996 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbcft\" (UniqueName: \"kubernetes.io/projected/3a859d32-64de-4fa7-bc30-293dcef00461-kube-api-access-nbcft\") pod \"calico-kube-controllers-697dc6db9-lsdmd\" (UID: \"3a859d32-64de-4fa7-bc30-293dcef00461\") " pod="calico-system/calico-kube-controllers-697dc6db9-lsdmd" Mar 13 00:36:59.286010 kubelet[2856]: I0313 00:36:59.285013 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10c18fc6-35a9-4a29-b0ed-122436522760-config-volume\") pod 
\"coredns-674b8bbfcf-4hngh\" (UID: \"10c18fc6-35a9-4a29-b0ed-122436522760\") " pod="kube-system/coredns-674b8bbfcf-4hngh" Mar 13 00:36:59.286010 kubelet[2856]: I0313 00:36:59.285029 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/cc4c988f-bdbe-466c-886b-517505203e3b-nginx-config\") pod \"whisker-f874f7b7c-7zt6p\" (UID: \"cc4c988f-bdbe-466c-886b-517505203e3b\") " pod="calico-system/whisker-f874f7b7c-7zt6p" Mar 13 00:36:59.286010 kubelet[2856]: I0313 00:36:59.285043 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4lfq\" (UniqueName: \"kubernetes.io/projected/10c18fc6-35a9-4a29-b0ed-122436522760-kube-api-access-h4lfq\") pod \"coredns-674b8bbfcf-4hngh\" (UID: \"10c18fc6-35a9-4a29-b0ed-122436522760\") " pod="kube-system/coredns-674b8bbfcf-4hngh" Mar 13 00:36:59.286010 kubelet[2856]: I0313 00:36:59.285056 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ba1a71bc-658c-494d-92f9-f4db1f8ea894-calico-apiserver-certs\") pod \"calico-apiserver-658d87f884-8zfwm\" (UID: \"ba1a71bc-658c-494d-92f9-f4db1f8ea894\") " pod="calico-system/calico-apiserver-658d87f884-8zfwm" Mar 13 00:36:59.285761 systemd[1]: Created slice kubepods-burstable-pod10c18fc6_35a9_4a29_b0ed_122436522760.slice - libcontainer container kubepods-burstable-pod10c18fc6_35a9_4a29_b0ed_122436522760.slice. 
Mar 13 00:36:59.286233 kubelet[2856]: I0313 00:36:59.285070 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcb2n\" (UniqueName: \"kubernetes.io/projected/ba1a71bc-658c-494d-92f9-f4db1f8ea894-kube-api-access-dcb2n\") pod \"calico-apiserver-658d87f884-8zfwm\" (UID: \"ba1a71bc-658c-494d-92f9-f4db1f8ea894\") " pod="calico-system/calico-apiserver-658d87f884-8zfwm" Mar 13 00:36:59.286233 kubelet[2856]: I0313 00:36:59.285082 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3354a6d-6ea0-4ffe-b3e8-3e508fd79239-config-volume\") pod \"coredns-674b8bbfcf-mqhpg\" (UID: \"e3354a6d-6ea0-4ffe-b3e8-3e508fd79239\") " pod="kube-system/coredns-674b8bbfcf-mqhpg" Mar 13 00:36:59.286233 kubelet[2856]: I0313 00:36:59.285101 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cc4c988f-bdbe-466c-886b-517505203e3b-whisker-backend-key-pair\") pod \"whisker-f874f7b7c-7zt6p\" (UID: \"cc4c988f-bdbe-466c-886b-517505203e3b\") " pod="calico-system/whisker-f874f7b7c-7zt6p" Mar 13 00:36:59.286233 kubelet[2856]: I0313 00:36:59.285116 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8flx\" (UniqueName: \"kubernetes.io/projected/cc4c988f-bdbe-466c-886b-517505203e3b-kube-api-access-d8flx\") pod \"whisker-f874f7b7c-7zt6p\" (UID: \"cc4c988f-bdbe-466c-886b-517505203e3b\") " pod="calico-system/whisker-f874f7b7c-7zt6p" Mar 13 00:36:59.286233 kubelet[2856]: I0313 00:36:59.285130 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc4c988f-bdbe-466c-886b-517505203e3b-whisker-ca-bundle\") pod \"whisker-f874f7b7c-7zt6p\" (UID: 
\"cc4c988f-bdbe-466c-886b-517505203e3b\") " pod="calico-system/whisker-f874f7b7c-7zt6p" Mar 13 00:36:59.286356 kubelet[2856]: I0313 00:36:59.285143 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7218a6dc-e47d-4c9f-9b07-d8349ede2479-config\") pod \"goldmane-5b85766d88-w6hs4\" (UID: \"7218a6dc-e47d-4c9f-9b07-d8349ede2479\") " pod="calico-system/goldmane-5b85766d88-w6hs4" Mar 13 00:36:59.286356 kubelet[2856]: I0313 00:36:59.285159 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jf92\" (UniqueName: \"kubernetes.io/projected/7218a6dc-e47d-4c9f-9b07-d8349ede2479-kube-api-access-6jf92\") pod \"goldmane-5b85766d88-w6hs4\" (UID: \"7218a6dc-e47d-4c9f-9b07-d8349ede2479\") " pod="calico-system/goldmane-5b85766d88-w6hs4" Mar 13 00:36:59.286356 kubelet[2856]: I0313 00:36:59.285173 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/95fefb4b-b753-41cf-8c25-ddb2b4541f4b-calico-apiserver-certs\") pod \"calico-apiserver-658d87f884-kc54z\" (UID: \"95fefb4b-b753-41cf-8c25-ddb2b4541f4b\") " pod="calico-system/calico-apiserver-658d87f884-kc54z" Mar 13 00:36:59.286356 kubelet[2856]: I0313 00:36:59.285188 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2dlt\" (UniqueName: \"kubernetes.io/projected/e3354a6d-6ea0-4ffe-b3e8-3e508fd79239-kube-api-access-k2dlt\") pod \"coredns-674b8bbfcf-mqhpg\" (UID: \"e3354a6d-6ea0-4ffe-b3e8-3e508fd79239\") " pod="kube-system/coredns-674b8bbfcf-mqhpg" Mar 13 00:36:59.286356 kubelet[2856]: I0313 00:36:59.285203 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzmqb\" (UniqueName: 
\"kubernetes.io/projected/95fefb4b-b753-41cf-8c25-ddb2b4541f4b-kube-api-access-gzmqb\") pod \"calico-apiserver-658d87f884-kc54z\" (UID: \"95fefb4b-b753-41cf-8c25-ddb2b4541f4b\") " pod="calico-system/calico-apiserver-658d87f884-kc54z" Mar 13 00:36:59.438191 systemd[1]: Created slice kubepods-besteffort-pod129d677b_52bb_4b01_a36f_4d698e8848e6.slice - libcontainer container kubepods-besteffort-pod129d677b_52bb_4b01_a36f_4d698e8848e6.slice. Mar 13 00:36:59.442823 containerd[1639]: time="2026-03-13T00:36:59.442794870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7z8mv,Uid:129d677b-52bb-4b01-a36f-4d698e8848e6,Namespace:calico-system,Attempt:0,}" Mar 13 00:36:59.532599 containerd[1639]: time="2026-03-13T00:36:59.532488090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mqhpg,Uid:e3354a6d-6ea0-4ffe-b3e8-3e508fd79239,Namespace:kube-system,Attempt:0,}" Mar 13 00:36:59.554635 containerd[1639]: time="2026-03-13T00:36:59.554590052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f874f7b7c-7zt6p,Uid:cc4c988f-bdbe-466c-886b-517505203e3b,Namespace:calico-system,Attempt:0,}" Mar 13 00:36:59.561512 containerd[1639]: time="2026-03-13T00:36:59.561477083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658d87f884-kc54z,Uid:95fefb4b-b753-41cf-8c25-ddb2b4541f4b,Namespace:calico-system,Attempt:0,}" Mar 13 00:36:59.568649 containerd[1639]: time="2026-03-13T00:36:59.568622069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697dc6db9-lsdmd,Uid:3a859d32-64de-4fa7-bc30-293dcef00461,Namespace:calico-system,Attempt:0,}" Mar 13 00:36:59.575724 containerd[1639]: time="2026-03-13T00:36:59.575695692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-w6hs4,Uid:7218a6dc-e47d-4c9f-9b07-d8349ede2479,Namespace:calico-system,Attempt:0,}" Mar 13 00:36:59.585248 containerd[1639]: time="2026-03-13T00:36:59.585220096Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658d87f884-8zfwm,Uid:ba1a71bc-658c-494d-92f9-f4db1f8ea894,Namespace:calico-system,Attempt:0,}" Mar 13 00:36:59.593395 containerd[1639]: time="2026-03-13T00:36:59.593350369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4hngh,Uid:10c18fc6-35a9-4a29-b0ed-122436522760,Namespace:kube-system,Attempt:0,}" Mar 13 00:36:59.643199 containerd[1639]: time="2026-03-13T00:36:59.643146286Z" level=error msg="Failed to destroy network for sandbox \"dc34f6922ac4753f1267cdc81fbf4be4ea3c0bf7556d85f2d846525ea2f955ae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.645404 containerd[1639]: time="2026-03-13T00:36:59.645365058Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7z8mv,Uid:129d677b-52bb-4b01-a36f-4d698e8848e6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc34f6922ac4753f1267cdc81fbf4be4ea3c0bf7556d85f2d846525ea2f955ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.646174 kubelet[2856]: E0313 00:36:59.645766 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc34f6922ac4753f1267cdc81fbf4be4ea3c0bf7556d85f2d846525ea2f955ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.646174 kubelet[2856]: E0313 00:36:59.645833 2856 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"dc34f6922ac4753f1267cdc81fbf4be4ea3c0bf7556d85f2d846525ea2f955ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7z8mv" Mar 13 00:36:59.646174 kubelet[2856]: E0313 00:36:59.645852 2856 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc34f6922ac4753f1267cdc81fbf4be4ea3c0bf7556d85f2d846525ea2f955ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7z8mv" Mar 13 00:36:59.646668 kubelet[2856]: E0313 00:36:59.645902 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7z8mv_calico-system(129d677b-52bb-4b01-a36f-4d698e8848e6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7z8mv_calico-system(129d677b-52bb-4b01-a36f-4d698e8848e6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dc34f6922ac4753f1267cdc81fbf4be4ea3c0bf7556d85f2d846525ea2f955ae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7z8mv" podUID="129d677b-52bb-4b01-a36f-4d698e8848e6" Mar 13 00:36:59.691476 containerd[1639]: time="2026-03-13T00:36:59.691435459Z" level=error msg="Failed to destroy network for sandbox \"d308573cfdd50613aac2608e9593a371881e04c597c4695af6bab3fe72cef4b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.696668 
containerd[1639]: time="2026-03-13T00:36:59.696631196Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mqhpg,Uid:e3354a6d-6ea0-4ffe-b3e8-3e508fd79239,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d308573cfdd50613aac2608e9593a371881e04c597c4695af6bab3fe72cef4b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.696854 kubelet[2856]: E0313 00:36:59.696824 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d308573cfdd50613aac2608e9593a371881e04c597c4695af6bab3fe72cef4b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.696913 kubelet[2856]: E0313 00:36:59.696878 2856 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d308573cfdd50613aac2608e9593a371881e04c597c4695af6bab3fe72cef4b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-mqhpg" Mar 13 00:36:59.696913 kubelet[2856]: E0313 00:36:59.696897 2856 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d308573cfdd50613aac2608e9593a371881e04c597c4695af6bab3fe72cef4b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-mqhpg" Mar 13 
00:36:59.696964 kubelet[2856]: E0313 00:36:59.696943 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-mqhpg_kube-system(e3354a6d-6ea0-4ffe-b3e8-3e508fd79239)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-mqhpg_kube-system(e3354a6d-6ea0-4ffe-b3e8-3e508fd79239)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d308573cfdd50613aac2608e9593a371881e04c597c4695af6bab3fe72cef4b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-mqhpg" podUID="e3354a6d-6ea0-4ffe-b3e8-3e508fd79239" Mar 13 00:36:59.726201 containerd[1639]: time="2026-03-13T00:36:59.726160834Z" level=error msg="Failed to destroy network for sandbox \"2e9bc9471e770eff07e9b315f1c9b59f02c9032c276b9b76af5c05682a3a27dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.728634 containerd[1639]: time="2026-03-13T00:36:59.728603406Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f874f7b7c-7zt6p,Uid:cc4c988f-bdbe-466c-886b-517505203e3b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e9bc9471e770eff07e9b315f1c9b59f02c9032c276b9b76af5c05682a3a27dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.729395 kubelet[2856]: E0313 00:36:59.728948 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2e9bc9471e770eff07e9b315f1c9b59f02c9032c276b9b76af5c05682a3a27dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.729395 kubelet[2856]: E0313 00:36:59.729035 2856 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e9bc9471e770eff07e9b315f1c9b59f02c9032c276b9b76af5c05682a3a27dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-f874f7b7c-7zt6p" Mar 13 00:36:59.729395 kubelet[2856]: E0313 00:36:59.729054 2856 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e9bc9471e770eff07e9b315f1c9b59f02c9032c276b9b76af5c05682a3a27dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-f874f7b7c-7zt6p" Mar 13 00:36:59.730420 kubelet[2856]: E0313 00:36:59.729109 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-f874f7b7c-7zt6p_calico-system(cc4c988f-bdbe-466c-886b-517505203e3b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-f874f7b7c-7zt6p_calico-system(cc4c988f-bdbe-466c-886b-517505203e3b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e9bc9471e770eff07e9b315f1c9b59f02c9032c276b9b76af5c05682a3a27dd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-f874f7b7c-7zt6p" 
podUID="cc4c988f-bdbe-466c-886b-517505203e3b" Mar 13 00:36:59.755342 containerd[1639]: time="2026-03-13T00:36:59.755300109Z" level=error msg="Failed to destroy network for sandbox \"73cbaa3114e08afd140e3bb162f4366c5d416dff26fadec5a5627fc7c26d7ff1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.756344 containerd[1639]: time="2026-03-13T00:36:59.756281888Z" level=error msg="Failed to destroy network for sandbox \"97bd38704d70388d802b1877480843719a0b616e1681d1a08029696e514cd2e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.757306 containerd[1639]: time="2026-03-13T00:36:59.757279861Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658d87f884-8zfwm,Uid:ba1a71bc-658c-494d-92f9-f4db1f8ea894,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"73cbaa3114e08afd140e3bb162f4366c5d416dff26fadec5a5627fc7c26d7ff1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.757802 kubelet[2856]: E0313 00:36:59.757554 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73cbaa3114e08afd140e3bb162f4366c5d416dff26fadec5a5627fc7c26d7ff1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.757802 kubelet[2856]: E0313 00:36:59.757638 2856 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"73cbaa3114e08afd140e3bb162f4366c5d416dff26fadec5a5627fc7c26d7ff1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-658d87f884-8zfwm" Mar 13 00:36:59.757802 kubelet[2856]: E0313 00:36:59.757747 2856 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73cbaa3114e08afd140e3bb162f4366c5d416dff26fadec5a5627fc7c26d7ff1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-658d87f884-8zfwm" Mar 13 00:36:59.757938 kubelet[2856]: E0313 00:36:59.757918 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-658d87f884-8zfwm_calico-system(ba1a71bc-658c-494d-92f9-f4db1f8ea894)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-658d87f884-8zfwm_calico-system(ba1a71bc-658c-494d-92f9-f4db1f8ea894)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"73cbaa3114e08afd140e3bb162f4366c5d416dff26fadec5a5627fc7c26d7ff1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-658d87f884-8zfwm" podUID="ba1a71bc-658c-494d-92f9-f4db1f8ea894" Mar 13 00:36:59.759411 containerd[1639]: time="2026-03-13T00:36:59.759371642Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-w6hs4,Uid:7218a6dc-e47d-4c9f-9b07-d8349ede2479,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"97bd38704d70388d802b1877480843719a0b616e1681d1a08029696e514cd2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.759980 kubelet[2856]: E0313 00:36:59.759944 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97bd38704d70388d802b1877480843719a0b616e1681d1a08029696e514cd2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.760266 kubelet[2856]: E0313 00:36:59.760164 2856 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97bd38704d70388d802b1877480843719a0b616e1681d1a08029696e514cd2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-w6hs4" Mar 13 00:36:59.760266 kubelet[2856]: E0313 00:36:59.760185 2856 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97bd38704d70388d802b1877480843719a0b616e1681d1a08029696e514cd2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-w6hs4" Mar 13 00:36:59.760266 kubelet[2856]: E0313 00:36:59.760239 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-w6hs4_calico-system(7218a6dc-e47d-4c9f-9b07-d8349ede2479)\" with CreatePodSandboxError: \"Failed to create sandbox for 
pod \\\"goldmane-5b85766d88-w6hs4_calico-system(7218a6dc-e47d-4c9f-9b07-d8349ede2479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"97bd38704d70388d802b1877480843719a0b616e1681d1a08029696e514cd2e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5b85766d88-w6hs4" podUID="7218a6dc-e47d-4c9f-9b07-d8349ede2479" Mar 13 00:36:59.760715 containerd[1639]: time="2026-03-13T00:36:59.760683630Z" level=error msg="Failed to destroy network for sandbox \"49ea5b855ed4d9149225cb5ed856d43756842650de6ccaca261f7c0ca59a0e68\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.762377 containerd[1639]: time="2026-03-13T00:36:59.762341372Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4hngh,Uid:10c18fc6-35a9-4a29-b0ed-122436522760,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"49ea5b855ed4d9149225cb5ed856d43756842650de6ccaca261f7c0ca59a0e68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.763018 kubelet[2856]: E0313 00:36:59.762977 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49ea5b855ed4d9149225cb5ed856d43756842650de6ccaca261f7c0ca59a0e68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.763238 kubelet[2856]: E0313 00:36:59.763028 2856 
kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49ea5b855ed4d9149225cb5ed856d43756842650de6ccaca261f7c0ca59a0e68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-4hngh" Mar 13 00:36:59.763238 kubelet[2856]: E0313 00:36:59.763045 2856 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49ea5b855ed4d9149225cb5ed856d43756842650de6ccaca261f7c0ca59a0e68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-4hngh" Mar 13 00:36:59.763238 kubelet[2856]: E0313 00:36:59.763092 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-4hngh_kube-system(10c18fc6-35a9-4a29-b0ed-122436522760)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-4hngh_kube-system(10c18fc6-35a9-4a29-b0ed-122436522760)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"49ea5b855ed4d9149225cb5ed856d43756842650de6ccaca261f7c0ca59a0e68\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-4hngh" podUID="10c18fc6-35a9-4a29-b0ed-122436522760" Mar 13 00:36:59.766260 containerd[1639]: time="2026-03-13T00:36:59.766236064Z" level=error msg="Failed to destroy network for sandbox \"bae0f1d2d6cdcf0e9b70cdb86925bc8b6517fc2ff81a8d58a53b3a490bed1db4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.769218 containerd[1639]: time="2026-03-13T00:36:59.769190293Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658d87f884-kc54z,Uid:95fefb4b-b753-41cf-8c25-ddb2b4541f4b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bae0f1d2d6cdcf0e9b70cdb86925bc8b6517fc2ff81a8d58a53b3a490bed1db4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.769449 kubelet[2856]: E0313 00:36:59.769426 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bae0f1d2d6cdcf0e9b70cdb86925bc8b6517fc2ff81a8d58a53b3a490bed1db4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.769523 kubelet[2856]: E0313 00:36:59.769513 2856 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bae0f1d2d6cdcf0e9b70cdb86925bc8b6517fc2ff81a8d58a53b3a490bed1db4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-658d87f884-kc54z" Mar 13 00:36:59.769640 kubelet[2856]: E0313 00:36:59.769573 2856 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bae0f1d2d6cdcf0e9b70cdb86925bc8b6517fc2ff81a8d58a53b3a490bed1db4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-658d87f884-kc54z" Mar 13 00:36:59.769640 kubelet[2856]: E0313 00:36:59.769613 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-658d87f884-kc54z_calico-system(95fefb4b-b753-41cf-8c25-ddb2b4541f4b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-658d87f884-kc54z_calico-system(95fefb4b-b753-41cf-8c25-ddb2b4541f4b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bae0f1d2d6cdcf0e9b70cdb86925bc8b6517fc2ff81a8d58a53b3a490bed1db4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-658d87f884-kc54z" podUID="95fefb4b-b753-41cf-8c25-ddb2b4541f4b" Mar 13 00:36:59.772678 containerd[1639]: time="2026-03-13T00:36:59.772654621Z" level=error msg="Failed to destroy network for sandbox \"de507e34182a16af79614f3ce639ed623d91814ce5e718d80b2b8351ce49a3ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.774299 containerd[1639]: time="2026-03-13T00:36:59.774276810Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697dc6db9-lsdmd,Uid:3a859d32-64de-4fa7-bc30-293dcef00461,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"de507e34182a16af79614f3ce639ed623d91814ce5e718d80b2b8351ce49a3ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.774456 kubelet[2856]: 
E0313 00:36:59.774419 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de507e34182a16af79614f3ce639ed623d91814ce5e718d80b2b8351ce49a3ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:36:59.774562 kubelet[2856]: E0313 00:36:59.774520 2856 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de507e34182a16af79614f3ce639ed623d91814ce5e718d80b2b8351ce49a3ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697dc6db9-lsdmd" Mar 13 00:36:59.774562 kubelet[2856]: E0313 00:36:59.774540 2856 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de507e34182a16af79614f3ce639ed623d91814ce5e718d80b2b8351ce49a3ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697dc6db9-lsdmd" Mar 13 00:36:59.774679 kubelet[2856]: E0313 00:36:59.774652 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697dc6db9-lsdmd_calico-system(3a859d32-64de-4fa7-bc30-293dcef00461)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697dc6db9-lsdmd_calico-system(3a859d32-64de-4fa7-bc30-293dcef00461)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de507e34182a16af79614f3ce639ed623d91814ce5e718d80b2b8351ce49a3ec\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697dc6db9-lsdmd" podUID="3a859d32-64de-4fa7-bc30-293dcef00461" Mar 13 00:36:59.974360 containerd[1639]: time="2026-03-13T00:36:59.974317607Z" level=info msg="CreateContainer within sandbox \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 13 00:36:59.988591 containerd[1639]: time="2026-03-13T00:36:59.985784754Z" level=info msg="Container 72c4204159c9eea20f6165e4b8722fe227fc9601e06fbe5ccdeb19cfc2949e9d: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:00.003291 containerd[1639]: time="2026-03-13T00:37:00.003244926Z" level=info msg="CreateContainer within sandbox \"cee249322bb838c79b6107e34b71f57b6fb7c6577f1ba04dac3bd2c2940e7e54\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"72c4204159c9eea20f6165e4b8722fe227fc9601e06fbe5ccdeb19cfc2949e9d\"" Mar 13 00:37:00.004046 containerd[1639]: time="2026-03-13T00:37:00.004019833Z" level=info msg="StartContainer for \"72c4204159c9eea20f6165e4b8722fe227fc9601e06fbe5ccdeb19cfc2949e9d\"" Mar 13 00:37:00.006038 containerd[1639]: time="2026-03-13T00:37:00.006001102Z" level=info msg="connecting to shim 72c4204159c9eea20f6165e4b8722fe227fc9601e06fbe5ccdeb19cfc2949e9d" address="unix:///run/containerd/s/11587104b787309f31d7f0f14f16784c0047e26f0c7fa0af637ad6acdc1a0c7b" protocol=ttrpc version=3 Mar 13 00:37:00.036736 systemd[1]: Started cri-containerd-72c4204159c9eea20f6165e4b8722fe227fc9601e06fbe5ccdeb19cfc2949e9d.scope - libcontainer container 72c4204159c9eea20f6165e4b8722fe227fc9601e06fbe5ccdeb19cfc2949e9d. 
Mar 13 00:37:00.106801 containerd[1639]: time="2026-03-13T00:37:00.106713155Z" level=info msg="StartContainer for \"72c4204159c9eea20f6165e4b8722fe227fc9601e06fbe5ccdeb19cfc2949e9d\" returns successfully" Mar 13 00:37:00.290872 kubelet[2856]: I0313 00:37:00.290766 2856 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/cc4c988f-bdbe-466c-886b-517505203e3b-nginx-config\") pod \"cc4c988f-bdbe-466c-886b-517505203e3b\" (UID: \"cc4c988f-bdbe-466c-886b-517505203e3b\") " Mar 13 00:37:00.290872 kubelet[2856]: I0313 00:37:00.290807 2856 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cc4c988f-bdbe-466c-886b-517505203e3b-whisker-backend-key-pair\") pod \"cc4c988f-bdbe-466c-886b-517505203e3b\" (UID: \"cc4c988f-bdbe-466c-886b-517505203e3b\") " Mar 13 00:37:00.290872 kubelet[2856]: I0313 00:37:00.290832 2856 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8flx\" (UniqueName: \"kubernetes.io/projected/cc4c988f-bdbe-466c-886b-517505203e3b-kube-api-access-d8flx\") pod \"cc4c988f-bdbe-466c-886b-517505203e3b\" (UID: \"cc4c988f-bdbe-466c-886b-517505203e3b\") " Mar 13 00:37:00.290872 kubelet[2856]: I0313 00:37:00.290856 2856 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc4c988f-bdbe-466c-886b-517505203e3b-whisker-ca-bundle\") pod \"cc4c988f-bdbe-466c-886b-517505203e3b\" (UID: \"cc4c988f-bdbe-466c-886b-517505203e3b\") " Mar 13 00:37:00.293133 kubelet[2856]: I0313 00:37:00.293102 2856 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc4c988f-bdbe-466c-886b-517505203e3b-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "cc4c988f-bdbe-466c-886b-517505203e3b" (UID: "cc4c988f-bdbe-466c-886b-517505203e3b"). 
InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 13 00:37:00.293533 kubelet[2856]: I0313 00:37:00.293513 2856 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc4c988f-bdbe-466c-886b-517505203e3b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "cc4c988f-bdbe-466c-886b-517505203e3b" (UID: "cc4c988f-bdbe-466c-886b-517505203e3b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 13 00:37:00.296379 systemd[1]: var-lib-kubelet-pods-cc4c988f\x2dbdbe\x2d466c\x2d886b\x2d517505203e3b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dd8flx.mount: Deactivated successfully. Mar 13 00:37:00.297614 kubelet[2856]: I0313 00:37:00.296886 2856 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4c988f-bdbe-466c-886b-517505203e3b-kube-api-access-d8flx" (OuterVolumeSpecName: "kube-api-access-d8flx") pod "cc4c988f-bdbe-466c-886b-517505203e3b" (UID: "cc4c988f-bdbe-466c-886b-517505203e3b"). InnerVolumeSpecName "kube-api-access-d8flx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 13 00:37:00.299374 kubelet[2856]: I0313 00:37:00.299344 2856 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4c988f-bdbe-466c-886b-517505203e3b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "cc4c988f-bdbe-466c-886b-517505203e3b" (UID: "cc4c988f-bdbe-466c-886b-517505203e3b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 13 00:37:00.300425 systemd[1]: var-lib-kubelet-pods-cc4c988f\x2dbdbe\x2d466c\x2d886b\x2d517505203e3b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 13 00:37:00.391212 kubelet[2856]: I0313 00:37:00.391137 2856 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/cc4c988f-bdbe-466c-886b-517505203e3b-nginx-config\") on node \"ci-4459-2-4-n-23cf6448d4\" DevicePath \"\"" Mar 13 00:37:00.391212 kubelet[2856]: I0313 00:37:00.391170 2856 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cc4c988f-bdbe-466c-886b-517505203e3b-whisker-backend-key-pair\") on node \"ci-4459-2-4-n-23cf6448d4\" DevicePath \"\"" Mar 13 00:37:00.391212 kubelet[2856]: I0313 00:37:00.391180 2856 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d8flx\" (UniqueName: \"kubernetes.io/projected/cc4c988f-bdbe-466c-886b-517505203e3b-kube-api-access-d8flx\") on node \"ci-4459-2-4-n-23cf6448d4\" DevicePath \"\"" Mar 13 00:37:00.391212 kubelet[2856]: I0313 00:37:00.391188 2856 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc4c988f-bdbe-466c-886b-517505203e3b-whisker-ca-bundle\") on node \"ci-4459-2-4-n-23cf6448d4\" DevicePath \"\"" Mar 13 00:37:00.417481 systemd[1]: Removed slice kubepods-besteffort-podcc4c988f_bdbe_466c_886b_517505203e3b.slice - libcontainer container kubepods-besteffort-podcc4c988f_bdbe_466c_886b_517505203e3b.slice. 
Mar 13 00:37:01.009189 kubelet[2856]: I0313 00:37:01.008541 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5n78g" podStartSLOduration=5.934502725 podStartE2EDuration="22.008520655s" podCreationTimestamp="2026-03-13 00:36:39 +0000 UTC" firstStartedPulling="2026-03-13 00:36:40.130289909 +0000 UTC m=+18.009396763" lastFinishedPulling="2026-03-13 00:36:56.204307833 +0000 UTC m=+34.083414693" observedRunningTime="2026-03-13 00:37:00.994350535 +0000 UTC m=+38.873457417" watchObservedRunningTime="2026-03-13 00:37:01.008520655 +0000 UTC m=+38.887627510" Mar 13 00:37:01.053495 systemd[1]: Created slice kubepods-besteffort-pod2629a7a3_baaf_4147_8546_f8ce9619b915.slice - libcontainer container kubepods-besteffort-pod2629a7a3_baaf_4147_8546_f8ce9619b915.slice. Mar 13 00:37:01.095899 kubelet[2856]: I0313 00:37:01.095770 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bmm\" (UniqueName: \"kubernetes.io/projected/2629a7a3-baaf-4147-8546-f8ce9619b915-kube-api-access-c2bmm\") pod \"whisker-5d9cfcb886-p76dp\" (UID: \"2629a7a3-baaf-4147-8546-f8ce9619b915\") " pod="calico-system/whisker-5d9cfcb886-p76dp" Mar 13 00:37:01.095899 kubelet[2856]: I0313 00:37:01.095826 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2629a7a3-baaf-4147-8546-f8ce9619b915-nginx-config\") pod \"whisker-5d9cfcb886-p76dp\" (UID: \"2629a7a3-baaf-4147-8546-f8ce9619b915\") " pod="calico-system/whisker-5d9cfcb886-p76dp" Mar 13 00:37:01.095899 kubelet[2856]: I0313 00:37:01.095842 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2629a7a3-baaf-4147-8546-f8ce9619b915-whisker-backend-key-pair\") pod \"whisker-5d9cfcb886-p76dp\" (UID: \"2629a7a3-baaf-4147-8546-f8ce9619b915\") 
" pod="calico-system/whisker-5d9cfcb886-p76dp" Mar 13 00:37:01.095899 kubelet[2856]: I0313 00:37:01.095891 2856 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2629a7a3-baaf-4147-8546-f8ce9619b915-whisker-ca-bundle\") pod \"whisker-5d9cfcb886-p76dp\" (UID: \"2629a7a3-baaf-4147-8546-f8ce9619b915\") " pod="calico-system/whisker-5d9cfcb886-p76dp" Mar 13 00:37:01.357434 containerd[1639]: time="2026-03-13T00:37:01.357082994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d9cfcb886-p76dp,Uid:2629a7a3-baaf-4147-8546-f8ce9619b915,Namespace:calico-system,Attempt:0,}" Mar 13 00:37:01.576081 systemd-networkd[1523]: cali6eaff358f55: Link UP Mar 13 00:37:01.576236 systemd-networkd[1523]: cali6eaff358f55: Gained carrier Mar 13 00:37:01.618702 containerd[1639]: 2026-03-13 00:37:01.382 [ERROR][3971] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 13 00:37:01.618702 containerd[1639]: 2026-03-13 00:37:01.454 [INFO][3971] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0 whisker-5d9cfcb886- calico-system 2629a7a3-baaf-4147-8546-f8ce9619b915 895 0 2026-03-13 00:37:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5d9cfcb886 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-n-23cf6448d4 whisker-5d9cfcb886-p76dp eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali6eaff358f55 [] [] }} ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Namespace="calico-system" Pod="whisker-5d9cfcb886-p76dp" 
WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-" Mar 13 00:37:01.618702 containerd[1639]: 2026-03-13 00:37:01.454 [INFO][3971] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Namespace="calico-system" Pod="whisker-5d9cfcb886-p76dp" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" Mar 13 00:37:01.618702 containerd[1639]: 2026-03-13 00:37:01.491 [INFO][4036] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" HandleID="k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Workload="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.499 [INFO][4036] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" HandleID="k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Workload="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fddd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-23cf6448d4", "pod":"whisker-5d9cfcb886-p76dp", "timestamp":"2026-03-13 00:37:01.491328072 +0000 UTC"}, Hostname:"ci-4459-2-4-n-23cf6448d4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002dbb80)} Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.499 [INFO][4036] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.499 [INFO][4036] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.499 [INFO][4036] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-23cf6448d4' Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.502 [INFO][4036] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.508 [INFO][4036] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.512 [INFO][4036] ipam/ipam.go 526: Trying affinity for 192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.515 [INFO][4036] ipam/ipam.go 160: Attempting to load block cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.618938 containerd[1639]: 2026-03-13 00:37:01.518 [INFO][4036] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.619119 containerd[1639]: 2026-03-13 00:37:01.518 [INFO][4036] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.107.0/26 handle="k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.619119 containerd[1639]: 2026-03-13 00:37:01.527 [INFO][4036] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add Mar 13 00:37:01.619119 containerd[1639]: 2026-03-13 00:37:01.539 [INFO][4036] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.107.0/26 handle="k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" 
host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.619119 containerd[1639]: 2026-03-13 00:37:01.557 [INFO][4036] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.107.1/26] block=192.168.107.0/26 handle="k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.619119 containerd[1639]: 2026-03-13 00:37:01.557 [INFO][4036] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.107.1/26] handle="k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:01.619119 containerd[1639]: 2026-03-13 00:37:01.558 [INFO][4036] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:37:01.619119 containerd[1639]: 2026-03-13 00:37:01.558 [INFO][4036] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.107.1/26] IPv6=[] ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" HandleID="k8s-pod-network.228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Workload="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" Mar 13 00:37:01.619248 containerd[1639]: 2026-03-13 00:37:01.561 [INFO][3971] cni-plugin/k8s.go 418: Populated endpoint ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Namespace="calico-system" Pod="whisker-5d9cfcb886-p76dp" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0", GenerateName:"whisker-5d9cfcb886-", Namespace:"calico-system", SelfLink:"", UID:"2629a7a3-baaf-4147-8546-f8ce9619b915", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5d9cfcb886", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"", Pod:"whisker-5d9cfcb886-p76dp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.107.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali6eaff358f55", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:01.619248 containerd[1639]: 2026-03-13 00:37:01.561 [INFO][3971] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.1/32] ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Namespace="calico-system" Pod="whisker-5d9cfcb886-p76dp" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" Mar 13 00:37:01.619321 containerd[1639]: 2026-03-13 00:37:01.561 [INFO][3971] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6eaff358f55 ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Namespace="calico-system" Pod="whisker-5d9cfcb886-p76dp" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" Mar 13 00:37:01.619321 containerd[1639]: 2026-03-13 00:37:01.576 [INFO][3971] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Namespace="calico-system" Pod="whisker-5d9cfcb886-p76dp" 
WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" Mar 13 00:37:01.619361 containerd[1639]: 2026-03-13 00:37:01.577 [INFO][3971] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Namespace="calico-system" Pod="whisker-5d9cfcb886-p76dp" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0", GenerateName:"whisker-5d9cfcb886-", Namespace:"calico-system", SelfLink:"", UID:"2629a7a3-baaf-4147-8546-f8ce9619b915", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 37, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5d9cfcb886", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add", Pod:"whisker-5d9cfcb886-p76dp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.107.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali6eaff358f55", MAC:"46:f1:40:7a:d8:2e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:01.619409 
containerd[1639]: 2026-03-13 00:37:01.613 [INFO][3971] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" Namespace="calico-system" Pod="whisker-5d9cfcb886-p76dp" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-whisker--5d9cfcb886--p76dp-eth0" Mar 13 00:37:01.692028 containerd[1639]: time="2026-03-13T00:37:01.691988505Z" level=info msg="connecting to shim 228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add" address="unix:///run/containerd/s/94aaae408103928c64178379a31d10d45097da2522256a1e432560c75e5504df" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:01.731752 systemd[1]: Started cri-containerd-228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add.scope - libcontainer container 228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add. Mar 13 00:37:01.818458 containerd[1639]: time="2026-03-13T00:37:01.818163640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d9cfcb886-p76dp,Uid:2629a7a3-baaf-4147-8546-f8ce9619b915,Namespace:calico-system,Attempt:0,} returns sandbox id \"228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add\"" Mar 13 00:37:01.820467 containerd[1639]: time="2026-03-13T00:37:01.820415944Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 13 00:37:02.414758 systemd-networkd[1523]: vxlan.calico: Link UP Mar 13 00:37:02.416708 kubelet[2856]: I0313 00:37:02.416488 2856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4c988f-bdbe-466c-886b-517505203e3b" path="/var/lib/kubelet/pods/cc4c988f-bdbe-466c-886b-517505203e3b/volumes" Mar 13 00:37:02.414767 systemd-networkd[1523]: vxlan.calico: Gained carrier Mar 13 00:37:03.188598 containerd[1639]: time="2026-03-13T00:37:03.188518837Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:03.189702 
containerd[1639]: time="2026-03-13T00:37:03.189669033Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 13 00:37:03.191220 containerd[1639]: time="2026-03-13T00:37:03.191042396Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:03.193229 containerd[1639]: time="2026-03-13T00:37:03.193184918Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:03.193803 containerd[1639]: time="2026-03-13T00:37:03.193711962Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.3732682s" Mar 13 00:37:03.193803 containerd[1639]: time="2026-03-13T00:37:03.193734194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 13 00:37:03.198803 containerd[1639]: time="2026-03-13T00:37:03.198780960Z" level=info msg="CreateContainer within sandbox \"228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 13 00:37:03.208441 containerd[1639]: time="2026-03-13T00:37:03.207694238Z" level=info msg="Container dccdb53ed6e64250062f86ccc1630ec5d58c4abe0b882c2b067d968ed260b7e2: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:03.220729 containerd[1639]: time="2026-03-13T00:37:03.220702462Z" level=info msg="CreateContainer within sandbox 
\"228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"dccdb53ed6e64250062f86ccc1630ec5d58c4abe0b882c2b067d968ed260b7e2\"" Mar 13 00:37:03.221898 containerd[1639]: time="2026-03-13T00:37:03.221871629Z" level=info msg="StartContainer for \"dccdb53ed6e64250062f86ccc1630ec5d58c4abe0b882c2b067d968ed260b7e2\"" Mar 13 00:37:03.223189 containerd[1639]: time="2026-03-13T00:37:03.223168040Z" level=info msg="connecting to shim dccdb53ed6e64250062f86ccc1630ec5d58c4abe0b882c2b067d968ed260b7e2" address="unix:///run/containerd/s/94aaae408103928c64178379a31d10d45097da2522256a1e432560c75e5504df" protocol=ttrpc version=3 Mar 13 00:37:03.251705 systemd[1]: Started cri-containerd-dccdb53ed6e64250062f86ccc1630ec5d58c4abe0b882c2b067d968ed260b7e2.scope - libcontainer container dccdb53ed6e64250062f86ccc1630ec5d58c4abe0b882c2b067d968ed260b7e2. Mar 13 00:37:03.294401 containerd[1639]: time="2026-03-13T00:37:03.294375378Z" level=info msg="StartContainer for \"dccdb53ed6e64250062f86ccc1630ec5d58c4abe0b882c2b067d968ed260b7e2\" returns successfully" Mar 13 00:37:03.295699 containerd[1639]: time="2026-03-13T00:37:03.295675656Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 13 00:37:03.353797 systemd-networkd[1523]: cali6eaff358f55: Gained IPv6LL Mar 13 00:37:03.547712 systemd-networkd[1523]: vxlan.calico: Gained IPv6LL Mar 13 00:37:04.823507 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1342042083.mount: Deactivated successfully. 
Mar 13 00:37:04.847041 containerd[1639]: time="2026-03-13T00:37:04.846993557Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:04.848371 containerd[1639]: time="2026-03-13T00:37:04.848242722Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 13 00:37:04.849945 containerd[1639]: time="2026-03-13T00:37:04.849926079Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:04.852373 containerd[1639]: time="2026-03-13T00:37:04.852347675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:04.853119 containerd[1639]: time="2026-03-13T00:37:04.852793712Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 1.556988472s" Mar 13 00:37:04.853119 containerd[1639]: time="2026-03-13T00:37:04.852819688Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 13 00:37:04.858139 containerd[1639]: time="2026-03-13T00:37:04.858116306Z" level=info msg="CreateContainer within sandbox \"228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 13 00:37:04.870162 
containerd[1639]: time="2026-03-13T00:37:04.868668815Z" level=info msg="Container 6d075f56fc0a314d38c2edb4af60ec720e483e523b2957d562d3956b67aabcc4: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:04.878867 containerd[1639]: time="2026-03-13T00:37:04.878837853Z" level=info msg="CreateContainer within sandbox \"228dffd3aa947acf0a8582fe040583cecf177f5964ede2c88554eb4b27d26add\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"6d075f56fc0a314d38c2edb4af60ec720e483e523b2957d562d3956b67aabcc4\"" Mar 13 00:37:04.879536 containerd[1639]: time="2026-03-13T00:37:04.879287308Z" level=info msg="StartContainer for \"6d075f56fc0a314d38c2edb4af60ec720e483e523b2957d562d3956b67aabcc4\"" Mar 13 00:37:04.880274 containerd[1639]: time="2026-03-13T00:37:04.880236705Z" level=info msg="connecting to shim 6d075f56fc0a314d38c2edb4af60ec720e483e523b2957d562d3956b67aabcc4" address="unix:///run/containerd/s/94aaae408103928c64178379a31d10d45097da2522256a1e432560c75e5504df" protocol=ttrpc version=3 Mar 13 00:37:04.899708 systemd[1]: Started cri-containerd-6d075f56fc0a314d38c2edb4af60ec720e483e523b2957d562d3956b67aabcc4.scope - libcontainer container 6d075f56fc0a314d38c2edb4af60ec720e483e523b2957d562d3956b67aabcc4. 
Mar 13 00:37:04.945610 containerd[1639]: time="2026-03-13T00:37:04.945576914Z" level=info msg="StartContainer for \"6d075f56fc0a314d38c2edb4af60ec720e483e523b2957d562d3956b67aabcc4\" returns successfully" Mar 13 00:37:11.409982 containerd[1639]: time="2026-03-13T00:37:11.409908454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658d87f884-kc54z,Uid:95fefb4b-b753-41cf-8c25-ddb2b4541f4b,Namespace:calico-system,Attempt:0,}" Mar 13 00:37:11.410896 containerd[1639]: time="2026-03-13T00:37:11.410862582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697dc6db9-lsdmd,Uid:3a859d32-64de-4fa7-bc30-293dcef00461,Namespace:calico-system,Attempt:0,}" Mar 13 00:37:11.554827 systemd-networkd[1523]: calif2ed5b0a34a: Link UP Mar 13 00:37:11.555447 systemd-networkd[1523]: calif2ed5b0a34a: Gained carrier Mar 13 00:37:11.567609 kubelet[2856]: I0313 00:37:11.567232 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5d9cfcb886-p76dp" podStartSLOduration=7.533838285 podStartE2EDuration="10.567215515s" podCreationTimestamp="2026-03-13 00:37:01 +0000 UTC" firstStartedPulling="2026-03-13 00:37:01.820161744 +0000 UTC m=+39.699268599" lastFinishedPulling="2026-03-13 00:37:04.853538974 +0000 UTC m=+42.732645829" observedRunningTime="2026-03-13 00:37:04.98340729 +0000 UTC m=+42.862514167" watchObservedRunningTime="2026-03-13 00:37:11.567215515 +0000 UTC m=+49.446322392" Mar 13 00:37:11.571529 containerd[1639]: 2026-03-13 00:37:11.479 [INFO][4370] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0 calico-apiserver-658d87f884- calico-system 95fefb4b-b753-41cf-8c25-ddb2b4541f4b 839 0 2026-03-13 00:36:39 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:658d87f884 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-23cf6448d4 calico-apiserver-658d87f884-kc54z eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calif2ed5b0a34a [] [] }} ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Namespace="calico-system" Pod="calico-apiserver-658d87f884-kc54z" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-" Mar 13 00:37:11.571529 containerd[1639]: 2026-03-13 00:37:11.479 [INFO][4370] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Namespace="calico-system" Pod="calico-apiserver-658d87f884-kc54z" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" Mar 13 00:37:11.571529 containerd[1639]: 2026-03-13 00:37:11.509 [INFO][4393] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" HandleID="k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.521 [INFO][4393] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" HandleID="k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277af0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-23cf6448d4", "pod":"calico-apiserver-658d87f884-kc54z", "timestamp":"2026-03-13 00:37:11.509429462 +0000 UTC"}, 
Hostname:"ci-4459-2-4-n-23cf6448d4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000204f20)} Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.521 [INFO][4393] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.521 [INFO][4393] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.521 [INFO][4393] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-23cf6448d4' Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.524 [INFO][4393] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.528 [INFO][4393] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.533 [INFO][4393] ipam/ipam.go 526: Trying affinity for 192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.538 [INFO][4393] ipam/ipam.go 160: Attempting to load block cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571720 containerd[1639]: 2026-03-13 00:37:11.540 [INFO][4393] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571946 containerd[1639]: 2026-03-13 00:37:11.540 [INFO][4393] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.107.0/26 handle="k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571946 
containerd[1639]: 2026-03-13 00:37:11.542 [INFO][4393] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae Mar 13 00:37:11.571946 containerd[1639]: 2026-03-13 00:37:11.545 [INFO][4393] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.107.0/26 handle="k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571946 containerd[1639]: 2026-03-13 00:37:11.549 [INFO][4393] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.107.2/26] block=192.168.107.0/26 handle="k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571946 containerd[1639]: 2026-03-13 00:37:11.549 [INFO][4393] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.107.2/26] handle="k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.571946 containerd[1639]: 2026-03-13 00:37:11.549 [INFO][4393] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 13 00:37:11.571946 containerd[1639]: 2026-03-13 00:37:11.549 [INFO][4393] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.107.2/26] IPv6=[] ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" HandleID="k8s-pod-network.a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" Mar 13 00:37:11.572095 containerd[1639]: 2026-03-13 00:37:11.552 [INFO][4370] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Namespace="calico-system" Pod="calico-apiserver-658d87f884-kc54z" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0", GenerateName:"calico-apiserver-658d87f884-", Namespace:"calico-system", SelfLink:"", UID:"95fefb4b-b753-41cf-8c25-ddb2b4541f4b", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658d87f884", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"", Pod:"calico-apiserver-658d87f884-kc54z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.107.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif2ed5b0a34a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:11.572146 containerd[1639]: 2026-03-13 00:37:11.552 [INFO][4370] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.2/32] ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Namespace="calico-system" Pod="calico-apiserver-658d87f884-kc54z" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" Mar 13 00:37:11.572146 containerd[1639]: 2026-03-13 00:37:11.552 [INFO][4370] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif2ed5b0a34a ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Namespace="calico-system" Pod="calico-apiserver-658d87f884-kc54z" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" Mar 13 00:37:11.572146 containerd[1639]: 2026-03-13 00:37:11.555 [INFO][4370] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Namespace="calico-system" Pod="calico-apiserver-658d87f884-kc54z" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" Mar 13 00:37:11.572225 containerd[1639]: 2026-03-13 00:37:11.556 [INFO][4370] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Namespace="calico-system" Pod="calico-apiserver-658d87f884-kc54z" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0", GenerateName:"calico-apiserver-658d87f884-", Namespace:"calico-system", SelfLink:"", UID:"95fefb4b-b753-41cf-8c25-ddb2b4541f4b", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658d87f884", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae", Pod:"calico-apiserver-658d87f884-kc54z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.107.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif2ed5b0a34a", MAC:"a2:cb:ff:0e:31:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:11.572275 containerd[1639]: 2026-03-13 00:37:11.563 [INFO][4370] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" Namespace="calico-system" Pod="calico-apiserver-658d87f884-kc54z" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--kc54z-eth0" Mar 13 00:37:11.603474 containerd[1639]: time="2026-03-13T00:37:11.603402335Z" level=info 
msg="connecting to shim a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae" address="unix:///run/containerd/s/e4182be685784ca579fa91edaf4be422d364c135006556805edc021ab57271e3" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:11.634736 systemd[1]: Started cri-containerd-a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae.scope - libcontainer container a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae. Mar 13 00:37:11.665970 systemd-networkd[1523]: cali751c9f271fc: Link UP Mar 13 00:37:11.666806 systemd-networkd[1523]: cali751c9f271fc: Gained carrier Mar 13 00:37:11.686267 containerd[1639]: 2026-03-13 00:37:11.493 [INFO][4376] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0 calico-kube-controllers-697dc6db9- calico-system 3a859d32-64de-4fa7-bc30-293dcef00461 841 0 2026-03-13 00:36:39 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:697dc6db9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-n-23cf6448d4 calico-kube-controllers-697dc6db9-lsdmd eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali751c9f271fc [] [] }} ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Namespace="calico-system" Pod="calico-kube-controllers-697dc6db9-lsdmd" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-" Mar 13 00:37:11.686267 containerd[1639]: 2026-03-13 00:37:11.493 [INFO][4376] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Namespace="calico-system" Pod="calico-kube-controllers-697dc6db9-lsdmd" 
WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" Mar 13 00:37:11.686267 containerd[1639]: 2026-03-13 00:37:11.532 [INFO][4398] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" HandleID="k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.539 [INFO][4398] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" HandleID="k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277c20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-23cf6448d4", "pod":"calico-kube-controllers-697dc6db9-lsdmd", "timestamp":"2026-03-13 00:37:11.532537719 +0000 UTC"}, Hostname:"ci-4459-2-4-n-23cf6448d4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000112dc0)} Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.539 [INFO][4398] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.549 [INFO][4398] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.550 [INFO][4398] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-23cf6448d4' Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.626 [INFO][4398] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.633 [INFO][4398] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.639 [INFO][4398] ipam/ipam.go 526: Trying affinity for 192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.641 [INFO][4398] ipam/ipam.go 160: Attempting to load block cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.686449 containerd[1639]: 2026-03-13 00:37:11.645 [INFO][4398] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.687951 containerd[1639]: 2026-03-13 00:37:11.645 [INFO][4398] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.107.0/26 handle="k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.687951 containerd[1639]: 2026-03-13 00:37:11.648 [INFO][4398] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f Mar 13 00:37:11.687951 containerd[1639]: 2026-03-13 00:37:11.653 [INFO][4398] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.107.0/26 handle="k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.687951 containerd[1639]: 2026-03-13 00:37:11.657 [INFO][4398] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.107.3/26] block=192.168.107.0/26 handle="k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.687951 containerd[1639]: 2026-03-13 00:37:11.658 [INFO][4398] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.107.3/26] handle="k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:11.687951 containerd[1639]: 2026-03-13 00:37:11.658 [INFO][4398] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:37:11.687951 containerd[1639]: 2026-03-13 00:37:11.658 [INFO][4398] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.107.3/26] IPv6=[] ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" HandleID="k8s-pod-network.ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" Mar 13 00:37:11.688084 containerd[1639]: 2026-03-13 00:37:11.662 [INFO][4376] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Namespace="calico-system" Pod="calico-kube-controllers-697dc6db9-lsdmd" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0", GenerateName:"calico-kube-controllers-697dc6db9-", Namespace:"calico-system", SelfLink:"", UID:"3a859d32-64de-4fa7-bc30-293dcef00461", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697dc6db9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"", Pod:"calico-kube-controllers-697dc6db9-lsdmd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.107.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali751c9f271fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:11.688145 containerd[1639]: 2026-03-13 00:37:11.662 [INFO][4376] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.3/32] ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Namespace="calico-system" Pod="calico-kube-controllers-697dc6db9-lsdmd" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" Mar 13 00:37:11.688145 containerd[1639]: 2026-03-13 00:37:11.663 [INFO][4376] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali751c9f271fc ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Namespace="calico-system" Pod="calico-kube-controllers-697dc6db9-lsdmd" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" Mar 13 00:37:11.688145 containerd[1639]: 2026-03-13 00:37:11.666 [INFO][4376] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Namespace="calico-system" Pod="calico-kube-controllers-697dc6db9-lsdmd" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" Mar 13 00:37:11.688207 containerd[1639]: 2026-03-13 00:37:11.667 [INFO][4376] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Namespace="calico-system" Pod="calico-kube-controllers-697dc6db9-lsdmd" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0", GenerateName:"calico-kube-controllers-697dc6db9-", Namespace:"calico-system", SelfLink:"", UID:"3a859d32-64de-4fa7-bc30-293dcef00461", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697dc6db9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f", Pod:"calico-kube-controllers-697dc6db9-lsdmd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.107.3/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali751c9f271fc", MAC:"ca:86:e6:67:b8:d9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:11.688257 containerd[1639]: 2026-03-13 00:37:11.684 [INFO][4376] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" Namespace="calico-system" Pod="calico-kube-controllers-697dc6db9-lsdmd" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--kube--controllers--697dc6db9--lsdmd-eth0" Mar 13 00:37:11.711514 containerd[1639]: time="2026-03-13T00:37:11.710946101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658d87f884-kc54z,Uid:95fefb4b-b753-41cf-8c25-ddb2b4541f4b,Namespace:calico-system,Attempt:0,} returns sandbox id \"a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae\"" Mar 13 00:37:11.715606 containerd[1639]: time="2026-03-13T00:37:11.713996860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 13 00:37:11.727149 containerd[1639]: time="2026-03-13T00:37:11.727110382Z" level=info msg="connecting to shim ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f" address="unix:///run/containerd/s/f423147fc82d6101ba6b63b0e52a2b2268889b755480e295df3bd27cd0a18147" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:11.747739 systemd[1]: Started cri-containerd-ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f.scope - libcontainer container ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f. 
Mar 13 00:37:11.798353 containerd[1639]: time="2026-03-13T00:37:11.798318473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697dc6db9-lsdmd,Uid:3a859d32-64de-4fa7-bc30-293dcef00461,Namespace:calico-system,Attempt:0,} returns sandbox id \"ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f\"" Mar 13 00:37:12.411500 containerd[1639]: time="2026-03-13T00:37:12.411430908Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-w6hs4,Uid:7218a6dc-e47d-4c9f-9b07-d8349ede2479,Namespace:calico-system,Attempt:0,}" Mar 13 00:37:12.589144 systemd-networkd[1523]: cali04a94f46bcc: Link UP Mar 13 00:37:12.590074 systemd-networkd[1523]: cali04a94f46bcc: Gained carrier Mar 13 00:37:12.605675 containerd[1639]: 2026-03-13 00:37:12.499 [INFO][4544] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0 goldmane-5b85766d88- calico-system 7218a6dc-e47d-4c9f-9b07-d8349ede2479 842 0 2026-03-13 00:36:39 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-n-23cf6448d4 goldmane-5b85766d88-w6hs4 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali04a94f46bcc [] [] }} ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Namespace="calico-system" Pod="goldmane-5b85766d88-w6hs4" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-" Mar 13 00:37:12.605675 containerd[1639]: 2026-03-13 00:37:12.499 [INFO][4544] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Namespace="calico-system" Pod="goldmane-5b85766d88-w6hs4" 
WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" Mar 13 00:37:12.605675 containerd[1639]: 2026-03-13 00:37:12.540 [INFO][4552] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" HandleID="k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Workload="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.548 [INFO][4552] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" HandleID="k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Workload="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002f7e80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-23cf6448d4", "pod":"goldmane-5b85766d88-w6hs4", "timestamp":"2026-03-13 00:37:12.540926989 +0000 UTC"}, Hostname:"ci-4459-2-4-n-23cf6448d4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001942c0)} Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.549 [INFO][4552] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.549 [INFO][4552] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.549 [INFO][4552] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-23cf6448d4' Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.553 [INFO][4552] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.558 [INFO][4552] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.564 [INFO][4552] ipam/ipam.go 526: Trying affinity for 192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.566 [INFO][4552] ipam/ipam.go 160: Attempting to load block cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.605896 containerd[1639]: 2026-03-13 00:37:12.569 [INFO][4552] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.606103 containerd[1639]: 2026-03-13 00:37:12.569 [INFO][4552] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.107.0/26 handle="k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.606103 containerd[1639]: 2026-03-13 00:37:12.571 [INFO][4552] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4 Mar 13 00:37:12.606103 containerd[1639]: 2026-03-13 00:37:12.576 [INFO][4552] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.107.0/26 handle="k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.606103 containerd[1639]: 2026-03-13 00:37:12.584 [INFO][4552] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.107.4/26] block=192.168.107.0/26 handle="k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.606103 containerd[1639]: 2026-03-13 00:37:12.584 [INFO][4552] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.107.4/26] handle="k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:12.606103 containerd[1639]: 2026-03-13 00:37:12.584 [INFO][4552] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:37:12.606103 containerd[1639]: 2026-03-13 00:37:12.584 [INFO][4552] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.107.4/26] IPv6=[] ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" HandleID="k8s-pod-network.f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Workload="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" Mar 13 00:37:12.606234 containerd[1639]: 2026-03-13 00:37:12.586 [INFO][4544] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Namespace="calico-system" Pod="goldmane-5b85766d88-w6hs4" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"7218a6dc-e47d-4c9f-9b07-d8349ede2479", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"", Pod:"goldmane-5b85766d88-w6hs4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.107.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali04a94f46bcc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:12.606312 containerd[1639]: 2026-03-13 00:37:12.586 [INFO][4544] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.4/32] ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Namespace="calico-system" Pod="goldmane-5b85766d88-w6hs4" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" Mar 13 00:37:12.606312 containerd[1639]: 2026-03-13 00:37:12.586 [INFO][4544] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali04a94f46bcc ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Namespace="calico-system" Pod="goldmane-5b85766d88-w6hs4" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" Mar 13 00:37:12.606312 containerd[1639]: 2026-03-13 00:37:12.590 [INFO][4544] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Namespace="calico-system" Pod="goldmane-5b85766d88-w6hs4" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" Mar 13 00:37:12.606396 containerd[1639]: 2026-03-13 00:37:12.590 [INFO][4544] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Namespace="calico-system" Pod="goldmane-5b85766d88-w6hs4" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"7218a6dc-e47d-4c9f-9b07-d8349ede2479", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4", Pod:"goldmane-5b85766d88-w6hs4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.107.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali04a94f46bcc", MAC:"ca:d6:26:d8:36:eb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:12.606447 containerd[1639]: 2026-03-13 00:37:12.603 [INFO][4544] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" Namespace="calico-system" Pod="goldmane-5b85766d88-w6hs4" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-goldmane--5b85766d88--w6hs4-eth0" Mar 13 00:37:12.645578 containerd[1639]: time="2026-03-13T00:37:12.645269639Z" level=info msg="connecting to shim f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4" address="unix:///run/containerd/s/1d49f03140a96a697b8f609a84c0aff1e9fa34bd232c360c8e154ddfd873f2de" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:12.678749 systemd[1]: Started cri-containerd-f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4.scope - libcontainer container f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4. Mar 13 00:37:12.740206 containerd[1639]: time="2026-03-13T00:37:12.740163227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-w6hs4,Uid:7218a6dc-e47d-4c9f-9b07-d8349ede2479,Namespace:calico-system,Attempt:0,} returns sandbox id \"f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4\"" Mar 13 00:37:12.761711 systemd-networkd[1523]: cali751c9f271fc: Gained IPv6LL Mar 13 00:37:13.410736 containerd[1639]: time="2026-03-13T00:37:13.410528109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658d87f884-8zfwm,Uid:ba1a71bc-658c-494d-92f9-f4db1f8ea894,Namespace:calico-system,Attempt:0,}" Mar 13 00:37:13.411144 containerd[1639]: time="2026-03-13T00:37:13.411117570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7z8mv,Uid:129d677b-52bb-4b01-a36f-4d698e8848e6,Namespace:calico-system,Attempt:0,}" Mar 13 00:37:13.411341 containerd[1639]: time="2026-03-13T00:37:13.411321686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4hngh,Uid:10c18fc6-35a9-4a29-b0ed-122436522760,Namespace:kube-system,Attempt:0,}" Mar 13 00:37:13.411506 containerd[1639]: time="2026-03-13T00:37:13.411487053Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mqhpg,Uid:e3354a6d-6ea0-4ffe-b3e8-3e508fd79239,Namespace:kube-system,Attempt:0,}" Mar 13 00:37:13.593911 systemd-networkd[1523]: calif2ed5b0a34a: Gained IPv6LL Mar 13 00:37:13.692976 systemd-networkd[1523]: calib938b0e46ed: Link UP Mar 13 00:37:13.694765 systemd-networkd[1523]: calib938b0e46ed: Gained carrier Mar 13 00:37:13.720967 containerd[1639]: 2026-03-13 00:37:13.515 [INFO][4653] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0 coredns-674b8bbfcf- kube-system 10c18fc6-35a9-4a29-b0ed-122436522760 843 0 2026-03-13 00:36:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-23cf6448d4 coredns-674b8bbfcf-4hngh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib938b0e46ed [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Namespace="kube-system" Pod="coredns-674b8bbfcf-4hngh" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-" Mar 13 00:37:13.720967 containerd[1639]: 2026-03-13 00:37:13.517 [INFO][4653] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Namespace="kube-system" Pod="coredns-674b8bbfcf-4hngh" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" Mar 13 00:37:13.720967 containerd[1639]: 2026-03-13 00:37:13.613 [INFO][4691] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" 
HandleID="k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Workload="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.633 [INFO][4691] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" HandleID="k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Workload="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005e6830), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-23cf6448d4", "pod":"coredns-674b8bbfcf-4hngh", "timestamp":"2026-03-13 00:37:13.613096389 +0000 UTC"}, Hostname:"ci-4459-2-4-n-23cf6448d4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000200dc0)} Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.633 [INFO][4691] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.633 [INFO][4691] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.633 [INFO][4691] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-23cf6448d4' Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.637 [INFO][4691] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.645 [INFO][4691] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.655 [INFO][4691] ipam/ipam.go 526: Trying affinity for 192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.658 [INFO][4691] ipam/ipam.go 160: Attempting to load block cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.721709 containerd[1639]: 2026-03-13 00:37:13.662 [INFO][4691] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.722128 containerd[1639]: 2026-03-13 00:37:13.662 [INFO][4691] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.107.0/26 handle="k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.722128 containerd[1639]: 2026-03-13 00:37:13.664 [INFO][4691] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d Mar 13 00:37:13.722128 containerd[1639]: 2026-03-13 00:37:13.671 [INFO][4691] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.107.0/26 handle="k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.722128 containerd[1639]: 2026-03-13 00:37:13.679 [INFO][4691] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.107.5/26] block=192.168.107.0/26 handle="k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.722128 containerd[1639]: 2026-03-13 00:37:13.679 [INFO][4691] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.107.5/26] handle="k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.722128 containerd[1639]: 2026-03-13 00:37:13.680 [INFO][4691] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:37:13.722128 containerd[1639]: 2026-03-13 00:37:13.680 [INFO][4691] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.107.5/26] IPv6=[] ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" HandleID="k8s-pod-network.a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Workload="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" Mar 13 00:37:13.722370 containerd[1639]: 2026-03-13 00:37:13.686 [INFO][4653] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Namespace="kube-system" Pod="coredns-674b8bbfcf-4hngh" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"10c18fc6-35a9-4a29-b0ed-122436522760", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"", Pod:"coredns-674b8bbfcf-4hngh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.107.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib938b0e46ed", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:13.722370 containerd[1639]: 2026-03-13 00:37:13.686 [INFO][4653] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.5/32] ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Namespace="kube-system" Pod="coredns-674b8bbfcf-4hngh" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" Mar 13 00:37:13.722370 containerd[1639]: 2026-03-13 00:37:13.686 [INFO][4653] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib938b0e46ed ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Namespace="kube-system" Pod="coredns-674b8bbfcf-4hngh" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" Mar 13 00:37:13.722370 containerd[1639]: 2026-03-13 00:37:13.695 [INFO][4653] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Namespace="kube-system" Pod="coredns-674b8bbfcf-4hngh" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" Mar 13 00:37:13.722370 containerd[1639]: 2026-03-13 00:37:13.698 [INFO][4653] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Namespace="kube-system" Pod="coredns-674b8bbfcf-4hngh" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"10c18fc6-35a9-4a29-b0ed-122436522760", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d", Pod:"coredns-674b8bbfcf-4hngh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.107.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib938b0e46ed", 
MAC:"f6:d1:46:b8:a0:2f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:13.722370 containerd[1639]: 2026-03-13 00:37:13.712 [INFO][4653] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" Namespace="kube-system" Pod="coredns-674b8bbfcf-4hngh" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--4hngh-eth0" Mar 13 00:37:13.787918 containerd[1639]: time="2026-03-13T00:37:13.787867488Z" level=info msg="connecting to shim a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d" address="unix:///run/containerd/s/e91c3b07eef601e2163b7aa187977cfc9d3216505e2f9e43135b96fea3f8c676" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:13.822175 systemd-networkd[1523]: cali12d64617959: Link UP Mar 13 00:37:13.822886 systemd-networkd[1523]: cali12d64617959: Gained carrier Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.521 [INFO][4671] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0 coredns-674b8bbfcf- kube-system e3354a6d-6ea0-4ffe-b3e8-3e508fd79239 837 0 2026-03-13 00:36:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-23cf6448d4 coredns-674b8bbfcf-mqhpg eth0 coredns [] 
[] [kns.kube-system ksa.kube-system.coredns] cali12d64617959 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Namespace="kube-system" Pod="coredns-674b8bbfcf-mqhpg" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.521 [INFO][4671] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Namespace="kube-system" Pod="coredns-674b8bbfcf-mqhpg" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.610 [INFO][4693] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" HandleID="k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Workload="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.633 [INFO][4693] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" HandleID="k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Workload="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003039f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-23cf6448d4", "pod":"coredns-674b8bbfcf-mqhpg", "timestamp":"2026-03-13 00:37:13.610277061 +0000 UTC"}, Hostname:"ci-4459-2-4-n-23cf6448d4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003a2420)} Mar 
13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.634 [INFO][4693] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.680 [INFO][4693] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.681 [INFO][4693] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-23cf6448d4' Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.740 [INFO][4693] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.754 [INFO][4693] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.776 [INFO][4693] ipam/ipam.go 526: Trying affinity for 192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.780 [INFO][4693] ipam/ipam.go 160: Attempting to load block cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.785 [INFO][4693] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.785 [INFO][4693] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.107.0/26 handle="k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.789 [INFO][4693] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73 Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.795 [INFO][4693] ipam/ipam.go 1272: Writing block in 
order to claim IPs block=192.168.107.0/26 handle="k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.810 [INFO][4693] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.107.6/26] block=192.168.107.0/26 handle="k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.810 [INFO][4693] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.107.6/26] handle="k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.810 [INFO][4693] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:37:13.863540 containerd[1639]: 2026-03-13 00:37:13.811 [INFO][4693] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.107.6/26] IPv6=[] ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" HandleID="k8s-pod-network.e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Workload="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" Mar 13 00:37:13.864493 containerd[1639]: 2026-03-13 00:37:13.817 [INFO][4671] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Namespace="kube-system" Pod="coredns-674b8bbfcf-mqhpg" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e3354a6d-6ea0-4ffe-b3e8-3e508fd79239", ResourceVersion:"837", Generation:0, 
CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"", Pod:"coredns-674b8bbfcf-mqhpg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.107.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali12d64617959", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:13.864493 containerd[1639]: 2026-03-13 00:37:13.817 [INFO][4671] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.6/32] ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Namespace="kube-system" Pod="coredns-674b8bbfcf-mqhpg" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" Mar 13 00:37:13.864493 containerd[1639]: 2026-03-13 00:37:13.817 [INFO][4671] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali12d64617959 
ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Namespace="kube-system" Pod="coredns-674b8bbfcf-mqhpg" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" Mar 13 00:37:13.864493 containerd[1639]: 2026-03-13 00:37:13.827 [INFO][4671] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Namespace="kube-system" Pod="coredns-674b8bbfcf-mqhpg" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" Mar 13 00:37:13.864493 containerd[1639]: 2026-03-13 00:37:13.829 [INFO][4671] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Namespace="kube-system" Pod="coredns-674b8bbfcf-mqhpg" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e3354a6d-6ea0-4ffe-b3e8-3e508fd79239", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", 
ContainerID:"e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73", Pod:"coredns-674b8bbfcf-mqhpg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.107.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali12d64617959", MAC:"de:f9:4e:34:0d:a5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:13.864493 containerd[1639]: 2026-03-13 00:37:13.850 [INFO][4671] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" Namespace="kube-system" Pod="coredns-674b8bbfcf-mqhpg" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-coredns--674b8bbfcf--mqhpg-eth0" Mar 13 00:37:13.863755 systemd[1]: Started cri-containerd-a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d.scope - libcontainer container a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d. 
Mar 13 00:37:13.917734 containerd[1639]: time="2026-03-13T00:37:13.917435548Z" level=info msg="connecting to shim e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73" address="unix:///run/containerd/s/a9bd38c9718199297e52c088b7e54a2586a7c2c4abf0561a5b9252b704698536" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:13.956687 systemd-networkd[1523]: cali60cd579182f: Link UP Mar 13 00:37:13.958632 systemd-networkd[1523]: cali60cd579182f: Gained carrier Mar 13 00:37:13.985736 systemd[1]: Started cri-containerd-e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73.scope - libcontainer container e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73. Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.531 [INFO][4647] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0 csi-node-driver- calico-system 129d677b-52bb-4b01-a36f-4d698e8848e6 688 0 2026-03-13 00:36:39 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-n-23cf6448d4 csi-node-driver-7z8mv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali60cd579182f [] [] }} ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Namespace="calico-system" Pod="csi-node-driver-7z8mv" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.532 [INFO][4647] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Namespace="calico-system" Pod="csi-node-driver-7z8mv" 
WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.616 [INFO][4702] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" HandleID="k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Workload="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.639 [INFO][4702] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" HandleID="k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Workload="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fbed0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-23cf6448d4", "pod":"csi-node-driver-7z8mv", "timestamp":"2026-03-13 00:37:13.616298698 +0000 UTC"}, Hostname:"ci-4459-2-4-n-23cf6448d4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000248dc0)} Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.639 [INFO][4702] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.811 [INFO][4702] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.811 [INFO][4702] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-23cf6448d4' Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.840 [INFO][4702] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.864 [INFO][4702] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.881 [INFO][4702] ipam/ipam.go 526: Trying affinity for 192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.885 [INFO][4702] ipam/ipam.go 160: Attempting to load block cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.889 [INFO][4702] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.889 [INFO][4702] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.107.0/26 handle="k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.892 [INFO][4702] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.900 [INFO][4702] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.107.0/26 handle="k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.916 [INFO][4702] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.107.7/26] block=192.168.107.0/26 handle="k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.916 [INFO][4702] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.107.7/26] handle="k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.916 [INFO][4702] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:37:13.996481 containerd[1639]: 2026-03-13 00:37:13.916 [INFO][4702] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.107.7/26] IPv6=[] ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" HandleID="k8s-pod-network.72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Workload="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" Mar 13 00:37:13.997080 containerd[1639]: 2026-03-13 00:37:13.929 [INFO][4647] cni-plugin/k8s.go 418: Populated endpoint ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Namespace="calico-system" Pod="csi-node-driver-7z8mv" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"129d677b-52bb-4b01-a36f-4d698e8848e6", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"", Pod:"csi-node-driver-7z8mv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.107.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali60cd579182f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:13.997080 containerd[1639]: 2026-03-13 00:37:13.929 [INFO][4647] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.7/32] ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Namespace="calico-system" Pod="csi-node-driver-7z8mv" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" Mar 13 00:37:13.997080 containerd[1639]: 2026-03-13 00:37:13.929 [INFO][4647] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60cd579182f ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Namespace="calico-system" Pod="csi-node-driver-7z8mv" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" Mar 13 00:37:13.997080 containerd[1639]: 2026-03-13 00:37:13.960 [INFO][4647] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Namespace="calico-system" Pod="csi-node-driver-7z8mv" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" Mar 13 00:37:13.997080 
containerd[1639]: 2026-03-13 00:37:13.961 [INFO][4647] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Namespace="calico-system" Pod="csi-node-driver-7z8mv" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"129d677b-52bb-4b01-a36f-4d698e8848e6", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a", Pod:"csi-node-driver-7z8mv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.107.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali60cd579182f", MAC:"4e:d8:5e:94:8e:da", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:13.997080 containerd[1639]: 
2026-03-13 00:37:13.985 [INFO][4647] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" Namespace="calico-system" Pod="csi-node-driver-7z8mv" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-csi--node--driver--7z8mv-eth0" Mar 13 00:37:14.015259 containerd[1639]: time="2026-03-13T00:37:14.015178315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4hngh,Uid:10c18fc6-35a9-4a29-b0ed-122436522760,Namespace:kube-system,Attempt:0,} returns sandbox id \"a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d\"" Mar 13 00:37:14.028451 containerd[1639]: time="2026-03-13T00:37:14.028378053Z" level=info msg="CreateContainer within sandbox \"a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 13 00:37:14.057125 containerd[1639]: time="2026-03-13T00:37:14.057096182Z" level=info msg="Container 220dcec8abbd03ab7a7e451bc29a677c44af321661736ef0b6eb6b888454d062: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:14.071480 systemd-networkd[1523]: caliddb1f97bf46: Link UP Mar 13 00:37:14.072968 systemd-networkd[1523]: caliddb1f97bf46: Gained carrier Mar 13 00:37:14.081391 containerd[1639]: time="2026-03-13T00:37:14.081286531Z" level=info msg="connecting to shim 72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a" address="unix:///run/containerd/s/c976dd3f37c2629485dada70608fbe0dd2767ca1965f03b90117c99169f340c3" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:14.091777 containerd[1639]: time="2026-03-13T00:37:14.091733168Z" level=info msg="CreateContainer within sandbox \"a7665a6b217d098e693b74d7eca357aa5735513fc93f8e79f6287ef3f101069d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"220dcec8abbd03ab7a7e451bc29a677c44af321661736ef0b6eb6b888454d062\"" Mar 13 00:37:14.094489 containerd[1639]: time="2026-03-13T00:37:14.094397101Z" 
level=info msg="StartContainer for \"220dcec8abbd03ab7a7e451bc29a677c44af321661736ef0b6eb6b888454d062\"" Mar 13 00:37:14.097673 containerd[1639]: time="2026-03-13T00:37:14.097648973Z" level=info msg="connecting to shim 220dcec8abbd03ab7a7e451bc29a677c44af321661736ef0b6eb6b888454d062" address="unix:///run/containerd/s/e91c3b07eef601e2163b7aa187977cfc9d3216505e2f9e43135b96fea3f8c676" protocol=ttrpc version=3 Mar 13 00:37:14.117072 containerd[1639]: time="2026-03-13T00:37:14.117033902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mqhpg,Uid:e3354a6d-6ea0-4ffe-b3e8-3e508fd79239,Namespace:kube-system,Attempt:0,} returns sandbox id \"e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73\"" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.549 [INFO][4642] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0 calico-apiserver-658d87f884- calico-system ba1a71bc-658c-494d-92f9-f4db1f8ea894 840 0 2026-03-13 00:36:39 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:658d87f884 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-23cf6448d4 calico-apiserver-658d87f884-8zfwm eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] caliddb1f97bf46 [] [] }} ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Namespace="calico-system" Pod="calico-apiserver-658d87f884-8zfwm" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.549 [INFO][4642] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" 
Namespace="calico-system" Pod="calico-apiserver-658d87f884-8zfwm" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.639 [INFO][4707] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" HandleID="k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.651 [INFO][4707] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" HandleID="k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000287860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-23cf6448d4", "pod":"calico-apiserver-658d87f884-8zfwm", "timestamp":"2026-03-13 00:37:13.639391386 +0000 UTC"}, Hostname:"ci-4459-2-4-n-23cf6448d4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003c1a20)} Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.651 [INFO][4707] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.917 [INFO][4707] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.917 [INFO][4707] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-23cf6448d4' Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.943 [INFO][4707] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.974 [INFO][4707] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:13.999 [INFO][4707] ipam/ipam.go 526: Trying affinity for 192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.005 [INFO][4707] ipam/ipam.go 160: Attempting to load block cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.011 [INFO][4707] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.107.0/26 host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.012 [INFO][4707] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.107.0/26 handle="k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.019 [INFO][4707] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.029 [INFO][4707] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.107.0/26 handle="k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.039 [INFO][4707] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.107.8/26] block=192.168.107.0/26 handle="k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.039 [INFO][4707] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.107.8/26] handle="k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" host="ci-4459-2-4-n-23cf6448d4" Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.039 [INFO][4707] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:37:14.119083 containerd[1639]: 2026-03-13 00:37:14.039 [INFO][4707] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.107.8/26] IPv6=[] ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" HandleID="k8s-pod-network.5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Workload="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" Mar 13 00:37:14.120483 containerd[1639]: 2026-03-13 00:37:14.050 [INFO][4642] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Namespace="calico-system" Pod="calico-apiserver-658d87f884-8zfwm" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0", GenerateName:"calico-apiserver-658d87f884-", Namespace:"calico-system", SelfLink:"", UID:"ba1a71bc-658c-494d-92f9-f4db1f8ea894", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"658d87f884", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"", Pod:"calico-apiserver-658d87f884-8zfwm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.107.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliddb1f97bf46", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:14.120483 containerd[1639]: 2026-03-13 00:37:14.051 [INFO][4642] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.8/32] ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Namespace="calico-system" Pod="calico-apiserver-658d87f884-8zfwm" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" Mar 13 00:37:14.120483 containerd[1639]: 2026-03-13 00:37:14.051 [INFO][4642] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliddb1f97bf46 ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Namespace="calico-system" Pod="calico-apiserver-658d87f884-8zfwm" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" Mar 13 00:37:14.120483 containerd[1639]: 2026-03-13 00:37:14.073 [INFO][4642] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Namespace="calico-system" Pod="calico-apiserver-658d87f884-8zfwm" 
WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" Mar 13 00:37:14.120483 containerd[1639]: 2026-03-13 00:37:14.074 [INFO][4642] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Namespace="calico-system" Pod="calico-apiserver-658d87f884-8zfwm" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0", GenerateName:"calico-apiserver-658d87f884-", Namespace:"calico-system", SelfLink:"", UID:"ba1a71bc-658c-494d-92f9-f4db1f8ea894", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 36, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658d87f884", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-23cf6448d4", ContainerID:"5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f", Pod:"calico-apiserver-658d87f884-8zfwm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.107.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliddb1f97bf46", MAC:"ba:5f:b3:0d:4e:c2", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:37:14.120483 containerd[1639]: 2026-03-13 00:37:14.106 [INFO][4642] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" Namespace="calico-system" Pod="calico-apiserver-658d87f884-8zfwm" WorkloadEndpoint="ci--4459--2--4--n--23cf6448d4-k8s-calico--apiserver--658d87f884--8zfwm-eth0" Mar 13 00:37:14.134157 containerd[1639]: time="2026-03-13T00:37:14.134120231Z" level=info msg="CreateContainer within sandbox \"e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 13 00:37:14.149895 systemd[1]: Started cri-containerd-72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a.scope - libcontainer container 72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a. Mar 13 00:37:14.174769 systemd[1]: Started cri-containerd-220dcec8abbd03ab7a7e451bc29a677c44af321661736ef0b6eb6b888454d062.scope - libcontainer container 220dcec8abbd03ab7a7e451bc29a677c44af321661736ef0b6eb6b888454d062. 
Mar 13 00:37:14.177348 containerd[1639]: time="2026-03-13T00:37:14.176796880Z" level=info msg="Container e0528edf30fe0651be926d76cf52a69d744cf2b9c7976905a35406f897dd9f7a: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:14.191188 containerd[1639]: time="2026-03-13T00:37:14.191164665Z" level=info msg="CreateContainer within sandbox \"e8b6364acca7e7510afd23948dfa83868ad19bcc6ba52086ad85a78a8be4dc73\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e0528edf30fe0651be926d76cf52a69d744cf2b9c7976905a35406f897dd9f7a\"" Mar 13 00:37:14.192190 containerd[1639]: time="2026-03-13T00:37:14.192172788Z" level=info msg="StartContainer for \"e0528edf30fe0651be926d76cf52a69d744cf2b9c7976905a35406f897dd9f7a\"" Mar 13 00:37:14.195420 containerd[1639]: time="2026-03-13T00:37:14.195399266Z" level=info msg="connecting to shim e0528edf30fe0651be926d76cf52a69d744cf2b9c7976905a35406f897dd9f7a" address="unix:///run/containerd/s/a9bd38c9718199297e52c088b7e54a2586a7c2c4abf0561a5b9252b704698536" protocol=ttrpc version=3 Mar 13 00:37:14.195944 containerd[1639]: time="2026-03-13T00:37:14.192194159Z" level=info msg="connecting to shim 5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f" address="unix:///run/containerd/s/7f622903e8445fa647e374f4b4c1e8b65d5e8d9d93fb17a5c9f25acd57d88b0c" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:37:14.233712 systemd-networkd[1523]: cali04a94f46bcc: Gained IPv6LL Mar 13 00:37:14.254834 containerd[1639]: time="2026-03-13T00:37:14.254804446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7z8mv,Uid:129d677b-52bb-4b01-a36f-4d698e8848e6,Namespace:calico-system,Attempt:0,} returns sandbox id \"72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a\"" Mar 13 00:37:14.265760 systemd[1]: Started cri-containerd-5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f.scope - libcontainer container 5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f. 
Mar 13 00:37:14.290045 systemd[1]: Started cri-containerd-e0528edf30fe0651be926d76cf52a69d744cf2b9c7976905a35406f897dd9f7a.scope - libcontainer container e0528edf30fe0651be926d76cf52a69d744cf2b9c7976905a35406f897dd9f7a. Mar 13 00:37:14.296048 containerd[1639]: time="2026-03-13T00:37:14.295954156Z" level=info msg="StartContainer for \"220dcec8abbd03ab7a7e451bc29a677c44af321661736ef0b6eb6b888454d062\" returns successfully" Mar 13 00:37:14.345256 containerd[1639]: time="2026-03-13T00:37:14.345223044Z" level=info msg="StartContainer for \"e0528edf30fe0651be926d76cf52a69d744cf2b9c7976905a35406f897dd9f7a\" returns successfully" Mar 13 00:37:14.380120 containerd[1639]: time="2026-03-13T00:37:14.380091728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658d87f884-8zfwm,Uid:ba1a71bc-658c-494d-92f9-f4db1f8ea894,Namespace:calico-system,Attempt:0,} returns sandbox id \"5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f\"" Mar 13 00:37:14.638380 containerd[1639]: time="2026-03-13T00:37:14.638090174Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:14.639461 containerd[1639]: time="2026-03-13T00:37:14.639417590Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 13 00:37:14.641495 containerd[1639]: time="2026-03-13T00:37:14.641451137Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:14.644334 containerd[1639]: time="2026-03-13T00:37:14.643931824Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:14.644903 containerd[1639]: 
time="2026-03-13T00:37:14.644797949Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 2.929095637s" Mar 13 00:37:14.644996 containerd[1639]: time="2026-03-13T00:37:14.644984140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 13 00:37:14.649589 containerd[1639]: time="2026-03-13T00:37:14.649551189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 13 00:37:14.653707 containerd[1639]: time="2026-03-13T00:37:14.653608506Z" level=info msg="CreateContainer within sandbox \"a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 13 00:37:14.666861 containerd[1639]: time="2026-03-13T00:37:14.666829556Z" level=info msg="Container 6af3ab6ace7823b25b2945ba975140ae9ad3d3a1e14402ba48199285023b69da: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:14.671411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2882815399.mount: Deactivated successfully. 
Mar 13 00:37:14.678537 containerd[1639]: time="2026-03-13T00:37:14.678433541Z" level=info msg="CreateContainer within sandbox \"a838c45eb908bd24eac3da7588bf28718b3c1cc8252c3ce2eb54b260e4754eae\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6af3ab6ace7823b25b2945ba975140ae9ad3d3a1e14402ba48199285023b69da\"" Mar 13 00:37:14.679502 containerd[1639]: time="2026-03-13T00:37:14.679459242Z" level=info msg="StartContainer for \"6af3ab6ace7823b25b2945ba975140ae9ad3d3a1e14402ba48199285023b69da\"" Mar 13 00:37:14.681064 containerd[1639]: time="2026-03-13T00:37:14.681021573Z" level=info msg="connecting to shim 6af3ab6ace7823b25b2945ba975140ae9ad3d3a1e14402ba48199285023b69da" address="unix:///run/containerd/s/e4182be685784ca579fa91edaf4be422d364c135006556805edc021ab57271e3" protocol=ttrpc version=3 Mar 13 00:37:14.705741 systemd[1]: Started cri-containerd-6af3ab6ace7823b25b2945ba975140ae9ad3d3a1e14402ba48199285023b69da.scope - libcontainer container 6af3ab6ace7823b25b2945ba975140ae9ad3d3a1e14402ba48199285023b69da. 
Mar 13 00:37:14.747283 systemd-networkd[1523]: calib938b0e46ed: Gained IPv6LL Mar 13 00:37:14.778030 containerd[1639]: time="2026-03-13T00:37:14.777992213Z" level=info msg="StartContainer for \"6af3ab6ace7823b25b2945ba975140ae9ad3d3a1e14402ba48199285023b69da\" returns successfully" Mar 13 00:37:15.026186 kubelet[2856]: I0313 00:37:15.025923 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-4hngh" podStartSLOduration=46.025906296 podStartE2EDuration="46.025906296s" podCreationTimestamp="2026-03-13 00:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:37:15.025722544 +0000 UTC m=+52.904829422" watchObservedRunningTime="2026-03-13 00:37:15.025906296 +0000 UTC m=+52.905013166" Mar 13 00:37:15.059284 kubelet[2856]: I0313 00:37:15.058982 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-658d87f884-kc54z" podStartSLOduration=33.125500276 podStartE2EDuration="36.058964949s" podCreationTimestamp="2026-03-13 00:36:39 +0000 UTC" firstStartedPulling="2026-03-13 00:37:11.713820844 +0000 UTC m=+49.592927699" lastFinishedPulling="2026-03-13 00:37:14.647285517 +0000 UTC m=+52.526392372" observedRunningTime="2026-03-13 00:37:15.04750092 +0000 UTC m=+52.926607797" watchObservedRunningTime="2026-03-13 00:37:15.058964949 +0000 UTC m=+52.938071826" Mar 13 00:37:15.070614 kubelet[2856]: I0313 00:37:15.070258 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-mqhpg" podStartSLOduration=46.070240712 podStartE2EDuration="46.070240712s" podCreationTimestamp="2026-03-13 00:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:37:15.059726587 +0000 UTC m=+52.938833464" watchObservedRunningTime="2026-03-13 
00:37:15.070240712 +0000 UTC m=+52.949347580" Mar 13 00:37:15.705821 systemd-networkd[1523]: cali60cd579182f: Gained IPv6LL Mar 13 00:37:15.898641 systemd-networkd[1523]: cali12d64617959: Gained IPv6LL Mar 13 00:37:16.017623 kubelet[2856]: I0313 00:37:16.017529 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:37:16.153694 systemd-networkd[1523]: caliddb1f97bf46: Gained IPv6LL Mar 13 00:37:16.820076 containerd[1639]: time="2026-03-13T00:37:16.820041717Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:16.821242 containerd[1639]: time="2026-03-13T00:37:16.821225768Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 13 00:37:16.822750 containerd[1639]: time="2026-03-13T00:37:16.822735936Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:16.825133 containerd[1639]: time="2026-03-13T00:37:16.825104430Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:16.825717 containerd[1639]: time="2026-03-13T00:37:16.825699309Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 2.175730146s" Mar 13 00:37:16.825788 containerd[1639]: time="2026-03-13T00:37:16.825778795Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 13 00:37:16.826669 containerd[1639]: time="2026-03-13T00:37:16.826652769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 13 00:37:16.842390 containerd[1639]: time="2026-03-13T00:37:16.842329774Z" level=info msg="CreateContainer within sandbox \"ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 13 00:37:16.854838 containerd[1639]: time="2026-03-13T00:37:16.854798384Z" level=info msg="Container 8c3592f817155e85966d9a17fbdf8736c1e78490df27dcdb41cb3e2fd296a292: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:16.859235 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount811767705.mount: Deactivated successfully. Mar 13 00:37:16.865379 containerd[1639]: time="2026-03-13T00:37:16.865335268Z" level=info msg="CreateContainer within sandbox \"ad09a7723ff1ccee685f0b1b2ca1235a59d4f1ae8ea5e5cec6d0c37bd20ecc6f\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"8c3592f817155e85966d9a17fbdf8736c1e78490df27dcdb41cb3e2fd296a292\"" Mar 13 00:37:16.865969 containerd[1639]: time="2026-03-13T00:37:16.865933078Z" level=info msg="StartContainer for \"8c3592f817155e85966d9a17fbdf8736c1e78490df27dcdb41cb3e2fd296a292\"" Mar 13 00:37:16.867974 containerd[1639]: time="2026-03-13T00:37:16.867953768Z" level=info msg="connecting to shim 8c3592f817155e85966d9a17fbdf8736c1e78490df27dcdb41cb3e2fd296a292" address="unix:///run/containerd/s/f423147fc82d6101ba6b63b0e52a2b2268889b755480e295df3bd27cd0a18147" protocol=ttrpc version=3 Mar 13 00:37:16.887695 systemd[1]: Started cri-containerd-8c3592f817155e85966d9a17fbdf8736c1e78490df27dcdb41cb3e2fd296a292.scope - libcontainer container 8c3592f817155e85966d9a17fbdf8736c1e78490df27dcdb41cb3e2fd296a292. 
Mar 13 00:37:16.934209 containerd[1639]: time="2026-03-13T00:37:16.934182628Z" level=info msg="StartContainer for \"8c3592f817155e85966d9a17fbdf8736c1e78490df27dcdb41cb3e2fd296a292\" returns successfully" Mar 13 00:37:17.080263 kubelet[2856]: I0313 00:37:17.080115 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-697dc6db9-lsdmd" podStartSLOduration=33.053148965 podStartE2EDuration="38.080090059s" podCreationTimestamp="2026-03-13 00:36:39 +0000 UTC" firstStartedPulling="2026-03-13 00:37:11.799602067 +0000 UTC m=+49.678708922" lastFinishedPulling="2026-03-13 00:37:16.826543161 +0000 UTC m=+54.705650016" observedRunningTime="2026-03-13 00:37:17.032688561 +0000 UTC m=+54.911795438" watchObservedRunningTime="2026-03-13 00:37:17.080090059 +0000 UTC m=+54.959196929" Mar 13 00:37:19.152621 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3594211528.mount: Deactivated successfully. Mar 13 00:37:19.552446 containerd[1639]: time="2026-03-13T00:37:19.552266321Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:19.553577 containerd[1639]: time="2026-03-13T00:37:19.553552570Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 13 00:37:19.554939 containerd[1639]: time="2026-03-13T00:37:19.554922582Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:19.558003 containerd[1639]: time="2026-03-13T00:37:19.557982917Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:19.558543 containerd[1639]: time="2026-03-13T00:37:19.558527175Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 2.731852205s" Mar 13 00:37:19.558585 containerd[1639]: time="2026-03-13T00:37:19.558549397Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 13 00:37:19.560146 containerd[1639]: time="2026-03-13T00:37:19.559606185Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 13 00:37:19.562583 containerd[1639]: time="2026-03-13T00:37:19.562505373Z" level=info msg="CreateContainer within sandbox \"f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 13 00:37:19.574584 containerd[1639]: time="2026-03-13T00:37:19.574551787Z" level=info msg="Container 156fa5da225a61c389a43eb7f335dcf8a61d8d43422650432c4e7f0958a35fcd: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:19.580025 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2437006941.mount: Deactivated successfully. 
Mar 13 00:37:19.589231 containerd[1639]: time="2026-03-13T00:37:19.589136418Z" level=info msg="CreateContainer within sandbox \"f743e303c9c3d3f3be3d8dd4c3eed4c6389d17d29ba67133b97104a5e3a27bb4\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"156fa5da225a61c389a43eb7f335dcf8a61d8d43422650432c4e7f0958a35fcd\"" Mar 13 00:37:19.589687 containerd[1639]: time="2026-03-13T00:37:19.589644630Z" level=info msg="StartContainer for \"156fa5da225a61c389a43eb7f335dcf8a61d8d43422650432c4e7f0958a35fcd\"" Mar 13 00:37:19.592211 containerd[1639]: time="2026-03-13T00:37:19.592110764Z" level=info msg="connecting to shim 156fa5da225a61c389a43eb7f335dcf8a61d8d43422650432c4e7f0958a35fcd" address="unix:///run/containerd/s/1d49f03140a96a697b8f609a84c0aff1e9fa34bd232c360c8e154ddfd873f2de" protocol=ttrpc version=3 Mar 13 00:37:19.616704 systemd[1]: Started cri-containerd-156fa5da225a61c389a43eb7f335dcf8a61d8d43422650432c4e7f0958a35fcd.scope - libcontainer container 156fa5da225a61c389a43eb7f335dcf8a61d8d43422650432c4e7f0958a35fcd. 
Mar 13 00:37:19.662891 containerd[1639]: time="2026-03-13T00:37:19.662782736Z" level=info msg="StartContainer for \"156fa5da225a61c389a43eb7f335dcf8a61d8d43422650432c4e7f0958a35fcd\" returns successfully" Mar 13 00:37:20.060632 kubelet[2856]: I0313 00:37:20.059274 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-w6hs4" podStartSLOduration=34.242149284999996 podStartE2EDuration="41.059248083s" podCreationTimestamp="2026-03-13 00:36:39 +0000 UTC" firstStartedPulling="2026-03-13 00:37:12.742234688 +0000 UTC m=+50.621341543" lastFinishedPulling="2026-03-13 00:37:19.559333486 +0000 UTC m=+57.438440341" observedRunningTime="2026-03-13 00:37:20.058910671 +0000 UTC m=+57.938017580" watchObservedRunningTime="2026-03-13 00:37:20.059248083 +0000 UTC m=+57.938355025" Mar 13 00:37:20.101526 kubelet[2856]: I0313 00:37:20.101490 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:37:21.018711 containerd[1639]: time="2026-03-13T00:37:21.018285017Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:21.020034 containerd[1639]: time="2026-03-13T00:37:21.020000000Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 13 00:37:21.021424 containerd[1639]: time="2026-03-13T00:37:21.021371182Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:21.024122 containerd[1639]: time="2026-03-13T00:37:21.023878873Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:21.024528 containerd[1639]: time="2026-03-13T00:37:21.024482755Z" level=info 
msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.464239823s" Mar 13 00:37:21.024528 containerd[1639]: time="2026-03-13T00:37:21.024509050Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 13 00:37:21.025688 containerd[1639]: time="2026-03-13T00:37:21.025671751Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 13 00:37:21.029319 containerd[1639]: time="2026-03-13T00:37:21.029285784Z" level=info msg="CreateContainer within sandbox \"72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 13 00:37:21.040598 containerd[1639]: time="2026-03-13T00:37:21.040083481Z" level=info msg="Container 5764d0802b609abd2254da83de3bada1fa06b250f196273035a422443a5a9b41: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:21.061346 containerd[1639]: time="2026-03-13T00:37:21.061312947Z" level=info msg="CreateContainer within sandbox \"72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5764d0802b609abd2254da83de3bada1fa06b250f196273035a422443a5a9b41\"" Mar 13 00:37:21.061932 containerd[1639]: time="2026-03-13T00:37:21.061655332Z" level=info msg="StartContainer for \"5764d0802b609abd2254da83de3bada1fa06b250f196273035a422443a5a9b41\"" Mar 13 00:37:21.063929 containerd[1639]: time="2026-03-13T00:37:21.063902941Z" level=info msg="connecting to shim 5764d0802b609abd2254da83de3bada1fa06b250f196273035a422443a5a9b41" 
address="unix:///run/containerd/s/c976dd3f37c2629485dada70608fbe0dd2767ca1965f03b90117c99169f340c3" protocol=ttrpc version=3 Mar 13 00:37:21.090705 systemd[1]: Started cri-containerd-5764d0802b609abd2254da83de3bada1fa06b250f196273035a422443a5a9b41.scope - libcontainer container 5764d0802b609abd2254da83de3bada1fa06b250f196273035a422443a5a9b41. Mar 13 00:37:21.144973 containerd[1639]: time="2026-03-13T00:37:21.144844043Z" level=info msg="StartContainer for \"5764d0802b609abd2254da83de3bada1fa06b250f196273035a422443a5a9b41\" returns successfully" Mar 13 00:37:21.643639 containerd[1639]: time="2026-03-13T00:37:21.642952955Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:21.643639 containerd[1639]: time="2026-03-13T00:37:21.643069996Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 13 00:37:21.648499 containerd[1639]: time="2026-03-13T00:37:21.648426679Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 622.656419ms" Mar 13 00:37:21.648767 containerd[1639]: time="2026-03-13T00:37:21.648501716Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 13 00:37:21.653991 containerd[1639]: time="2026-03-13T00:37:21.653847908Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 13 00:37:21.659613 containerd[1639]: time="2026-03-13T00:37:21.659517649Z" level=info msg="CreateContainer within sandbox 
\"5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 13 00:37:21.679802 containerd[1639]: time="2026-03-13T00:37:21.679699737Z" level=info msg="Container 57f27038be7bfeb99da830c89049de2474761007a04293421153e79a6fc25227: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:21.702960 containerd[1639]: time="2026-03-13T00:37:21.702905341Z" level=info msg="CreateContainer within sandbox \"5fe61faf91e75522bdcd8ef7d0466847da8276781c5dbd9ab1feaa1ef195ef6f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"57f27038be7bfeb99da830c89049de2474761007a04293421153e79a6fc25227\"" Mar 13 00:37:21.704494 containerd[1639]: time="2026-03-13T00:37:21.704454655Z" level=info msg="StartContainer for \"57f27038be7bfeb99da830c89049de2474761007a04293421153e79a6fc25227\"" Mar 13 00:37:21.707486 containerd[1639]: time="2026-03-13T00:37:21.707438189Z" level=info msg="connecting to shim 57f27038be7bfeb99da830c89049de2474761007a04293421153e79a6fc25227" address="unix:///run/containerd/s/7f622903e8445fa647e374f4b4c1e8b65d5e8d9d93fb17a5c9f25acd57d88b0c" protocol=ttrpc version=3 Mar 13 00:37:21.744841 systemd[1]: Started cri-containerd-57f27038be7bfeb99da830c89049de2474761007a04293421153e79a6fc25227.scope - libcontainer container 57f27038be7bfeb99da830c89049de2474761007a04293421153e79a6fc25227. 
Mar 13 00:37:21.815730 containerd[1639]: time="2026-03-13T00:37:21.815666081Z" level=info msg="StartContainer for \"57f27038be7bfeb99da830c89049de2474761007a04293421153e79a6fc25227\" returns successfully" Mar 13 00:37:22.085358 kubelet[2856]: I0313 00:37:22.085154 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-658d87f884-8zfwm" podStartSLOduration=35.817213649 podStartE2EDuration="43.085130745s" podCreationTimestamp="2026-03-13 00:36:39 +0000 UTC" firstStartedPulling="2026-03-13 00:37:14.382006779 +0000 UTC m=+52.261113634" lastFinishedPulling="2026-03-13 00:37:21.64992381 +0000 UTC m=+59.529030730" observedRunningTime="2026-03-13 00:37:22.083654752 +0000 UTC m=+59.962761655" watchObservedRunningTime="2026-03-13 00:37:22.085130745 +0000 UTC m=+59.964237638" Mar 13 00:37:23.072272 kubelet[2856]: I0313 00:37:23.072246 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:37:23.152458 containerd[1639]: time="2026-03-13T00:37:23.152413756Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:23.153582 containerd[1639]: time="2026-03-13T00:37:23.153531406Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 13 00:37:23.155394 containerd[1639]: time="2026-03-13T00:37:23.155269678Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:23.157611 containerd[1639]: time="2026-03-13T00:37:23.157577122Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:37:23.158130 
containerd[1639]: time="2026-03-13T00:37:23.157992159Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 1.504085889s" Mar 13 00:37:23.158130 containerd[1639]: time="2026-03-13T00:37:23.158021376Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 13 00:37:23.162995 containerd[1639]: time="2026-03-13T00:37:23.162972613Z" level=info msg="CreateContainer within sandbox \"72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 13 00:37:23.173211 containerd[1639]: time="2026-03-13T00:37:23.173003788Z" level=info msg="Container c64f26e884d9f5929e050442f4dd45ab8cc8f0e4fa5195704228c42e0b73ebc1: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:37:23.186161 containerd[1639]: time="2026-03-13T00:37:23.186138741Z" level=info msg="CreateContainer within sandbox \"72f65c87e691c1661ed61f307a9e3ebe16198ecb0ad426f4fe57b2ec29011d9a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c64f26e884d9f5929e050442f4dd45ab8cc8f0e4fa5195704228c42e0b73ebc1\"" Mar 13 00:37:23.186842 containerd[1639]: time="2026-03-13T00:37:23.186719898Z" level=info msg="StartContainer for \"c64f26e884d9f5929e050442f4dd45ab8cc8f0e4fa5195704228c42e0b73ebc1\"" Mar 13 00:37:23.187994 containerd[1639]: time="2026-03-13T00:37:23.187961721Z" level=info msg="connecting to shim c64f26e884d9f5929e050442f4dd45ab8cc8f0e4fa5195704228c42e0b73ebc1" 
address="unix:///run/containerd/s/c976dd3f37c2629485dada70608fbe0dd2767ca1965f03b90117c99169f340c3" protocol=ttrpc version=3 Mar 13 00:37:23.210713 systemd[1]: Started cri-containerd-c64f26e884d9f5929e050442f4dd45ab8cc8f0e4fa5195704228c42e0b73ebc1.scope - libcontainer container c64f26e884d9f5929e050442f4dd45ab8cc8f0e4fa5195704228c42e0b73ebc1. Mar 13 00:37:23.271254 containerd[1639]: time="2026-03-13T00:37:23.271225885Z" level=info msg="StartContainer for \"c64f26e884d9f5929e050442f4dd45ab8cc8f0e4fa5195704228c42e0b73ebc1\" returns successfully" Mar 13 00:37:23.545680 kubelet[2856]: I0313 00:37:23.545605 2856 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 13 00:37:23.549119 kubelet[2856]: I0313 00:37:23.548940 2856 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 13 00:37:24.112592 kubelet[2856]: I0313 00:37:24.112083 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-7z8mv" podStartSLOduration=36.216037314 podStartE2EDuration="45.112063837s" podCreationTimestamp="2026-03-13 00:36:39 +0000 UTC" firstStartedPulling="2026-03-13 00:37:14.262827245 +0000 UTC m=+52.141934101" lastFinishedPulling="2026-03-13 00:37:23.158853768 +0000 UTC m=+61.037960624" observedRunningTime="2026-03-13 00:37:24.111791352 +0000 UTC m=+61.990898246" watchObservedRunningTime="2026-03-13 00:37:24.112063837 +0000 UTC m=+61.991170752" Mar 13 00:37:54.393731 kubelet[2856]: I0313 00:37:54.393562 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:39:40.561468 systemd[1]: Started sshd@9-10.0.1.99:22-4.153.228.146:56698.service - OpenSSH per-connection server daemon (4.153.228.146:56698). 
Mar 13 00:39:41.156273 sshd[5918]: Accepted publickey for core from 4.153.228.146 port 56698 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:39:41.159615 sshd-session[5918]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:39:41.164522 systemd-logind[1612]: New session 10 of user core. Mar 13 00:39:41.171735 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 13 00:39:41.520632 sshd[5921]: Connection closed by 4.153.228.146 port 56698 Mar 13 00:39:41.520440 sshd-session[5918]: pam_unix(sshd:session): session closed for user core Mar 13 00:39:41.525639 systemd[1]: sshd@9-10.0.1.99:22-4.153.228.146:56698.service: Deactivated successfully. Mar 13 00:39:41.527643 systemd[1]: session-10.scope: Deactivated successfully. Mar 13 00:39:41.528820 systemd-logind[1612]: Session 10 logged out. Waiting for processes to exit. Mar 13 00:39:41.530200 systemd-logind[1612]: Removed session 10. Mar 13 00:39:46.638070 systemd[1]: Started sshd@10-10.0.1.99:22-4.153.228.146:56700.service - OpenSSH per-connection server daemon (4.153.228.146:56700). Mar 13 00:39:47.201807 sshd[5970]: Accepted publickey for core from 4.153.228.146 port 56700 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:39:47.203961 sshd-session[5970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:39:47.212442 systemd-logind[1612]: New session 11 of user core. Mar 13 00:39:47.221917 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 13 00:39:47.583052 sshd[5995]: Connection closed by 4.153.228.146 port 56700 Mar 13 00:39:47.582830 sshd-session[5970]: pam_unix(sshd:session): session closed for user core Mar 13 00:39:47.588533 systemd-logind[1612]: Session 11 logged out. Waiting for processes to exit. Mar 13 00:39:47.589157 systemd[1]: sshd@10-10.0.1.99:22-4.153.228.146:56700.service: Deactivated successfully. 
Mar 13 00:39:47.592586 systemd[1]: session-11.scope: Deactivated successfully. Mar 13 00:39:47.595231 systemd-logind[1612]: Removed session 11. Mar 13 00:39:52.697792 systemd[1]: Started sshd@11-10.0.1.99:22-4.153.228.146:55662.service - OpenSSH per-connection server daemon (4.153.228.146:55662). Mar 13 00:39:53.227791 sshd[6029]: Accepted publickey for core from 4.153.228.146 port 55662 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:39:53.230055 sshd-session[6029]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:39:53.235717 systemd-logind[1612]: New session 12 of user core. Mar 13 00:39:53.242718 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 13 00:39:53.571131 sshd[6032]: Connection closed by 4.153.228.146 port 55662 Mar 13 00:39:53.572381 sshd-session[6029]: pam_unix(sshd:session): session closed for user core Mar 13 00:39:53.579903 systemd[1]: sshd@11-10.0.1.99:22-4.153.228.146:55662.service: Deactivated successfully. Mar 13 00:39:53.584390 systemd[1]: session-12.scope: Deactivated successfully. Mar 13 00:39:53.586963 systemd-logind[1612]: Session 12 logged out. Waiting for processes to exit. Mar 13 00:39:53.590152 systemd-logind[1612]: Removed session 12. Mar 13 00:39:53.691971 systemd[1]: Started sshd@12-10.0.1.99:22-4.153.228.146:55666.service - OpenSSH per-connection server daemon (4.153.228.146:55666). Mar 13 00:39:54.259614 sshd[6044]: Accepted publickey for core from 4.153.228.146 port 55666 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:39:54.260394 sshd-session[6044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:39:54.264656 systemd-logind[1612]: New session 13 of user core. Mar 13 00:39:54.272843 systemd[1]: Started session-13.scope - Session 13 of User core. 
Mar 13 00:39:54.661517 sshd[6047]: Connection closed by 4.153.228.146 port 55666 Mar 13 00:39:54.662182 sshd-session[6044]: pam_unix(sshd:session): session closed for user core Mar 13 00:39:54.666423 systemd-logind[1612]: Session 13 logged out. Waiting for processes to exit. Mar 13 00:39:54.667069 systemd[1]: sshd@12-10.0.1.99:22-4.153.228.146:55666.service: Deactivated successfully. Mar 13 00:39:54.669061 systemd[1]: session-13.scope: Deactivated successfully. Mar 13 00:39:54.670299 systemd-logind[1612]: Removed session 13. Mar 13 00:39:54.766911 systemd[1]: Started sshd@13-10.0.1.99:22-4.153.228.146:55680.service - OpenSSH per-connection server daemon (4.153.228.146:55680). Mar 13 00:39:55.302643 sshd[6078]: Accepted publickey for core from 4.153.228.146 port 55680 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:39:55.307217 sshd-session[6078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:39:55.319187 systemd-logind[1612]: New session 14 of user core. Mar 13 00:39:55.330628 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 13 00:39:55.727993 sshd[6081]: Connection closed by 4.153.228.146 port 55680 Mar 13 00:39:55.729295 sshd-session[6078]: pam_unix(sshd:session): session closed for user core Mar 13 00:39:55.737892 systemd-logind[1612]: Session 14 logged out. Waiting for processes to exit. Mar 13 00:39:55.738728 systemd[1]: sshd@13-10.0.1.99:22-4.153.228.146:55680.service: Deactivated successfully. Mar 13 00:39:55.742967 systemd[1]: session-14.scope: Deactivated successfully. Mar 13 00:39:55.746832 systemd-logind[1612]: Removed session 14. Mar 13 00:40:00.837404 systemd[1]: Started sshd@14-10.0.1.99:22-4.153.228.146:35188.service - OpenSSH per-connection server daemon (4.153.228.146:35188). 
Mar 13 00:40:01.380457 sshd[6095]: Accepted publickey for core from 4.153.228.146 port 35188 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:01.382252 sshd-session[6095]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:01.389860 systemd-logind[1612]: New session 15 of user core. Mar 13 00:40:01.396916 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 13 00:40:01.730350 sshd[6098]: Connection closed by 4.153.228.146 port 35188 Mar 13 00:40:01.731257 sshd-session[6095]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:01.738269 systemd[1]: sshd@14-10.0.1.99:22-4.153.228.146:35188.service: Deactivated successfully. Mar 13 00:40:01.741259 systemd[1]: session-15.scope: Deactivated successfully. Mar 13 00:40:01.743532 systemd-logind[1612]: Session 15 logged out. Waiting for processes to exit. Mar 13 00:40:01.745312 systemd-logind[1612]: Removed session 15. Mar 13 00:40:01.840377 systemd[1]: Started sshd@15-10.0.1.99:22-4.153.228.146:35200.service - OpenSSH per-connection server daemon (4.153.228.146:35200). Mar 13 00:40:02.379230 sshd[6110]: Accepted publickey for core from 4.153.228.146 port 35200 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:02.381157 sshd-session[6110]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:02.386316 systemd-logind[1612]: New session 16 of user core. Mar 13 00:40:02.393721 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 13 00:40:03.180635 sshd[6138]: Connection closed by 4.153.228.146 port 35200 Mar 13 00:40:03.181319 sshd-session[6110]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:03.188551 systemd[1]: sshd@15-10.0.1.99:22-4.153.228.146:35200.service: Deactivated successfully. Mar 13 00:40:03.191445 systemd[1]: session-16.scope: Deactivated successfully. Mar 13 00:40:03.193050 systemd-logind[1612]: Session 16 logged out. 
Waiting for processes to exit. Mar 13 00:40:03.194745 systemd-logind[1612]: Removed session 16. Mar 13 00:40:03.289737 systemd[1]: Started sshd@16-10.0.1.99:22-4.153.228.146:35206.service - OpenSSH per-connection server daemon (4.153.228.146:35206). Mar 13 00:40:03.830670 sshd[6148]: Accepted publickey for core from 4.153.228.146 port 35206 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:03.832744 sshd-session[6148]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:03.840144 systemd-logind[1612]: New session 17 of user core. Mar 13 00:40:03.847751 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 13 00:40:04.789960 sshd[6151]: Connection closed by 4.153.228.146 port 35206 Mar 13 00:40:04.790327 sshd-session[6148]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:04.794543 systemd-logind[1612]: Session 17 logged out. Waiting for processes to exit. Mar 13 00:40:04.794931 systemd[1]: sshd@16-10.0.1.99:22-4.153.228.146:35206.service: Deactivated successfully. Mar 13 00:40:04.796772 systemd[1]: session-17.scope: Deactivated successfully. Mar 13 00:40:04.798281 systemd-logind[1612]: Removed session 17. Mar 13 00:40:04.893745 systemd[1]: Started sshd@17-10.0.1.99:22-4.153.228.146:35208.service - OpenSSH per-connection server daemon (4.153.228.146:35208). Mar 13 00:40:05.410958 sshd[6176]: Accepted publickey for core from 4.153.228.146 port 35208 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:05.411331 sshd-session[6176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:05.416047 systemd-logind[1612]: New session 18 of user core. Mar 13 00:40:05.423990 systemd[1]: Started session-18.scope - Session 18 of User core. 
Mar 13 00:40:05.934551 sshd[6179]: Connection closed by 4.153.228.146 port 35208 Mar 13 00:40:05.934461 sshd-session[6176]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:05.938229 systemd[1]: sshd@17-10.0.1.99:22-4.153.228.146:35208.service: Deactivated successfully. Mar 13 00:40:05.940086 systemd[1]: session-18.scope: Deactivated successfully. Mar 13 00:40:05.941828 systemd-logind[1612]: Session 18 logged out. Waiting for processes to exit. Mar 13 00:40:05.942677 systemd-logind[1612]: Removed session 18. Mar 13 00:40:06.045618 systemd[1]: Started sshd@18-10.0.1.99:22-4.153.228.146:35224.service - OpenSSH per-connection server daemon (4.153.228.146:35224). Mar 13 00:40:06.580418 sshd[6193]: Accepted publickey for core from 4.153.228.146 port 35224 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:06.581980 sshd-session[6193]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:06.591174 systemd-logind[1612]: New session 19 of user core. Mar 13 00:40:06.598846 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 13 00:40:06.969081 sshd[6196]: Connection closed by 4.153.228.146 port 35224 Mar 13 00:40:06.970721 sshd-session[6193]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:06.973710 systemd[1]: sshd@18-10.0.1.99:22-4.153.228.146:35224.service: Deactivated successfully. Mar 13 00:40:06.975260 systemd[1]: session-19.scope: Deactivated successfully. Mar 13 00:40:06.975952 systemd-logind[1612]: Session 19 logged out. Waiting for processes to exit. Mar 13 00:40:06.976972 systemd-logind[1612]: Removed session 19. Mar 13 00:40:12.074875 systemd[1]: Started sshd@19-10.0.1.99:22-4.153.228.146:35654.service - OpenSSH per-connection server daemon (4.153.228.146:35654). 
Mar 13 00:40:12.596167 sshd[6219]: Accepted publickey for core from 4.153.228.146 port 35654 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:12.599279 sshd-session[6219]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:12.605144 systemd-logind[1612]: New session 20 of user core. Mar 13 00:40:12.611756 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 13 00:40:12.965645 sshd[6225]: Connection closed by 4.153.228.146 port 35654 Mar 13 00:40:12.966835 sshd-session[6219]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:12.972716 systemd-logind[1612]: Session 20 logged out. Waiting for processes to exit. Mar 13 00:40:12.973792 systemd[1]: sshd@19-10.0.1.99:22-4.153.228.146:35654.service: Deactivated successfully. Mar 13 00:40:12.976443 systemd[1]: session-20.scope: Deactivated successfully. Mar 13 00:40:12.978820 systemd-logind[1612]: Removed session 20. Mar 13 00:40:17.506865 update_engine[1613]: I20260313 00:40:17.506731 1613 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 13 00:40:17.506865 update_engine[1613]: I20260313 00:40:17.506842 1613 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 13 00:40:17.507918 update_engine[1613]: I20260313 00:40:17.507306 1613 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 13 00:40:17.508685 update_engine[1613]: I20260313 00:40:17.508458 1613 omaha_request_params.cc:62] Current group set to stable Mar 13 00:40:17.510190 update_engine[1613]: I20260313 00:40:17.510123 1613 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 13 00:40:17.510190 update_engine[1613]: I20260313 00:40:17.510170 1613 update_attempter.cc:643] Scheduling an action processor start. 
Mar 13 00:40:17.510380 update_engine[1613]: I20260313 00:40:17.510210 1613 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 13 00:40:17.527950 update_engine[1613]: I20260313 00:40:17.527197 1613 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 13 00:40:17.527950 update_engine[1613]: I20260313 00:40:17.527420 1613 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 13 00:40:17.527950 update_engine[1613]: I20260313 00:40:17.527446 1613 omaha_request_action.cc:272] Request: Mar 13 00:40:17.527950 update_engine[1613]: Mar 13 00:40:17.527950 update_engine[1613]: Mar 13 00:40:17.527950 update_engine[1613]: Mar 13 00:40:17.527950 update_engine[1613]: Mar 13 00:40:17.527950 update_engine[1613]: Mar 13 00:40:17.527950 update_engine[1613]: Mar 13 00:40:17.527950 update_engine[1613]: Mar 13 00:40:17.527950 update_engine[1613]: Mar 13 00:40:17.527950 update_engine[1613]: I20260313 00:40:17.527462 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 13 00:40:17.532654 locksmithd[1652]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 13 00:40:17.545729 update_engine[1613]: I20260313 00:40:17.545662 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 13 00:40:17.546880 update_engine[1613]: I20260313 00:40:17.546818 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 13 00:40:17.555124 update_engine[1613]: E20260313 00:40:17.555044 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 13 00:40:17.555272 update_engine[1613]: I20260313 00:40:17.555191 1613 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 13 00:40:18.077831 systemd[1]: Started sshd@20-10.0.1.99:22-4.153.228.146:35664.service - OpenSSH per-connection server daemon (4.153.228.146:35664). 
Mar 13 00:40:18.633864 sshd[6275]: Accepted publickey for core from 4.153.228.146 port 35664 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:18.637144 sshd-session[6275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:18.650291 systemd-logind[1612]: New session 21 of user core. Mar 13 00:40:18.657915 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 13 00:40:18.969640 sshd[6278]: Connection closed by 4.153.228.146 port 35664 Mar 13 00:40:18.971113 sshd-session[6275]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:18.978335 systemd[1]: sshd@20-10.0.1.99:22-4.153.228.146:35664.service: Deactivated successfully. Mar 13 00:40:18.981440 systemd[1]: session-21.scope: Deactivated successfully. Mar 13 00:40:18.982852 systemd-logind[1612]: Session 21 logged out. Waiting for processes to exit. Mar 13 00:40:18.986337 systemd-logind[1612]: Removed session 21. Mar 13 00:40:24.086542 systemd[1]: Started sshd@21-10.0.1.99:22-4.153.228.146:33008.service - OpenSSH per-connection server daemon (4.153.228.146:33008). Mar 13 00:40:24.643864 sshd[6340]: Accepted publickey for core from 4.153.228.146 port 33008 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:24.646427 sshd-session[6340]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:24.656297 systemd-logind[1612]: New session 22 of user core. Mar 13 00:40:24.664839 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 13 00:40:25.008805 sshd[6343]: Connection closed by 4.153.228.146 port 33008 Mar 13 00:40:25.009547 sshd-session[6340]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:25.016590 systemd[1]: sshd@21-10.0.1.99:22-4.153.228.146:33008.service: Deactivated successfully. Mar 13 00:40:25.019533 systemd[1]: session-22.scope: Deactivated successfully. Mar 13 00:40:25.020645 systemd-logind[1612]: Session 22 logged out. 
Waiting for processes to exit. Mar 13 00:40:25.022202 systemd-logind[1612]: Removed session 22. Mar 13 00:40:27.505490 update_engine[1613]: I20260313 00:40:27.504751 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 13 00:40:27.505490 update_engine[1613]: I20260313 00:40:27.504870 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 13 00:40:27.505490 update_engine[1613]: I20260313 00:40:27.505369 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 13 00:40:27.511361 update_engine[1613]: E20260313 00:40:27.511142 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 13 00:40:27.511361 update_engine[1613]: I20260313 00:40:27.511299 1613 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 13 00:40:30.117778 systemd[1]: Started sshd@22-10.0.1.99:22-4.153.228.146:33004.service - OpenSSH per-connection server daemon (4.153.228.146:33004). Mar 13 00:40:30.643135 sshd[6357]: Accepted publickey for core from 4.153.228.146 port 33004 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:30.646504 sshd-session[6357]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:30.660379 systemd-logind[1612]: New session 23 of user core. Mar 13 00:40:30.668811 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 13 00:40:31.044744 sshd[6360]: Connection closed by 4.153.228.146 port 33004 Mar 13 00:40:31.044857 sshd-session[6357]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:31.047822 systemd[1]: sshd@22-10.0.1.99:22-4.153.228.146:33004.service: Deactivated successfully. Mar 13 00:40:31.049424 systemd[1]: session-23.scope: Deactivated successfully. Mar 13 00:40:31.050537 systemd-logind[1612]: Session 23 logged out. Waiting for processes to exit. Mar 13 00:40:31.052169 systemd-logind[1612]: Removed session 23. 
Mar 13 00:40:36.165858 systemd[1]: Started sshd@23-10.0.1.99:22-4.153.228.146:33006.service - OpenSSH per-connection server daemon (4.153.228.146:33006). Mar 13 00:40:36.699585 sshd[6395]: Accepted publickey for core from 4.153.228.146 port 33006 ssh2: RSA SHA256:vq/pKw+AvC1pwghLaTIizFiq9VFBXFrLmNBInBA4+oE Mar 13 00:40:36.700402 sshd-session[6395]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:40:36.706645 systemd-logind[1612]: New session 24 of user core. Mar 13 00:40:36.709722 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 13 00:40:37.056001 sshd[6401]: Connection closed by 4.153.228.146 port 33006 Mar 13 00:40:37.056786 sshd-session[6395]: pam_unix(sshd:session): session closed for user core Mar 13 00:40:37.062541 systemd-logind[1612]: Session 24 logged out. Waiting for processes to exit. Mar 13 00:40:37.063098 systemd[1]: sshd@23-10.0.1.99:22-4.153.228.146:33006.service: Deactivated successfully. Mar 13 00:40:37.066496 systemd[1]: session-24.scope: Deactivated successfully. Mar 13 00:40:37.069672 systemd-logind[1612]: Removed session 24. Mar 13 00:40:37.503723 update_engine[1613]: I20260313 00:40:37.503644 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 13 00:40:37.504229 update_engine[1613]: I20260313 00:40:37.503734 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 13 00:40:37.504229 update_engine[1613]: I20260313 00:40:37.504084 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 13 00:40:37.510590 update_engine[1613]: E20260313 00:40:37.510499 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 13 00:40:37.510684 update_engine[1613]: I20260313 00:40:37.510594 1613 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 13 00:40:47.510669 update_engine[1613]: I20260313 00:40:47.510602 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 13 00:40:47.511093 update_engine[1613]: I20260313 00:40:47.510679 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 13 00:40:47.511093 update_engine[1613]: I20260313 00:40:47.510965 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 13 00:40:47.516581 update_engine[1613]: E20260313 00:40:47.516533 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 13 00:40:47.516679 update_engine[1613]: I20260313 00:40:47.516616 1613 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 13 00:40:47.516679 update_engine[1613]: I20260313 00:40:47.516624 1613 omaha_request_action.cc:617] Omaha request response: Mar 13 00:40:47.516735 update_engine[1613]: E20260313 00:40:47.516692 1613 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 13 00:40:47.516735 update_engine[1613]: I20260313 00:40:47.516711 1613 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 13 00:40:47.516735 update_engine[1613]: I20260313 00:40:47.516716 1613 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 13 00:40:47.516735 update_engine[1613]: I20260313 00:40:47.516721 1613 update_attempter.cc:306] Processing Done. Mar 13 00:40:47.516735 update_engine[1613]: E20260313 00:40:47.516733 1613 update_attempter.cc:619] Update failed. 
Mar 13 00:40:47.516859 update_engine[1613]: I20260313 00:40:47.516739 1613 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 13 00:40:47.516859 update_engine[1613]: I20260313 00:40:47.516744 1613 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 13 00:40:47.516859 update_engine[1613]: I20260313 00:40:47.516748 1613 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Mar 13 00:40:47.516859 update_engine[1613]: I20260313 00:40:47.516809 1613 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 13 00:40:47.516859 update_engine[1613]: I20260313 00:40:47.516829 1613 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 13 00:40:47.516859 update_engine[1613]: I20260313 00:40:47.516834 1613 omaha_request_action.cc:272] Request: Mar 13 00:40:47.516859 update_engine[1613]: Mar 13 00:40:47.516859 update_engine[1613]: Mar 13 00:40:47.516859 update_engine[1613]: Mar 13 00:40:47.516859 update_engine[1613]: Mar 13 00:40:47.516859 update_engine[1613]: Mar 13 00:40:47.516859 update_engine[1613]: Mar 13 00:40:47.516859 update_engine[1613]: I20260313 00:40:47.516839 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 13 00:40:47.516859 update_engine[1613]: I20260313 00:40:47.516855 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 13 00:40:47.517202 update_engine[1613]: I20260313 00:40:47.517072 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 13 00:40:47.517383 locksmithd[1652]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 13 00:40:47.525521 update_engine[1613]: E20260313 00:40:47.525487 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 13 00:40:47.525620 update_engine[1613]: I20260313 00:40:47.525548 1613 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 13 00:40:47.525620 update_engine[1613]: I20260313 00:40:47.525556 1613 omaha_request_action.cc:617] Omaha request response: Mar 13 00:40:47.525620 update_engine[1613]: I20260313 00:40:47.525599 1613 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 13 00:40:47.525620 update_engine[1613]: I20260313 00:40:47.525604 1613 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 13 00:40:47.525620 update_engine[1613]: I20260313 00:40:47.525609 1613 update_attempter.cc:306] Processing Done. Mar 13 00:40:47.525620 update_engine[1613]: I20260313 00:40:47.525614 1613 update_attempter.cc:310] Error event sent. Mar 13 00:40:47.525816 update_engine[1613]: I20260313 00:40:47.525621 1613 update_check_scheduler.cc:74] Next update check in 40m50s Mar 13 00:40:47.525939 locksmithd[1652]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 13 00:41:03.974392 systemd[1]: cri-containerd-a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21.scope: Deactivated successfully. Mar 13 00:41:03.975880 systemd[1]: cri-containerd-a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21.scope: Consumed 4.150s CPU time, 61.1M memory peak, 64K read from disk. 
Mar 13 00:41:03.976962 containerd[1639]: time="2026-03-13T00:41:03.976931033Z" level=info msg="received container exit event container_id:\"a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21\" id:\"a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21\" pid:2658 exit_status:1 exited_at:{seconds:1773362463 nanos:975839469}"
Mar 13 00:41:04.001993 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21-rootfs.mount: Deactivated successfully.
Mar 13 00:41:04.191005 kubelet[2856]: E0313 00:41:04.190776 2856 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.1.99:36182->10.0.1.40:2379: read: connection timed out"
Mar 13 00:41:04.757840 kubelet[2856]: I0313 00:41:04.757815 2856 scope.go:117] "RemoveContainer" containerID="a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21"
Mar 13 00:41:04.764389 containerd[1639]: time="2026-03-13T00:41:04.764348682Z" level=info msg="CreateContainer within sandbox \"f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 13 00:41:04.779325 containerd[1639]: time="2026-03-13T00:41:04.779287727Z" level=info msg="Container 9ea4289dc00dac350708eda23301ebc8c8a98cdc4f3a680d43074c3bc22cdb78: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:41:04.785835 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4173173160.mount: Deactivated successfully.
Mar 13 00:41:04.789729 containerd[1639]: time="2026-03-13T00:41:04.789693981Z" level=info msg="CreateContainer within sandbox \"f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"9ea4289dc00dac350708eda23301ebc8c8a98cdc4f3a680d43074c3bc22cdb78\""
Mar 13 00:41:04.790150 containerd[1639]: time="2026-03-13T00:41:04.790132469Z" level=info msg="StartContainer for \"9ea4289dc00dac350708eda23301ebc8c8a98cdc4f3a680d43074c3bc22cdb78\""
Mar 13 00:41:04.791172 containerd[1639]: time="2026-03-13T00:41:04.791151420Z" level=info msg="connecting to shim 9ea4289dc00dac350708eda23301ebc8c8a98cdc4f3a680d43074c3bc22cdb78" address="unix:///run/containerd/s/4b01ad3fffc8d595d0fd7ef53a333aaabb5e49fb17ad697536bce1b9869c178c" protocol=ttrpc version=3
Mar 13 00:41:04.813768 systemd[1]: Started cri-containerd-9ea4289dc00dac350708eda23301ebc8c8a98cdc4f3a680d43074c3bc22cdb78.scope - libcontainer container 9ea4289dc00dac350708eda23301ebc8c8a98cdc4f3a680d43074c3bc22cdb78.
Mar 13 00:41:04.861679 systemd[1]: cri-containerd-f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8.scope: Deactivated successfully.
Mar 13 00:41:04.862237 systemd[1]: cri-containerd-f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8.scope: Consumed 7.929s CPU time, 142M memory peak, 748K read from disk.
Mar 13 00:41:04.865881 containerd[1639]: time="2026-03-13T00:41:04.865764706Z" level=info msg="received container exit event container_id:\"f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8\" id:\"f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8\" pid:3182 exit_status:1 exited_at:{seconds:1773362464 nanos:865266958}"
Mar 13 00:41:04.876611 containerd[1639]: time="2026-03-13T00:41:04.876503764Z" level=info msg="StartContainer for \"9ea4289dc00dac350708eda23301ebc8c8a98cdc4f3a680d43074c3bc22cdb78\" returns successfully"
Mar 13 00:41:04.907424 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8-rootfs.mount: Deactivated successfully.
Mar 13 00:41:05.767265 kubelet[2856]: I0313 00:41:05.766993 2856 scope.go:117] "RemoveContainer" containerID="f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8"
Mar 13 00:41:05.770880 containerd[1639]: time="2026-03-13T00:41:05.770819049Z" level=info msg="CreateContainer within sandbox \"b004ad50701d8f52c4649d3047e5c72dff7b78e4981c18e6f39dc0282bf653a3\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 13 00:41:05.785686 containerd[1639]: time="2026-03-13T00:41:05.785055370Z" level=info msg="Container 6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:41:05.798844 containerd[1639]: time="2026-03-13T00:41:05.798800959Z" level=info msg="CreateContainer within sandbox \"b004ad50701d8f52c4649d3047e5c72dff7b78e4981c18e6f39dc0282bf653a3\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19\""
Mar 13 00:41:05.799697 containerd[1639]: time="2026-03-13T00:41:05.799548343Z" level=info msg="StartContainer for \"6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19\""
Mar 13 00:41:05.801186 containerd[1639]: time="2026-03-13T00:41:05.801149994Z" level=info msg="connecting to shim 6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19" address="unix:///run/containerd/s/903172a71d70268301427c8b1b9b53c411d47b480d4a7275627dc5289cf1cf73" protocol=ttrpc version=3
Mar 13 00:41:05.838710 systemd[1]: Started cri-containerd-6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19.scope - libcontainer container 6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19.
Mar 13 00:41:05.889398 containerd[1639]: time="2026-03-13T00:41:05.889290322Z" level=info msg="StartContainer for \"6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19\" returns successfully"
Mar 13 00:41:08.864653 kubelet[2856]: E0313 00:41:08.861026 2856 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.1.99:36014->10.0.1.40:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-n-23cf6448d4.189c3fc2a06216df kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-n-23cf6448d4,UID:53388ea298f278325c4f4e63339e330a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-23cf6448d4,},FirstTimestamp:2026-03-13 00:40:58.424121055 +0000 UTC m=+276.303227965,LastTimestamp:2026-03-13 00:40:58.424121055 +0000 UTC m=+276.303227965,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-23cf6448d4,}"
Mar 13 00:41:09.456551 systemd[1]: cri-containerd-4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c.scope: Deactivated successfully.
Mar 13 00:41:09.457775 systemd[1]: cri-containerd-4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c.scope: Consumed 3.556s CPU time, 21M memory peak, 64K read from disk.
Mar 13 00:41:09.462501 containerd[1639]: time="2026-03-13T00:41:09.462399649Z" level=info msg="received container exit event container_id:\"4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c\" id:\"4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c\" pid:2697 exit_status:1 exited_at:{seconds:1773362469 nanos:460927939}"
Mar 13 00:41:09.518243 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c-rootfs.mount: Deactivated successfully.
Mar 13 00:41:09.785352 kubelet[2856]: I0313 00:41:09.785191 2856 scope.go:117] "RemoveContainer" containerID="4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c"
Mar 13 00:41:09.787055 containerd[1639]: time="2026-03-13T00:41:09.787014637Z" level=info msg="CreateContainer within sandbox \"ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 13 00:41:09.800960 containerd[1639]: time="2026-03-13T00:41:09.799584581Z" level=info msg="Container eb7ac767a9207eaae58b6ebe7b1482fe3b3b481a571d1420a2ac7a08f4d7a6fc: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:41:09.807175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3008780362.mount: Deactivated successfully.
Mar 13 00:41:09.812953 containerd[1639]: time="2026-03-13T00:41:09.812916333Z" level=info msg="CreateContainer within sandbox \"ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"eb7ac767a9207eaae58b6ebe7b1482fe3b3b481a571d1420a2ac7a08f4d7a6fc\""
Mar 13 00:41:09.813376 containerd[1639]: time="2026-03-13T00:41:09.813359710Z" level=info msg="StartContainer for \"eb7ac767a9207eaae58b6ebe7b1482fe3b3b481a571d1420a2ac7a08f4d7a6fc\""
Mar 13 00:41:09.814444 containerd[1639]: time="2026-03-13T00:41:09.814353957Z" level=info msg="connecting to shim eb7ac767a9207eaae58b6ebe7b1482fe3b3b481a571d1420a2ac7a08f4d7a6fc" address="unix:///run/containerd/s/500622851d357e0bd3eec6092a7919d787258ecf12b60961c23ae12937075a73" protocol=ttrpc version=3
Mar 13 00:41:09.835744 systemd[1]: Started cri-containerd-eb7ac767a9207eaae58b6ebe7b1482fe3b3b481a571d1420a2ac7a08f4d7a6fc.scope - libcontainer container eb7ac767a9207eaae58b6ebe7b1482fe3b3b481a571d1420a2ac7a08f4d7a6fc.
Mar 13 00:41:09.887119 containerd[1639]: time="2026-03-13T00:41:09.887081338Z" level=info msg="StartContainer for \"eb7ac767a9207eaae58b6ebe7b1482fe3b3b481a571d1420a2ac7a08f4d7a6fc\" returns successfully"
Mar 13 00:41:14.193119 kubelet[2856]: E0313 00:41:14.192922 2856 controller.go:195] "Failed to update lease" err="Put \"https://10.0.1.99:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-23cf6448d4?timeout=10s\": context deadline exceeded"
Mar 13 00:41:14.875852 kubelet[2856]: I0313 00:41:14.875596 2856 status_manager.go:895] "Failed to get status for pod" podUID="53388ea298f278325c4f4e63339e330a" pod="kube-system/kube-apiserver-ci-4459-2-4-n-23cf6448d4" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.1.99:36104->10.0.1.40:2379: read: connection timed out"
Mar 13 00:41:17.093588 systemd[1]: cri-containerd-6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19.scope: Deactivated successfully.
Mar 13 00:41:17.094419 systemd[1]: cri-containerd-6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19.scope: Consumed 223ms CPU time, 37.3M memory peak, 1M read from disk.
Mar 13 00:41:17.095378 containerd[1639]: time="2026-03-13T00:41:17.094243779Z" level=info msg="received container exit event container_id:\"6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19\" id:\"6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19\" pid:6577 exit_status:1 exited_at:{seconds:1773362477 nanos:93763955}"
Mar 13 00:41:17.114274 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19-rootfs.mount: Deactivated successfully.
Mar 13 00:41:17.599201 containerd[1639]: time="2026-03-13T00:41:17.599103620Z" level=warning msg="container event discarded" container=f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669 type=CONTAINER_CREATED_EVENT
Mar 13 00:41:17.599201 containerd[1639]: time="2026-03-13T00:41:17.599182704Z" level=warning msg="container event discarded" container=f1fb029220b9fcae0db2282066d61b054ee845aa9a2fe9e6c36e5852e36bc669 type=CONTAINER_STARTED_EVENT
Mar 13 00:41:17.643138 containerd[1639]: time="2026-03-13T00:41:17.643082911Z" level=warning msg="container event discarded" container=b0cea271265674788d782ba8146e6d9698855357895234de2f2c4b7c17c20b0b type=CONTAINER_CREATED_EVENT
Mar 13 00:41:17.643138 containerd[1639]: time="2026-03-13T00:41:17.643126519Z" level=warning msg="container event discarded" container=b0cea271265674788d782ba8146e6d9698855357895234de2f2c4b7c17c20b0b type=CONTAINER_STARTED_EVENT
Mar 13 00:41:17.643138 containerd[1639]: time="2026-03-13T00:41:17.643135411Z" level=warning msg="container event discarded" container=a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21 type=CONTAINER_CREATED_EVENT
Mar 13 00:41:17.674626 containerd[1639]: time="2026-03-13T00:41:17.674475360Z" level=warning msg="container event discarded" container=99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753 type=CONTAINER_CREATED_EVENT
Mar 13 00:41:17.675043 containerd[1639]: time="2026-03-13T00:41:17.674919637Z" level=warning msg="container event discarded" container=ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3 type=CONTAINER_CREATED_EVENT
Mar 13 00:41:17.675043 containerd[1639]: time="2026-03-13T00:41:17.674993977Z" level=warning msg="container event discarded" container=ad7621a7f325fe17e74a6168f0d404135d861561bc68a5c3d5a52e068dbb10e3 type=CONTAINER_STARTED_EVENT
Mar 13 00:41:17.708013 containerd[1639]: time="2026-03-13T00:41:17.707833393Z" level=warning msg="container event discarded" container=4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c type=CONTAINER_CREATED_EVENT
Mar 13 00:41:17.758247 containerd[1639]: time="2026-03-13T00:41:17.758160616Z" level=warning msg="container event discarded" container=a5345009bc67af2cf61df265cdbde64ad7ed046add77df2295d2101999f8db21 type=CONTAINER_STARTED_EVENT
Mar 13 00:41:17.778524 containerd[1639]: time="2026-03-13T00:41:17.778443902Z" level=warning msg="container event discarded" container=99b485a6d18bb6a5d21d210da5a2ca022fc3ab3a186cff1693be8ce38bfc0753 type=CONTAINER_STARTED_EVENT
Mar 13 00:41:17.818155 kubelet[2856]: I0313 00:41:17.818089 2856 scope.go:117] "RemoveContainer" containerID="f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8"
Mar 13 00:41:17.818782 kubelet[2856]: I0313 00:41:17.818404 2856 scope.go:117] "RemoveContainer" containerID="6a9ccbbeaf3d762cb56a5e7bba33701f152080538161c5d1e8f23c72f10d6c19"
Mar 13 00:41:17.818782 kubelet[2856]: E0313 00:41:17.818543 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6bf85f8dd-bgcdh_tigera-operator(e8121ae7-2f38-420f-be40-022a2ccf3f12)\"" pod="tigera-operator/tigera-operator-6bf85f8dd-bgcdh" podUID="e8121ae7-2f38-420f-be40-022a2ccf3f12"
Mar 13 00:41:17.820284 containerd[1639]: time="2026-03-13T00:41:17.820251569Z" level=info msg="RemoveContainer for \"f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8\""
Mar 13 00:41:17.828022 containerd[1639]: time="2026-03-13T00:41:17.827872829Z" level=info msg="RemoveContainer for \"f553510d246b6347d874d465ec9b84aa883a15829c0432accb3b79d8080282c8\" returns successfully"
Mar 13 00:41:17.844462 containerd[1639]: time="2026-03-13T00:41:17.844375581Z" level=warning msg="container event discarded" container=4a2d1c6dea0b3e4dc5c0d3ccb339a9ff093fcd4dec4e43343f3a88d2aae1161c type=CONTAINER_STARTED_EVENT
Mar 13 00:41:24.194721 kubelet[2856]: E0313 00:41:24.194644 2856 controller.go:195] "Failed to update lease" err="Put \"https://10.0.1.99:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-23cf6448d4?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"