Mar 7 01:30:26.190944 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 7 01:30:26.190965 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Mar 6 22:59:59 -00 2026
Mar 7 01:30:26.190973 kernel: KASLR enabled
Mar 7 01:30:26.190978 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Mar 7 01:30:26.190985 kernel: printk: bootconsole [pl11] enabled
Mar 7 01:30:26.190991 kernel: efi: EFI v2.7 by EDK II
Mar 7 01:30:26.190998 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f215018 RNG=0x3fd5f998 MEMRESERVE=0x3e44ee18
Mar 7 01:30:26.191004 kernel: random: crng init done
Mar 7 01:30:26.191010 kernel: ACPI: Early table checksum verification disabled
Mar 7 01:30:26.191016 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Mar 7 01:30:26.191022 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191028 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191036 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Mar 7 01:30:26.191043 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191050 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191056 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191063 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191071 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191077 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191084 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Mar 7 01:30:26.191090 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191097 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Mar 7 01:30:26.191103 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
Mar 7 01:30:26.191109 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff]
Mar 7 01:30:26.191116 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff]
Mar 7 01:30:26.191122 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff]
Mar 7 01:30:26.191129 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff]
Mar 7 01:30:26.191135 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff]
Mar 7 01:30:26.191143 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff]
Mar 7 01:30:26.191149 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff]
Mar 7 01:30:26.191156 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff]
Mar 7 01:30:26.191162 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff]
Mar 7 01:30:26.191168 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff]
Mar 7 01:30:26.191175 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff]
Mar 7 01:30:26.191181 kernel: NUMA: NODE_DATA [mem 0x1bf7f0800-0x1bf7f5fff]
Mar 7 01:30:26.191187 kernel: Zone ranges:
Mar 7 01:30:26.191194 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Mar 7 01:30:26.191200 kernel: DMA32 empty
Mar 7 01:30:26.191206 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Mar 7 01:30:26.191212 kernel: Movable zone start for each node
Mar 7 01:30:26.191223 kernel: Early memory node ranges
Mar 7 01:30:26.191230 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Mar 7 01:30:26.191237 kernel: node 0: [mem 0x0000000000824000-0x000000003e54ffff]
Mar 7 01:30:26.191244 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Mar 7 01:30:26.191250 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Mar 7 01:30:26.191258 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Mar 7 01:30:26.191265 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Mar 7 01:30:26.191272 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Mar 7 01:30:26.191279 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Mar 7 01:30:26.191285 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Mar 7 01:30:26.191292 kernel: psci: probing for conduit method from ACPI.
Mar 7 01:30:26.191299 kernel: psci: PSCIv1.1 detected in firmware.
Mar 7 01:30:26.191306 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 7 01:30:26.191312 kernel: psci: MIGRATE_INFO_TYPE not supported.
Mar 7 01:30:26.191319 kernel: psci: SMC Calling Convention v1.4
Mar 7 01:30:26.191326 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 7 01:30:26.191333 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 7 01:30:26.191341 kernel: percpu: Embedded 30 pages/cpu s85736 r8192 d28952 u122880
Mar 7 01:30:26.191348 kernel: pcpu-alloc: s85736 r8192 d28952 u122880 alloc=30*4096
Mar 7 01:30:26.191354 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 7 01:30:26.191361 kernel: Detected PIPT I-cache on CPU0
Mar 7 01:30:26.191368 kernel: CPU features: detected: GIC system register CPU interface
Mar 7 01:30:26.191375 kernel: CPU features: detected: Hardware dirty bit management
Mar 7 01:30:26.191381 kernel: CPU features: detected: Spectre-BHB
Mar 7 01:30:26.191388 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 7 01:30:26.191395 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 7 01:30:26.191402 kernel: CPU features: detected: ARM erratum 1418040
Mar 7 01:30:26.191409 kernel: CPU features: detected: ARM erratum 1542419 (kernel portion)
Mar 7 01:30:26.191417 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 7 01:30:26.191424 kernel: alternatives: applying boot alternatives
Mar 7 01:30:26.191432 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6
Mar 7 01:30:26.191439 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 7 01:30:26.191446 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 7 01:30:26.191453 kernel: Fallback order for Node 0: 0
Mar 7 01:30:26.191460 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1032156
Mar 7 01:30:26.191467 kernel: Policy zone: Normal
Mar 7 01:30:26.191473 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 7 01:30:26.191480 kernel: software IO TLB: area num 2.
Mar 7 01:30:26.191487 kernel: software IO TLB: mapped [mem 0x000000003a44e000-0x000000003e44e000] (64MB)
Mar 7 01:30:26.191510 kernel: Memory: 3982640K/4194160K available (10304K kernel code, 2180K rwdata, 8116K rodata, 39424K init, 897K bss, 211520K reserved, 0K cma-reserved)
Mar 7 01:30:26.191517 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 7 01:30:26.191524 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 7 01:30:26.191531 kernel: rcu: RCU event tracing is enabled.
Mar 7 01:30:26.191538 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 7 01:30:26.191545 kernel: Trampoline variant of Tasks RCU enabled.
Mar 7 01:30:26.191552 kernel: Tracing variant of Tasks RCU enabled.
Mar 7 01:30:26.191559 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 7 01:30:26.191566 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 7 01:30:26.191573 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 7 01:30:26.191579 kernel: GICv3: 960 SPIs implemented
Mar 7 01:30:26.191588 kernel: GICv3: 0 Extended SPIs implemented
Mar 7 01:30:26.191595 kernel: Root IRQ handler: gic_handle_irq
Mar 7 01:30:26.191602 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Mar 7 01:30:26.191608 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Mar 7 01:30:26.191615 kernel: ITS: No ITS available, not enabling LPIs
Mar 7 01:30:26.191622 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 7 01:30:26.191629 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 7 01:30:26.191636 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 7 01:30:26.191643 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 7 01:30:26.191651 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 7 01:30:26.191657 kernel: Console: colour dummy device 80x25
Mar 7 01:30:26.191666 kernel: printk: console [tty1] enabled
Mar 7 01:30:26.191673 kernel: ACPI: Core revision 20230628
Mar 7 01:30:26.191680 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 7 01:30:26.191687 kernel: pid_max: default: 32768 minimum: 301
Mar 7 01:30:26.191694 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 7 01:30:26.191701 kernel: landlock: Up and running.
Mar 7 01:30:26.191708 kernel: SELinux: Initializing.
Mar 7 01:30:26.191715 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 01:30:26.191723 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 01:30:26.191731 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:30:26.191738 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:30:26.191745 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3a8030, hints 0x100000e, misc 0x31e1
Mar 7 01:30:26.191752 kernel: Hyper-V: Host Build 10.0.26100.1480-1-0
Mar 7 01:30:26.191760 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 7 01:30:26.191767 kernel: rcu: Hierarchical SRCU implementation.
Mar 7 01:30:26.191774 kernel: rcu: Max phase no-delay instances is 400.
Mar 7 01:30:26.191781 kernel: Remapping and enabling EFI services.
Mar 7 01:30:26.191794 kernel: smp: Bringing up secondary CPUs ...
Mar 7 01:30:26.191801 kernel: Detected PIPT I-cache on CPU1
Mar 7 01:30:26.191809 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Mar 7 01:30:26.191816 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 7 01:30:26.191825 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 7 01:30:26.191832 kernel: smp: Brought up 1 node, 2 CPUs
Mar 7 01:30:26.191840 kernel: SMP: Total of 2 processors activated.
Mar 7 01:30:26.191847 kernel: CPU features: detected: 32-bit EL0 Support
Mar 7 01:30:26.191855 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Mar 7 01:30:26.191863 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 7 01:30:26.191871 kernel: CPU features: detected: CRC32 instructions
Mar 7 01:30:26.191878 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 7 01:30:26.191886 kernel: CPU features: detected: LSE atomic instructions
Mar 7 01:30:26.191893 kernel: CPU features: detected: Privileged Access Never
Mar 7 01:30:26.191900 kernel: CPU: All CPU(s) started at EL1
Mar 7 01:30:26.191908 kernel: alternatives: applying system-wide alternatives
Mar 7 01:30:26.191915 kernel: devtmpfs: initialized
Mar 7 01:30:26.191922 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 7 01:30:26.191931 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 7 01:30:26.191938 kernel: pinctrl core: initialized pinctrl subsystem
Mar 7 01:30:26.191946 kernel: SMBIOS 3.1.0 present.
Mar 7 01:30:26.191953 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Mar 7 01:30:26.191960 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 7 01:30:26.191968 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 7 01:30:26.191975 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 7 01:30:26.191983 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 7 01:30:26.191990 kernel: audit: initializing netlink subsys (disabled)
Mar 7 01:30:26.191999 kernel: audit: type=2000 audit(0.047:1): state=initialized audit_enabled=0 res=1
Mar 7 01:30:26.192007 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 7 01:30:26.192014 kernel: cpuidle: using governor menu
Mar 7 01:30:26.192021 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 7 01:30:26.192029 kernel: ASID allocator initialised with 32768 entries
Mar 7 01:30:26.192036 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 7 01:30:26.192043 kernel: Serial: AMBA PL011 UART driver
Mar 7 01:30:26.192051 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 7 01:30:26.192058 kernel: Modules: 0 pages in range for non-PLT usage
Mar 7 01:30:26.192067 kernel: Modules: 509008 pages in range for PLT usage
Mar 7 01:30:26.192074 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 7 01:30:26.192081 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 7 01:30:26.192089 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 7 01:30:26.192096 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 7 01:30:26.192103 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 7 01:30:26.192111 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 7 01:30:26.192118 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 7 01:30:26.192126 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 7 01:30:26.192134 kernel: ACPI: Added _OSI(Module Device)
Mar 7 01:30:26.192142 kernel: ACPI: Added _OSI(Processor Device)
Mar 7 01:30:26.192149 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 7 01:30:26.192156 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 7 01:30:26.192163 kernel: ACPI: Interpreter enabled
Mar 7 01:30:26.192171 kernel: ACPI: Using GIC for interrupt routing
Mar 7 01:30:26.192178 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Mar 7 01:30:26.192185 kernel: printk: console [ttyAMA0] enabled
Mar 7 01:30:26.192193 kernel: printk: bootconsole [pl11] disabled
Mar 7 01:30:26.192201 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Mar 7 01:30:26.192209 kernel: iommu: Default domain type: Translated
Mar 7 01:30:26.192216 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 7 01:30:26.192223 kernel: efivars: Registered efivars operations
Mar 7 01:30:26.192231 kernel: vgaarb: loaded
Mar 7 01:30:26.192238 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 7 01:30:26.192245 kernel: VFS: Disk quotas dquot_6.6.0
Mar 7 01:30:26.192252 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 7 01:30:26.192260 kernel: pnp: PnP ACPI init
Mar 7 01:30:26.192269 kernel: pnp: PnP ACPI: found 0 devices
Mar 7 01:30:26.192276 kernel: NET: Registered PF_INET protocol family
Mar 7 01:30:26.192283 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 7 01:30:26.192291 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 7 01:30:26.192298 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 7 01:30:26.192306 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 7 01:30:26.192314 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 7 01:30:26.192321 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 7 01:30:26.192329 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 01:30:26.192338 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 01:30:26.192345 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 7 01:30:26.192353 kernel: PCI: CLS 0 bytes, default 64
Mar 7 01:30:26.192360 kernel: kvm [1]: HYP mode not available
Mar 7 01:30:26.192367 kernel: Initialise system trusted keyrings
Mar 7 01:30:26.192375 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 7 01:30:26.192382 kernel: Key type asymmetric registered
Mar 7 01:30:26.192389 kernel: Asymmetric key parser 'x509' registered
Mar 7 01:30:26.192396 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 7 01:30:26.192405 kernel: io scheduler mq-deadline registered
Mar 7 01:30:26.192412 kernel: io scheduler kyber registered
Mar 7 01:30:26.192419 kernel: io scheduler bfq registered
Mar 7 01:30:26.192427 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 7 01:30:26.192434 kernel: thunder_xcv, ver 1.0
Mar 7 01:30:26.192441 kernel: thunder_bgx, ver 1.0
Mar 7 01:30:26.192448 kernel: nicpf, ver 1.0
Mar 7 01:30:26.192455 kernel: nicvf, ver 1.0
Mar 7 01:30:26.192640 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 7 01:30:26.192716 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-07T01:30:25 UTC (1772847025)
Mar 7 01:30:26.192726 kernel: efifb: probing for efifb
Mar 7 01:30:26.192734 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 7 01:30:26.192741 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 7 01:30:26.192748 kernel: efifb: scrolling: redraw
Mar 7 01:30:26.192756 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 7 01:30:26.192763 kernel: Console: switching to colour frame buffer device 128x48
Mar 7 01:30:26.192771 kernel: fb0: EFI VGA frame buffer device
Mar 7 01:30:26.192780 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Mar 7 01:30:26.192787 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 7 01:30:26.192795 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 6 counters available
Mar 7 01:30:26.192802 kernel: watchdog: Delayed init of the lockup detector failed: -19
Mar 7 01:30:26.192810 kernel: watchdog: Hard watchdog permanently disabled
Mar 7 01:30:26.192817 kernel: NET: Registered PF_INET6 protocol family
Mar 7 01:30:26.192824 kernel: Segment Routing with IPv6
Mar 7 01:30:26.192832 kernel: In-situ OAM (IOAM) with IPv6
Mar 7 01:30:26.192839 kernel: NET: Registered PF_PACKET protocol family
Mar 7 01:30:26.192848 kernel: Key type dns_resolver registered
Mar 7 01:30:26.192855 kernel: registered taskstats version 1
Mar 7 01:30:26.192862 kernel: Loading compiled-in X.509 certificates
Mar 7 01:30:26.192870 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: e62b4e4ebcb406beff1271ecc7444548c4ab67e9'
Mar 7 01:30:26.192877 kernel: Key type .fscrypt registered
Mar 7 01:30:26.192884 kernel: Key type fscrypt-provisioning registered
Mar 7 01:30:26.192891 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 7 01:30:26.192899 kernel: ima: Allocated hash algorithm: sha1
Mar 7 01:30:26.192906 kernel: ima: No architecture policies found
Mar 7 01:30:26.192915 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 7 01:30:26.192923 kernel: clk: Disabling unused clocks
Mar 7 01:30:26.192930 kernel: Freeing unused kernel memory: 39424K
Mar 7 01:30:26.192937 kernel: Run /init as init process
Mar 7 01:30:26.192944 kernel: with arguments:
Mar 7 01:30:26.192951 kernel: /init
Mar 7 01:30:26.192959 kernel: with environment:
Mar 7 01:30:26.192966 kernel: HOME=/
Mar 7 01:30:26.192973 kernel: TERM=linux
Mar 7 01:30:26.192982 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 01:30:26.192993 systemd[1]: Detected virtualization microsoft.
Mar 7 01:30:26.193001 systemd[1]: Detected architecture arm64.
Mar 7 01:30:26.193008 systemd[1]: Running in initrd.
Mar 7 01:30:26.193016 systemd[1]: No hostname configured, using default hostname.
Mar 7 01:30:26.193023 systemd[1]: Hostname set to .
Mar 7 01:30:26.193032 systemd[1]: Initializing machine ID from random generator.
Mar 7 01:30:26.193041 systemd[1]: Queued start job for default target initrd.target.
Mar 7 01:30:26.193049 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:30:26.193057 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:30:26.193065 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 7 01:30:26.193073 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 01:30:26.193081 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 7 01:30:26.193089 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 7 01:30:26.193099 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 7 01:30:26.193108 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 7 01:30:26.193116 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:30:26.193124 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 01:30:26.193132 systemd[1]: Reached target paths.target - Path Units.
Mar 7 01:30:26.193140 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 01:30:26.193148 systemd[1]: Reached target swap.target - Swaps.
Mar 7 01:30:26.193156 systemd[1]: Reached target timers.target - Timer Units.
Mar 7 01:30:26.193164 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 01:30:26.193173 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 01:30:26.193181 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 7 01:30:26.193189 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Mar 7 01:30:26.193197 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 01:30:26.193205 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 01:30:26.193213 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 01:30:26.193221 systemd[1]: Reached target sockets.target - Socket Units.
Mar 7 01:30:26.193229 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 7 01:30:26.193239 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 01:30:26.193247 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 7 01:30:26.193255 systemd[1]: Starting systemd-fsck-usr.service...
Mar 7 01:30:26.193262 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 01:30:26.193270 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 01:30:26.193291 systemd-journald[218]: Collecting audit messages is disabled.
Mar 7 01:30:26.193312 systemd-journald[218]: Journal started
Mar 7 01:30:26.193331 systemd-journald[218]: Runtime Journal (/run/log/journal/217746a6dcc04c1c9c8d08ebc8846cda) is 8.0M, max 78.5M, 70.5M free.
Mar 7 01:30:26.212972 systemd-modules-load[219]: Inserted module 'overlay'
Mar 7 01:30:26.222792 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:26.236503 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 7 01:30:26.243977 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 01:30:26.244018 kernel: Bridge firewalling registered
Mar 7 01:30:26.244106 systemd-modules-load[219]: Inserted module 'br_netfilter'
Mar 7 01:30:26.248977 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 7 01:30:26.257462 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 01:30:26.268512 systemd[1]: Finished systemd-fsck-usr.service.
Mar 7 01:30:26.275622 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 01:30:26.283563 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:26.304999 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:30:26.312667 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 01:30:26.327677 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 7 01:30:26.349683 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 01:30:26.358483 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:30:26.371519 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 01:30:26.381816 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 01:30:26.400525 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 01:30:26.414735 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 7 01:30:26.422688 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 7 01:30:26.441751 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 01:30:26.455866 dracut-cmdline[253]: dracut-dracut-053
Mar 7 01:30:26.455866 dracut-cmdline[253]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6
Mar 7 01:30:26.492764 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 01:30:26.504405 systemd-resolved[254]: Positive Trust Anchors:
Mar 7 01:30:26.504415 systemd-resolved[254]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 7 01:30:26.504446 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 7 01:30:26.506604 systemd-resolved[254]: Defaulting to hostname 'linux'.
Mar 7 01:30:26.508330 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 7 01:30:26.517120 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 7 01:30:26.621518 kernel: SCSI subsystem initialized
Mar 7 01:30:26.628508 kernel: Loading iSCSI transport class v2.0-870.
Mar 7 01:30:26.638519 kernel: iscsi: registered transport (tcp)
Mar 7 01:30:26.654908 kernel: iscsi: registered transport (qla4xxx)
Mar 7 01:30:26.654925 kernel: QLogic iSCSI HBA Driver
Mar 7 01:30:26.694815 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 7 01:30:26.706970 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 7 01:30:26.734364 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 7 01:30:26.734422 kernel: device-mapper: uevent: version 1.0.3
Mar 7 01:30:26.739508 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 7 01:30:26.786511 kernel: raid6: neonx8 gen() 15811 MB/s
Mar 7 01:30:26.805504 kernel: raid6: neonx4 gen() 15692 MB/s
Mar 7 01:30:26.824499 kernel: raid6: neonx2 gen() 13315 MB/s
Mar 7 01:30:26.844501 kernel: raid6: neonx1 gen() 10488 MB/s
Mar 7 01:30:26.863499 kernel: raid6: int64x8 gen() 6974 MB/s
Mar 7 01:30:26.883506 kernel: raid6: int64x4 gen() 7366 MB/s
Mar 7 01:30:26.903499 kernel: raid6: int64x2 gen() 6146 MB/s
Mar 7 01:30:26.925078 kernel: raid6: int64x1 gen() 5072 MB/s
Mar 7 01:30:26.925099 kernel: raid6: using algorithm neonx8 gen() 15811 MB/s
Mar 7 01:30:26.947132 kernel: raid6: .... xor() 12049 MB/s, rmw enabled
Mar 7 01:30:26.947158 kernel: raid6: using neon recovery algorithm
Mar 7 01:30:26.957494 kernel: xor: measuring software checksum speed
Mar 7 01:30:26.957510 kernel: 8regs : 19726 MB/sec
Mar 7 01:30:26.960311 kernel: 32regs : 19660 MB/sec
Mar 7 01:30:26.963005 kernel: arm64_neon : 27123 MB/sec
Mar 7 01:30:26.966326 kernel: xor: using function: arm64_neon (27123 MB/sec)
Mar 7 01:30:27.016508 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 7 01:30:27.025363 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 01:30:27.040617 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 01:30:27.060889 systemd-udevd[439]: Using default interface naming scheme 'v255'.
Mar 7 01:30:27.065049 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 01:30:27.081603 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 7 01:30:27.096668 dracut-pre-trigger[449]: rd.md=0: removing MD RAID activation
Mar 7 01:30:27.123253 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 01:30:27.135825 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 01:30:27.175764 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 01:30:27.196197 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 7 01:30:27.226742 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 7 01:30:27.236758 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 01:30:27.256325 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 01:30:27.271435 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 01:30:27.286514 kernel: hv_vmbus: Vmbus version:5.3
Mar 7 01:30:27.287670 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 7 01:30:27.320951 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 7 01:30:27.320973 kernel: hv_vmbus: registering driver hyperv_keyboard
Mar 7 01:30:27.320983 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Mar 7 01:30:27.320993 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Mar 7 01:30:27.323692 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 01:30:27.323906 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:30:27.346293 kernel: PTP clock support registered
Mar 7 01:30:27.333655 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:30:27.449735 kernel: hv_utils: Registering HyperV Utility Driver
Mar 7 01:30:27.449757 kernel: hv_vmbus: registering driver hv_utils
Mar 7 01:30:27.449767 kernel: hv_utils: Heartbeat IC version 3.0
Mar 7 01:30:27.449776 kernel: hv_utils: Shutdown IC version 3.2
Mar 7 01:30:27.449786 kernel: hv_utils: TimeSync IC version 4.0
Mar 7 01:30:27.449795 kernel: hv_vmbus: registering driver hv_netvsc
Mar 7 01:30:27.347552 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:30:27.347759 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:27.489826 kernel: hv_vmbus: registering driver hid_hyperv
Mar 7 01:30:27.489847 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Mar 7 01:30:27.489858 kernel: hv_vmbus: registering driver hv_storvsc
Mar 7 01:30:27.489867 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Mar 7 01:30:27.361465 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:27.500962 kernel: scsi host0: storvsc_host_t
Mar 7 01:30:27.444832 systemd-resolved[254]: Clock change detected. Flushing caches.
Mar 7 01:30:27.513739 kernel: scsi host1: storvsc_host_t
Mar 7 01:30:27.513913 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Mar 7 01:30:27.488187 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:27.529998 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Mar 7 01:30:27.513704 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 01:30:27.530150 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:30:27.530840 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:27.553042 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:27.577612 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Mar 7 01:30:27.577836 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 7 01:30:27.579586 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Mar 7 01:30:27.579960 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:27.591907 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:30:27.616617 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Mar 7 01:30:27.616808 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 7 01:30:27.616898 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 7 01:30:27.616989 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Mar 7 01:30:27.617077 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Mar 7 01:30:27.636142 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#11 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 7 01:30:27.636360 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:30:27.641572 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 7 01:30:27.657556 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:30:27.677560 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#181 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 7 01:30:27.785003 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Mar 7 01:30:27.811112 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by (udev-worker) (498)
Mar 7 01:30:27.811141 kernel: BTRFS: device fsid 237c8587-8110-47ef-99f9-37e4ed4d3b31 devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (499)
Mar 7 01:30:27.816323 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Mar 7 01:30:27.837752 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 7 01:30:27.847695 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Mar 7 01:30:27.852838 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Mar 7 01:30:27.875762 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 7 01:30:27.898828 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:30:27.907565 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:30:28.916567 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:30:28.917352 disk-uuid[595]: The operation has completed successfully.
Mar 7 01:30:28.983041 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 7 01:30:28.983156 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 7 01:30:29.009689 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 7 01:30:29.021180 sh[708]: Success
Mar 7 01:30:29.039723 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 7 01:30:29.132518 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 7 01:30:29.142628 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 7 01:30:29.151678 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 7 01:30:29.181691 kernel: BTRFS info (device dm-0): first mount of filesystem 237c8587-8110-47ef-99f9-37e4ed4d3b31
Mar 7 01:30:29.181748 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 7 01:30:29.187599 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 7 01:30:29.192238 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 7 01:30:29.196643 kernel: BTRFS info (device dm-0): using free space tree
Mar 7 01:30:29.266183 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 7 01:30:29.270871 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 7 01:30:29.287818 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 7 01:30:29.297477 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 7 01:30:29.327980 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:29.328043 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 01:30:29.331774 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:30:29.350056 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:30:29.355947 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 7 01:30:29.367580 kernel: BTRFS info (device sda6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:29.376590 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 7 01:30:29.391758 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 7 01:30:29.424170 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 01:30:29.439685 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 01:30:29.464238 systemd-networkd[892]: lo: Link UP
Mar 7 01:30:29.464249 systemd-networkd[892]: lo: Gained carrier
Mar 7 01:30:29.464995 systemd-networkd[892]: Enumeration completed
Mar 7 01:30:29.465099 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 01:30:29.465527 systemd-networkd[892]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:30:29.465530 systemd-networkd[892]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 01:30:29.470096 systemd-networkd[892]: eth0: Link UP
Mar 7 01:30:29.470499 systemd-networkd[892]: eth0: Gained carrier
Mar 7 01:30:29.470509 systemd-networkd[892]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:30:29.472888 systemd[1]: Reached target network.target - Network.
Mar 7 01:30:29.504586 systemd-networkd[892]: eth0: DHCPv4 address 10.200.20.32/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 7 01:30:29.648615 ignition[839]: Ignition 2.19.0
Mar 7 01:30:29.651337 ignition[839]: Stage: fetch-offline
Mar 7 01:30:29.651384 ignition[839]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:29.654873 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 01:30:29.651392 ignition[839]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:29.651495 ignition[839]: parsed url from cmdline: ""
Mar 7 01:30:29.651498 ignition[839]: no config URL provided
Mar 7 01:30:29.651503 ignition[839]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 01:30:29.651510 ignition[839]: no config at "/usr/lib/ignition/user.ign"
Mar 7 01:30:29.651517 ignition[839]: failed to fetch config: resource requires networking
Mar 7 01:30:29.681789 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 7 01:30:29.651706 ignition[839]: Ignition finished successfully
Mar 7 01:30:29.702168 ignition[901]: Ignition 2.19.0
Mar 7 01:30:29.702182 ignition[901]: Stage: fetch
Mar 7 01:30:29.702390 ignition[901]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:29.702400 ignition[901]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:29.702501 ignition[901]: parsed url from cmdline: ""
Mar 7 01:30:29.702504 ignition[901]: no config URL provided
Mar 7 01:30:29.702508 ignition[901]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 01:30:29.702516 ignition[901]: no config at "/usr/lib/ignition/user.ign"
Mar 7 01:30:29.702559 ignition[901]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Mar 7 01:30:29.804330 ignition[901]: GET result: OK
Mar 7 01:30:29.804429 ignition[901]: config has been read from IMDS userdata
Mar 7 01:30:29.804472 ignition[901]: parsing config with SHA512: 28ad04719edd5619ee215ec54f17d8153d397a9c861fe92d7f200cedccbc41128695d08253edcd319c754e205202767e432b76447997bc052fcc92bdf9555a1e
Mar 7 01:30:29.808561 unknown[901]: fetched base config from "system"
Mar 7 01:30:29.810267 ignition[901]: fetch: fetch complete
Mar 7 01:30:29.808576 unknown[901]: fetched base config from "system"
Mar 7 01:30:29.810273 ignition[901]: fetch: fetch passed
Mar 7 01:30:29.808582 unknown[901]: fetched user config from "azure"
Mar 7 01:30:29.810336 ignition[901]: Ignition finished successfully
Mar 7 01:30:29.812335 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 7 01:30:29.835710 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 7 01:30:29.850674 ignition[907]: Ignition 2.19.0
Mar 7 01:30:29.850683 ignition[907]: Stage: kargs
Mar 7 01:30:29.856590 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 7 01:30:29.850872 ignition[907]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:29.850885 ignition[907]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:29.870716 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 7 01:30:29.851883 ignition[907]: kargs: kargs passed
Mar 7 01:30:29.851953 ignition[907]: Ignition finished successfully
Mar 7 01:30:29.889308 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 7 01:30:29.885717 ignition[913]: Ignition 2.19.0
Mar 7 01:30:29.894175 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 7 01:30:29.885724 ignition[913]: Stage: disks
Mar 7 01:30:29.902494 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 7 01:30:29.885888 ignition[913]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:29.912583 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 01:30:29.885897 ignition[913]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:29.919926 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 01:30:29.886780 ignition[913]: disks: disks passed
Mar 7 01:30:29.929294 systemd[1]: Reached target basic.target - Basic System.
Mar 7 01:30:29.886826 ignition[913]: Ignition finished successfully
Mar 7 01:30:29.949748 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 7 01:30:30.001926 systemd-fsck[921]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Mar 7 01:30:30.009801 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 7 01:30:30.027710 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 7 01:30:30.085558 kernel: EXT4-fs (sda9): mounted filesystem 596a8ea8-9d3d-4d06-a56e-9d3ebd3cb76d r/w with ordered data mode. Quota mode: none.
Mar 7 01:30:30.085917 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 7 01:30:30.090388 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 7 01:30:30.115618 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 01:30:30.126128 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 7 01:30:30.132730 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 7 01:30:30.148566 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 7 01:30:30.148610 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 01:30:30.185137 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: VF slot 1 added
Mar 7 01:30:30.185314 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (932)
Mar 7 01:30:30.155888 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 7 01:30:30.200523 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:30.200616 kernel: hv_vmbus: registering driver hv_pci
Mar 7 01:30:30.200636 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 01:30:30.199291 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 7 01:30:30.224271 kernel: hv_pci cd5bce2c-8d5e-41d5-a78a-0a161c517b26: PCI VMBus probing: Using version 0x10004
Mar 7 01:30:30.224456 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:30:30.224467 kernel: hv_pci cd5bce2c-8d5e-41d5-a78a-0a161c517b26: PCI host bridge to bus 8d5e:00
Mar 7 01:30:30.234727 kernel: pci_bus 8d5e:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Mar 7 01:30:30.239483 kernel: pci_bus 8d5e:00: No busn resource found for root bus, will use [bus 00-ff]
Mar 7 01:30:30.245147 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:30:30.245189 kernel: pci 8d5e:00:02.0: [15b3:1018] type 00 class 0x020000
Mar 7 01:30:30.249803 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 01:30:30.266592 kernel: pci 8d5e:00:02.0: reg 0x10: [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 7 01:30:30.271609 kernel: pci 8d5e:00:02.0: enabling Extended Tags
Mar 7 01:30:30.290671 kernel: pci 8d5e:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 8d5e:00:02.0 (capable of 126.016 Gb/s with 8.0 GT/s PCIe x16 link)
Mar 7 01:30:30.301223 kernel: pci_bus 8d5e:00: busn_res: [bus 00-ff] end is updated to 00
Mar 7 01:30:30.301442 kernel: pci 8d5e:00:02.0: BAR 0: assigned [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 7 01:30:30.339683 coreos-metadata[934]: Mar 07 01:30:30.339 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 7 01:30:30.351733 coreos-metadata[934]: Mar 07 01:30:30.351 INFO Fetch successful
Mar 7 01:30:30.351733 coreos-metadata[934]: Mar 07 01:30:30.351 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Mar 7 01:30:30.371074 kernel: mlx5_core 8d5e:00:02.0: enabling device (0000 -> 0002)
Mar 7 01:30:30.371751 kernel: mlx5_core 8d5e:00:02.0: firmware version: 16.30.5026
Mar 7 01:30:30.371852 coreos-metadata[934]: Mar 07 01:30:30.370 INFO Fetch successful
Mar 7 01:30:30.378712 coreos-metadata[934]: Mar 07 01:30:30.374 INFO wrote hostname ci-4081.3.6-n-3151c5d0e2 to /sysroot/etc/hostname
Mar 7 01:30:30.385090 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 01:30:30.461383 initrd-setup-root[964]: cut: /sysroot/etc/passwd: No such file or directory
Mar 7 01:30:30.481984 initrd-setup-root[971]: cut: /sysroot/etc/group: No such file or directory
Mar 7 01:30:30.495193 initrd-setup-root[978]: cut: /sysroot/etc/shadow: No such file or directory
Mar 7 01:30:30.506247 initrd-setup-root[989]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 7 01:30:30.578349 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: VF registering: eth1
Mar 7 01:30:30.578566 kernel: mlx5_core 8d5e:00:02.0 eth1: joined to eth0
Mar 7 01:30:30.586655 kernel: mlx5_core 8d5e:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Mar 7 01:30:30.597647 kernel: mlx5_core 8d5e:00:02.0 enP36190s1: renamed from eth1
Mar 7 01:30:30.606757 systemd-networkd[892]: eth1: Interface name change detected, renamed to enP36190s1.
Mar 7 01:30:30.652740 systemd-networkd[892]: eth0: Gained IPv6LL
Mar 7 01:30:30.725202 systemd-networkd[892]: enP36190s1: Link UP
Mar 7 01:30:30.728756 kernel: mlx5_core 8d5e:00:02.0 enP36190s1: Link up
Mar 7 01:30:30.769513 systemd-networkd[892]: enP36190s1: Gained carrier
Mar 7 01:30:30.773505 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: Data path switched to VF: enP36190s1
Mar 7 01:30:30.840134 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 7 01:30:30.860679 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 7 01:30:30.870746 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 7 01:30:30.885970 kernel: BTRFS info (device sda6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:30.881504 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 7 01:30:30.908970 ignition[1061]: INFO : Ignition 2.19.0
Mar 7 01:30:30.908970 ignition[1061]: INFO : Stage: mount
Mar 7 01:30:30.908970 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:30.908970 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:30.939778 ignition[1061]: INFO : mount: mount passed
Mar 7 01:30:30.939778 ignition[1061]: INFO : Ignition finished successfully
Mar 7 01:30:30.914809 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 7 01:30:30.924751 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 7 01:30:30.938384 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 7 01:30:30.965886 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 01:30:30.984696 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 scanned by mount (1073)
Mar 7 01:30:30.984744 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:30.995032 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 01:30:30.998473 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:30:31.006500 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:30:31.006713 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 01:30:31.029423 ignition[1089]: INFO : Ignition 2.19.0
Mar 7 01:30:31.029423 ignition[1089]: INFO : Stage: files
Mar 7 01:30:31.035898 ignition[1089]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:31.035898 ignition[1089]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:31.035898 ignition[1089]: DEBUG : files: compiled without relabeling support, skipping
Mar 7 01:30:31.035898 ignition[1089]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 7 01:30:31.035898 ignition[1089]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Mar 7 01:30:31.058890 unknown[1089]: wrote ssh authorized keys file for user: core
Mar 7 01:30:38.297063 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 7 01:30:38.459471 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-arm64.raw: attempt #1
Mar 7 01:30:39.098052 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 7 01:30:39.514330 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 01:30:39.514330 ignition[1089]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 7 01:30:39.567316 ignition[1089]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 01:30:39.567316 ignition[1089]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 01:30:39.567316 ignition[1089]: INFO : files: files passed
Mar 7 01:30:39.567316 ignition[1089]: INFO : Ignition finished successfully
Mar 7 01:30:39.540823 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 7 01:30:39.567849 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 7 01:30:39.581752 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 7 01:30:39.599877 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 7 01:30:39.628657 initrd-setup-root-after-ignition[1118]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:30:39.628657 initrd-setup-root-after-ignition[1118]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:30:39.602271 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 7 01:30:39.656057 initrd-setup-root-after-ignition[1122]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:30:39.625871 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 01:30:39.633952 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 7 01:30:39.668782 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 7 01:30:39.699866 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 7 01:30:39.699985 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 7 01:30:39.709807 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 7 01:30:39.719454 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 7 01:30:39.728314 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 7 01:30:39.742831 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 7 01:30:39.760949 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 01:30:39.773792 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 7 01:30:39.792499 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 7 01:30:39.797391 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 01:30:39.806809 systemd[1]: Stopped target timers.target - Timer Units.
Mar 7 01:30:39.815271 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 7 01:30:39.815334 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 01:30:39.827551 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 7 01:30:39.836682 systemd[1]: Stopped target basic.target - Basic System.
Mar 7 01:30:39.844596 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 7 01:30:39.852659 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 01:30:39.861873 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 7 01:30:39.871178 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 7 01:30:39.879868 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 01:30:39.888966 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 7 01:30:39.899470 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 7 01:30:39.908164 systemd[1]: Stopped target swap.target - Swaps.
Mar 7 01:30:39.915497 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 7 01:30:39.915568 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 01:30:39.927933 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 7 01:30:39.936613 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:30:39.946314 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 7 01:30:39.946349 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:30:39.956703 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 7 01:30:39.956765 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 7 01:30:39.970868 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 7 01:30:39.970918 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 01:30:39.979752 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 7 01:30:39.979790 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 7 01:30:39.988290 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 7 01:30:39.988323 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 01:30:40.011727 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 7 01:30:40.046647 ignition[1143]: INFO : Ignition 2.19.0
Mar 7 01:30:40.046647 ignition[1143]: INFO : Stage: umount
Mar 7 01:30:40.046647 ignition[1143]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:40.046647 ignition[1143]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:40.046647 ignition[1143]: INFO : umount: umount passed
Mar 7 01:30:40.046647 ignition[1143]: INFO : Ignition finished successfully
Mar 7 01:30:40.022121 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 7 01:30:40.022202 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 01:30:40.034678 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 7 01:30:40.041138 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 7 01:30:40.041197 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 01:30:40.050541 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 7 01:30:40.050604 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 01:30:40.063463 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 7 01:30:40.063575 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 7 01:30:40.075272 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 7 01:30:40.075363 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 7 01:30:40.085509 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 7 01:30:40.085864 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 7 01:30:40.085901 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 7 01:30:40.092342 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 7 01:30:40.092385 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 7 01:30:40.105698 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 7 01:30:40.105745 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 7 01:30:40.115831 systemd[1]: Stopped target network.target - Network.
Mar 7 01:30:40.123572 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 7 01:30:40.123615 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 01:30:40.133012 systemd[1]: Stopped target paths.target - Path Units.
Mar 7 01:30:40.141524 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 7 01:30:40.145563 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:30:40.151227 systemd[1]: Stopped target slices.target - Slice Units.
Mar 7 01:30:40.159840 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 7 01:30:40.168189 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 7 01:30:40.168236 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 01:30:40.177706 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 7 01:30:40.177738 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 01:30:40.185809 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 7 01:30:40.185849 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 7 01:30:40.194245 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 7 01:30:40.194275 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 7 01:30:40.202399 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 7 01:30:40.210481 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 7 01:30:40.218846 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 7 01:30:40.218935 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 7 01:30:40.222572 systemd-networkd[892]: eth0: DHCPv6 lease lost
Mar 7 01:30:40.394389 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: Data path switched from VF: enP36190s1
Mar 7 01:30:40.228486 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 7 01:30:40.228601 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 7 01:30:40.240359 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 7 01:30:40.240575 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 01:30:40.247962 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 7 01:30:40.248019 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 7 01:30:40.278676 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 7 01:30:40.288804 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 7 01:30:40.288902 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 01:30:40.297882 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 01:30:40.307820 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 7 01:30:40.307914 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 7 01:30:40.337871 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 7 01:30:40.338339 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 01:30:40.348207 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 7 01:30:40.348277 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 7 01:30:40.356287 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 7 01:30:40.356332 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 01:30:40.365139 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 7 01:30:40.365185 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 01:30:40.377665 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 7 01:30:40.377714 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 7 01:30:40.394459 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 01:30:40.394520 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:30:40.415678 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 7 01:30:40.422602 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 7 01:30:40.577412 systemd-journald[218]: Received SIGTERM from PID 1 (systemd).
Mar 7 01:30:40.422670 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 7 01:30:40.433553 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 7 01:30:40.433606 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 7 01:30:40.442793 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 7 01:30:40.442837 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 01:30:40.452483 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 7 01:30:40.452525 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 01:30:40.462177 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:30:40.462221 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:40.471533 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 7 01:30:40.473684 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 7 01:30:40.481040 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 7 01:30:40.483198 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 7 01:30:40.491986 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 7 01:30:40.509818 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 7 01:30:40.523041 systemd[1]: Switching root.
Mar 7 01:30:40.648942 systemd-journald[218]: Journal stopped
Mar 7 01:30:26.190944 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 7 01:30:26.190965 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Mar 6 22:59:59 -00 2026
Mar 7 01:30:26.190973 kernel: KASLR enabled
Mar 7 01:30:26.190978 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Mar 7 01:30:26.190985 kernel: printk: bootconsole [pl11] enabled
Mar 7 01:30:26.190991 kernel: efi: EFI v2.7 by EDK II
Mar 7 01:30:26.190998 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f215018 RNG=0x3fd5f998 MEMRESERVE=0x3e44ee18
Mar 7 01:30:26.191004 kernel: random: crng init done
Mar 7 01:30:26.191010 kernel: ACPI: Early table checksum verification disabled
Mar 7 01:30:26.191016 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Mar 7 01:30:26.191022 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191028 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191036 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Mar 7 01:30:26.191043 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191050 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191056 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191063 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191071 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191077 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191084 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Mar 7 01:30:26.191090 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 01:30:26.191097 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Mar 7 01:30:26.191103 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
Mar 7 01:30:26.191109 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff]
Mar 7 01:30:26.191116 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff]
Mar 7 01:30:26.191122 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff]
Mar 7 01:30:26.191129 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff]
Mar 7 01:30:26.191135 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff]
Mar 7 01:30:26.191143 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff]
Mar 7 01:30:26.191149 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff]
Mar 7 01:30:26.191156 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff]
Mar 7 01:30:26.191162 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff]
Mar 7 01:30:26.191168 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff]
Mar 7 01:30:26.191175 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff]
Mar 7 01:30:26.191181 kernel: NUMA: NODE_DATA [mem 0x1bf7f0800-0x1bf7f5fff]
Mar 7 01:30:26.191187 kernel: Zone ranges:
Mar 7 01:30:26.191194 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Mar 7 01:30:26.191200 kernel: DMA32 empty
Mar 7 01:30:26.191206 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Mar 7 01:30:26.191212 kernel: Movable zone start for each node
Mar 7 01:30:26.191223 kernel: Early memory node ranges
Mar 7 01:30:26.191230 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Mar 7 01:30:26.191237 kernel: node 0: [mem 0x0000000000824000-0x000000003e54ffff]
Mar 7 01:30:26.191244 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Mar 7 01:30:26.191250 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Mar 7 01:30:26.191258 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Mar 7 01:30:26.191265 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Mar 7 01:30:26.191272 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Mar 7 01:30:26.191279 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Mar 7 01:30:26.191285 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Mar 7 01:30:26.191292 kernel: psci: probing for conduit method from ACPI.
Mar 7 01:30:26.191299 kernel: psci: PSCIv1.1 detected in firmware.
Mar 7 01:30:26.191306 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 7 01:30:26.191312 kernel: psci: MIGRATE_INFO_TYPE not supported.
Mar 7 01:30:26.191319 kernel: psci: SMC Calling Convention v1.4
Mar 7 01:30:26.191326 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 7 01:30:26.191333 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 7 01:30:26.191341 kernel: percpu: Embedded 30 pages/cpu s85736 r8192 d28952 u122880
Mar 7 01:30:26.191348 kernel: pcpu-alloc: s85736 r8192 d28952 u122880 alloc=30*4096
Mar 7 01:30:26.191354 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 7 01:30:26.191361 kernel: Detected PIPT I-cache on CPU0
Mar 7 01:30:26.191368 kernel: CPU features: detected: GIC system register CPU interface
Mar 7 01:30:26.191375 kernel: CPU features: detected: Hardware dirty bit management
Mar 7 01:30:26.191381 kernel: CPU features: detected: Spectre-BHB
Mar 7 01:30:26.191388 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 7 01:30:26.191395 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 7 01:30:26.191402 kernel: CPU features: detected: ARM erratum 1418040
Mar 7 01:30:26.191409 kernel: CPU features: detected: ARM erratum 1542419 (kernel portion)
Mar 7 01:30:26.191417 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 7 01:30:26.191424 kernel: alternatives: applying boot alternatives
Mar 7 01:30:26.191432 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6
Mar 7 01:30:26.191439 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 7 01:30:26.191446 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 7 01:30:26.191453 kernel: Fallback order for Node 0: 0
Mar 7 01:30:26.191460 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1032156
Mar 7 01:30:26.191467 kernel: Policy zone: Normal
Mar 7 01:30:26.191473 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 7 01:30:26.191480 kernel: software IO TLB: area num 2.
Mar 7 01:30:26.191487 kernel: software IO TLB: mapped [mem 0x000000003a44e000-0x000000003e44e000] (64MB)
Mar 7 01:30:26.191510 kernel: Memory: 3982640K/4194160K available (10304K kernel code, 2180K rwdata, 8116K rodata, 39424K init, 897K bss, 211520K reserved, 0K cma-reserved)
Mar 7 01:30:26.191517 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 7 01:30:26.191524 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 7 01:30:26.191531 kernel: rcu: RCU event tracing is enabled.
Mar 7 01:30:26.191538 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 7 01:30:26.191545 kernel: Trampoline variant of Tasks RCU enabled.
Mar 7 01:30:26.191552 kernel: Tracing variant of Tasks RCU enabled.
Mar 7 01:30:26.191559 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 7 01:30:26.191566 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 7 01:30:26.191573 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 7 01:30:26.191579 kernel: GICv3: 960 SPIs implemented
Mar 7 01:30:26.191588 kernel: GICv3: 0 Extended SPIs implemented
Mar 7 01:30:26.191595 kernel: Root IRQ handler: gic_handle_irq
Mar 7 01:30:26.191602 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Mar 7 01:30:26.191608 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Mar 7 01:30:26.191615 kernel: ITS: No ITS available, not enabling LPIs
Mar 7 01:30:26.191622 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 7 01:30:26.191629 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 7 01:30:26.191636 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 7 01:30:26.191643 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 7 01:30:26.191651 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 7 01:30:26.191657 kernel: Console: colour dummy device 80x25
Mar 7 01:30:26.191666 kernel: printk: console [tty1] enabled
Mar 7 01:30:26.191673 kernel: ACPI: Core revision 20230628
Mar 7 01:30:26.191680 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 7 01:30:26.191687 kernel: pid_max: default: 32768 minimum: 301
Mar 7 01:30:26.191694 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 7 01:30:26.191701 kernel: landlock: Up and running.
Mar 7 01:30:26.191708 kernel: SELinux: Initializing.
Mar 7 01:30:26.191715 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 01:30:26.191723 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 01:30:26.191731 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:30:26.191738 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:30:26.191745 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3a8030, hints 0x100000e, misc 0x31e1
Mar 7 01:30:26.191752 kernel: Hyper-V: Host Build 10.0.26100.1480-1-0
Mar 7 01:30:26.191760 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 7 01:30:26.191767 kernel: rcu: Hierarchical SRCU implementation.
Mar 7 01:30:26.191774 kernel: rcu: Max phase no-delay instances is 400.
Mar 7 01:30:26.191781 kernel: Remapping and enabling EFI services.
Mar 7 01:30:26.191794 kernel: smp: Bringing up secondary CPUs ...
Mar 7 01:30:26.191801 kernel: Detected PIPT I-cache on CPU1
Mar 7 01:30:26.191809 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Mar 7 01:30:26.191816 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 7 01:30:26.191825 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 7 01:30:26.191832 kernel: smp: Brought up 1 node, 2 CPUs
Mar 7 01:30:26.191840 kernel: SMP: Total of 2 processors activated.
Mar 7 01:30:26.191847 kernel: CPU features: detected: 32-bit EL0 Support
Mar 7 01:30:26.191855 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Mar 7 01:30:26.191863 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 7 01:30:26.191871 kernel: CPU features: detected: CRC32 instructions
Mar 7 01:30:26.191878 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 7 01:30:26.191886 kernel: CPU features: detected: LSE atomic instructions
Mar 7 01:30:26.191893 kernel: CPU features: detected: Privileged Access Never
Mar 7 01:30:26.191900 kernel: CPU: All CPU(s) started at EL1
Mar 7 01:30:26.191908 kernel: alternatives: applying system-wide alternatives
Mar 7 01:30:26.191915 kernel: devtmpfs: initialized
Mar 7 01:30:26.191922 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 7 01:30:26.191931 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 7 01:30:26.191938 kernel: pinctrl core: initialized pinctrl subsystem
Mar 7 01:30:26.191946 kernel: SMBIOS 3.1.0 present.
Mar 7 01:30:26.191953 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Mar 7 01:30:26.191960 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 7 01:30:26.191968 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 7 01:30:26.191975 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 7 01:30:26.191983 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 7 01:30:26.191990 kernel: audit: initializing netlink subsys (disabled)
Mar 7 01:30:26.191999 kernel: audit: type=2000 audit(0.047:1): state=initialized audit_enabled=0 res=1
Mar 7 01:30:26.192007 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 7 01:30:26.192014 kernel: cpuidle: using governor menu
Mar 7 01:30:26.192021 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 7 01:30:26.192029 kernel: ASID allocator initialised with 32768 entries
Mar 7 01:30:26.192036 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 7 01:30:26.192043 kernel: Serial: AMBA PL011 UART driver
Mar 7 01:30:26.192051 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 7 01:30:26.192058 kernel: Modules: 0 pages in range for non-PLT usage
Mar 7 01:30:26.192067 kernel: Modules: 509008 pages in range for PLT usage
Mar 7 01:30:26.192074 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 7 01:30:26.192081 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 7 01:30:26.192089 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 7 01:30:26.192096 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 7 01:30:26.192103 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 7 01:30:26.192111 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 7 01:30:26.192118 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 7 01:30:26.192126 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 7 01:30:26.192134 kernel: ACPI: Added _OSI(Module Device)
Mar 7 01:30:26.192142 kernel: ACPI: Added _OSI(Processor Device)
Mar 7 01:30:26.192149 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 7 01:30:26.192156 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 7 01:30:26.192163 kernel: ACPI: Interpreter enabled
Mar 7 01:30:26.192171 kernel: ACPI: Using GIC for interrupt routing
Mar 7 01:30:26.192178 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Mar 7 01:30:26.192185 kernel: printk: console [ttyAMA0] enabled
Mar 7 01:30:26.192193 kernel: printk: bootconsole [pl11] disabled
Mar 7 01:30:26.192201 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Mar 7 01:30:26.192209 kernel: iommu: Default domain type: Translated
Mar 7 01:30:26.192216 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 7 01:30:26.192223 kernel: efivars: Registered efivars operations
Mar 7 01:30:26.192231 kernel: vgaarb: loaded
Mar 7 01:30:26.192238 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 7 01:30:26.192245 kernel: VFS: Disk quotas dquot_6.6.0
Mar 7 01:30:26.192252 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 7 01:30:26.192260 kernel: pnp: PnP ACPI init
Mar 7 01:30:26.192269 kernel: pnp: PnP ACPI: found 0 devices
Mar 7 01:30:26.192276 kernel: NET: Registered PF_INET protocol family
Mar 7 01:30:26.192283 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 7 01:30:26.192291 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 7 01:30:26.192298 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 7 01:30:26.192306 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 7 01:30:26.192314 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 7 01:30:26.192321 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 7 01:30:26.192329 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 01:30:26.192338 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 01:30:26.192345 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 7 01:30:26.192353 kernel: PCI: CLS 0 bytes, default 64
Mar 7 01:30:26.192360 kernel: kvm [1]: HYP mode not available
Mar 7 01:30:26.192367 kernel: Initialise system trusted keyrings
Mar 7 01:30:26.192375 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 7 01:30:26.192382 kernel: Key type asymmetric registered
Mar 7 01:30:26.192389 kernel: Asymmetric key parser 'x509' registered
Mar 7 01:30:26.192396 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 7 01:30:26.192405 kernel: io scheduler mq-deadline registered
Mar 7 01:30:26.192412 kernel: io scheduler kyber registered
Mar 7 01:30:26.192419 kernel: io scheduler bfq registered
Mar 7 01:30:26.192427 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 7 01:30:26.192434 kernel: thunder_xcv, ver 1.0
Mar 7 01:30:26.192441 kernel: thunder_bgx, ver 1.0
Mar 7 01:30:26.192448 kernel: nicpf, ver 1.0
Mar 7 01:30:26.192455 kernel: nicvf, ver 1.0
Mar 7 01:30:26.192640 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 7 01:30:26.192716 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-07T01:30:25 UTC (1772847025)
Mar 7 01:30:26.192726 kernel: efifb: probing for efifb
Mar 7 01:30:26.192734 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 7 01:30:26.192741 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 7 01:30:26.192748 kernel: efifb: scrolling: redraw
Mar 7 01:30:26.192756 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 7 01:30:26.192763 kernel: Console: switching to colour frame buffer device 128x48
Mar 7 01:30:26.192771 kernel: fb0: EFI VGA frame buffer device
Mar 7 01:30:26.192780 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Mar 7 01:30:26.192787 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 7 01:30:26.192795 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 6 counters available
Mar 7 01:30:26.192802 kernel: watchdog: Delayed init of the lockup detector failed: -19
Mar 7 01:30:26.192810 kernel: watchdog: Hard watchdog permanently disabled
Mar 7 01:30:26.192817 kernel: NET: Registered PF_INET6 protocol family
Mar 7 01:30:26.192824 kernel: Segment Routing with IPv6
Mar 7 01:30:26.192832 kernel: In-situ OAM (IOAM) with IPv6
Mar 7 01:30:26.192839 kernel: NET: Registered PF_PACKET protocol family
Mar 7 01:30:26.192848 kernel: Key type dns_resolver registered
Mar 7 01:30:26.192855 kernel: registered taskstats version 1
Mar 7 01:30:26.192862 kernel: Loading compiled-in X.509 certificates
Mar 7 01:30:26.192870 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: e62b4e4ebcb406beff1271ecc7444548c4ab67e9'
Mar 7 01:30:26.192877 kernel: Key type .fscrypt registered
Mar 7 01:30:26.192884 kernel: Key type fscrypt-provisioning registered
Mar 7 01:30:26.192891 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 7 01:30:26.192899 kernel: ima: Allocated hash algorithm: sha1
Mar 7 01:30:26.192906 kernel: ima: No architecture policies found
Mar 7 01:30:26.192915 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 7 01:30:26.192923 kernel: clk: Disabling unused clocks
Mar 7 01:30:26.192930 kernel: Freeing unused kernel memory: 39424K
Mar 7 01:30:26.192937 kernel: Run /init as init process
Mar 7 01:30:26.192944 kernel: with arguments:
Mar 7 01:30:26.192951 kernel: /init
Mar 7 01:30:26.192959 kernel: with environment:
Mar 7 01:30:26.192966 kernel: HOME=/
Mar 7 01:30:26.192973 kernel: TERM=linux
Mar 7 01:30:26.192982 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 01:30:26.192993 systemd[1]: Detected virtualization microsoft.
Mar 7 01:30:26.193001 systemd[1]: Detected architecture arm64.
Mar 7 01:30:26.193008 systemd[1]: Running in initrd.
Mar 7 01:30:26.193016 systemd[1]: No hostname configured, using default hostname.
Mar 7 01:30:26.193023 systemd[1]: Hostname set to .
Mar 7 01:30:26.193032 systemd[1]: Initializing machine ID from random generator.
Mar 7 01:30:26.193041 systemd[1]: Queued start job for default target initrd.target.
Mar 7 01:30:26.193049 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:30:26.193057 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:30:26.193065 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 7 01:30:26.193073 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 01:30:26.193081 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 7 01:30:26.193089 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 7 01:30:26.193099 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 7 01:30:26.193108 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 7 01:30:26.193116 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:30:26.193124 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 01:30:26.193132 systemd[1]: Reached target paths.target - Path Units.
Mar 7 01:30:26.193140 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 01:30:26.193148 systemd[1]: Reached target swap.target - Swaps.
Mar 7 01:30:26.193156 systemd[1]: Reached target timers.target - Timer Units.
Mar 7 01:30:26.193164 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 01:30:26.193173 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 01:30:26.193181 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 7 01:30:26.193189 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Mar 7 01:30:26.193197 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 01:30:26.193205 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 01:30:26.193213 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 01:30:26.193221 systemd[1]: Reached target sockets.target - Socket Units.
Mar 7 01:30:26.193229 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 7 01:30:26.193239 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 01:30:26.193247 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 7 01:30:26.193255 systemd[1]: Starting systemd-fsck-usr.service...
Mar 7 01:30:26.193262 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 01:30:26.193270 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 01:30:26.193291 systemd-journald[218]: Collecting audit messages is disabled.
Mar 7 01:30:26.193312 systemd-journald[218]: Journal started
Mar 7 01:30:26.193331 systemd-journald[218]: Runtime Journal (/run/log/journal/217746a6dcc04c1c9c8d08ebc8846cda) is 8.0M, max 78.5M, 70.5M free.
Mar 7 01:30:26.212972 systemd-modules-load[219]: Inserted module 'overlay'
Mar 7 01:30:26.222792 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:26.236503 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 7 01:30:26.243977 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 01:30:26.244018 kernel: Bridge firewalling registered
Mar 7 01:30:26.244106 systemd-modules-load[219]: Inserted module 'br_netfilter'
Mar 7 01:30:26.248977 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 7 01:30:26.257462 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 01:30:26.268512 systemd[1]: Finished systemd-fsck-usr.service.
Mar 7 01:30:26.275622 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 01:30:26.283563 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:26.304999 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:30:26.312667 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 01:30:26.327677 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 7 01:30:26.349683 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 01:30:26.358483 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:30:26.371519 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 01:30:26.381816 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 01:30:26.400525 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 01:30:26.414735 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 7 01:30:26.422688 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 7 01:30:26.441751 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 01:30:26.455866 dracut-cmdline[253]: dracut-dracut-053
Mar 7 01:30:26.455866 dracut-cmdline[253]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6
Mar 7 01:30:26.492764 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 01:30:26.504405 systemd-resolved[254]: Positive Trust Anchors:
Mar 7 01:30:26.504415 systemd-resolved[254]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 7 01:30:26.504446 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 7 01:30:26.506604 systemd-resolved[254]: Defaulting to hostname 'linux'.
Mar 7 01:30:26.508330 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 7 01:30:26.517120 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 7 01:30:26.621518 kernel: SCSI subsystem initialized
Mar 7 01:30:26.628508 kernel: Loading iSCSI transport class v2.0-870.
Mar 7 01:30:26.638519 kernel: iscsi: registered transport (tcp)
Mar 7 01:30:26.654908 kernel: iscsi: registered transport (qla4xxx)
Mar 7 01:30:26.654925 kernel: QLogic iSCSI HBA Driver
Mar 7 01:30:26.694815 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 7 01:30:26.706970 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 7 01:30:26.734364 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 7 01:30:26.734422 kernel: device-mapper: uevent: version 1.0.3
Mar 7 01:30:26.739508 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 7 01:30:26.786511 kernel: raid6: neonx8 gen() 15811 MB/s
Mar 7 01:30:26.805504 kernel: raid6: neonx4 gen() 15692 MB/s
Mar 7 01:30:26.824499 kernel: raid6: neonx2 gen() 13315 MB/s
Mar 7 01:30:26.844501 kernel: raid6: neonx1 gen() 10488 MB/s
Mar 7 01:30:26.863499 kernel: raid6: int64x8 gen() 6974 MB/s
Mar 7 01:30:26.883506 kernel: raid6: int64x4 gen() 7366 MB/s
Mar 7 01:30:26.903499 kernel: raid6: int64x2 gen() 6146 MB/s
Mar 7 01:30:26.925078 kernel: raid6: int64x1 gen() 5072 MB/s
Mar 7 01:30:26.925099 kernel: raid6: using algorithm neonx8 gen() 15811 MB/s
Mar 7 01:30:26.947132 kernel: raid6: .... xor() 12049 MB/s, rmw enabled
Mar 7 01:30:26.947158 kernel: raid6: using neon recovery algorithm
Mar 7 01:30:26.957494 kernel: xor: measuring software checksum speed
Mar 7 01:30:26.957510 kernel: 8regs : 19726 MB/sec
Mar 7 01:30:26.960311 kernel: 32regs : 19660 MB/sec
Mar 7 01:30:26.963005 kernel: arm64_neon : 27123 MB/sec
Mar 7 01:30:26.966326 kernel: xor: using function: arm64_neon (27123 MB/sec)
Mar 7 01:30:27.016508 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 7 01:30:27.025363 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 01:30:27.040617 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 01:30:27.060889 systemd-udevd[439]: Using default interface naming scheme 'v255'.
Mar 7 01:30:27.065049 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 01:30:27.081603 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 7 01:30:27.096668 dracut-pre-trigger[449]: rd.md=0: removing MD RAID activation
Mar 7 01:30:27.123253 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 01:30:27.135825 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 01:30:27.175764 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 01:30:27.196197 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 7 01:30:27.226742 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 7 01:30:27.236758 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 01:30:27.256325 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 01:30:27.271435 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 01:30:27.286514 kernel: hv_vmbus: Vmbus version:5.3
Mar 7 01:30:27.287670 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 7 01:30:27.320951 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 7 01:30:27.320973 kernel: hv_vmbus: registering driver hyperv_keyboard
Mar 7 01:30:27.320983 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Mar 7 01:30:27.320993 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Mar 7 01:30:27.323692 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 01:30:27.323906 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:30:27.346293 kernel: PTP clock support registered
Mar 7 01:30:27.333655 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:30:27.449735 kernel: hv_utils: Registering HyperV Utility Driver
Mar 7 01:30:27.449757 kernel: hv_vmbus: registering driver hv_utils
Mar 7 01:30:27.449767 kernel: hv_utils: Heartbeat IC version 3.0
Mar 7 01:30:27.449776 kernel: hv_utils: Shutdown IC version 3.2
Mar 7 01:30:27.449786 kernel: hv_utils: TimeSync IC version 4.0
Mar 7 01:30:27.449795 kernel: hv_vmbus: registering driver hv_netvsc
Mar 7 01:30:27.347552 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:30:27.347759 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:27.489826 kernel: hv_vmbus: registering driver hid_hyperv
Mar 7 01:30:27.489847 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Mar 7 01:30:27.489858 kernel: hv_vmbus: registering driver hv_storvsc
Mar 7 01:30:27.489867 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Mar 7 01:30:27.361465 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:27.500962 kernel: scsi host0: storvsc_host_t
Mar 7 01:30:27.444832 systemd-resolved[254]: Clock change detected. Flushing caches.
Mar 7 01:30:27.513739 kernel: scsi host1: storvsc_host_t
Mar 7 01:30:27.513913 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Mar 7 01:30:27.488187 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:27.529998 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Mar 7 01:30:27.513704 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 01:30:27.530150 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:30:27.530840 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:27.553042 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:27.577612 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Mar 7 01:30:27.577836 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 7 01:30:27.579586 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Mar 7 01:30:27.579960 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:27.591907 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:30:27.616617 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Mar 7 01:30:27.616808 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 7 01:30:27.616898 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 7 01:30:27.616989 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Mar 7 01:30:27.617077 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Mar 7 01:30:27.636142 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#11 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 7 01:30:27.636360 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:30:27.641572 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 7 01:30:27.657556 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:30:27.677560 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#181 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 7 01:30:27.785003 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Mar 7 01:30:27.811112 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by (udev-worker) (498)
Mar 7 01:30:27.811141 kernel: BTRFS: device fsid 237c8587-8110-47ef-99f9-37e4ed4d3b31 devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (499)
Mar 7 01:30:27.816323 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Mar 7 01:30:27.837752 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 7 01:30:27.847695 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Mar 7 01:30:27.852838 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Mar 7 01:30:27.875762 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 7 01:30:27.898828 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:30:27.907565 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:30:28.916567 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:30:28.917352 disk-uuid[595]: The operation has completed successfully.
Mar 7 01:30:28.983041 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 7 01:30:28.983156 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 7 01:30:29.009689 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 7 01:30:29.021180 sh[708]: Success
Mar 7 01:30:29.039723 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 7 01:30:29.132518 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 7 01:30:29.142628 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 7 01:30:29.151678 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 7 01:30:29.181691 kernel: BTRFS info (device dm-0): first mount of filesystem 237c8587-8110-47ef-99f9-37e4ed4d3b31
Mar 7 01:30:29.181748 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 7 01:30:29.187599 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 7 01:30:29.192238 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 7 01:30:29.196643 kernel: BTRFS info (device dm-0): using free space tree
Mar 7 01:30:29.266183 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 7 01:30:29.270871 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 7 01:30:29.287818 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 7 01:30:29.297477 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 7 01:30:29.327980 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:29.328043 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 01:30:29.331774 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:30:29.350056 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:30:29.355947 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 7 01:30:29.367580 kernel: BTRFS info (device sda6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:29.376590 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 7 01:30:29.391758 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 7 01:30:29.424170 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 01:30:29.439685 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 01:30:29.464238 systemd-networkd[892]: lo: Link UP
Mar 7 01:30:29.464249 systemd-networkd[892]: lo: Gained carrier
Mar 7 01:30:29.464995 systemd-networkd[892]: Enumeration completed
Mar 7 01:30:29.465099 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 01:30:29.465527 systemd-networkd[892]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:30:29.465530 systemd-networkd[892]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 01:30:29.470096 systemd-networkd[892]: eth0: Link UP
Mar 7 01:30:29.470499 systemd-networkd[892]: eth0: Gained carrier
Mar 7 01:30:29.470509 systemd-networkd[892]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:30:29.472888 systemd[1]: Reached target network.target - Network.
Mar 7 01:30:29.504586 systemd-networkd[892]: eth0: DHCPv4 address 10.200.20.32/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 7 01:30:29.648615 ignition[839]: Ignition 2.19.0
Mar 7 01:30:29.651337 ignition[839]: Stage: fetch-offline
Mar 7 01:30:29.651384 ignition[839]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:29.654873 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 01:30:29.651392 ignition[839]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:29.651495 ignition[839]: parsed url from cmdline: ""
Mar 7 01:30:29.651498 ignition[839]: no config URL provided
Mar 7 01:30:29.651503 ignition[839]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 01:30:29.651510 ignition[839]: no config at "/usr/lib/ignition/user.ign"
Mar 7 01:30:29.651517 ignition[839]: failed to fetch config: resource requires networking
Mar 7 01:30:29.681789 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 7 01:30:29.651706 ignition[839]: Ignition finished successfully
Mar 7 01:30:29.702168 ignition[901]: Ignition 2.19.0
Mar 7 01:30:29.702182 ignition[901]: Stage: fetch
Mar 7 01:30:29.702390 ignition[901]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:29.702400 ignition[901]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:29.702501 ignition[901]: parsed url from cmdline: ""
Mar 7 01:30:29.702504 ignition[901]: no config URL provided
Mar 7 01:30:29.702508 ignition[901]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 01:30:29.702516 ignition[901]: no config at "/usr/lib/ignition/user.ign"
Mar 7 01:30:29.702559 ignition[901]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Mar 7 01:30:29.804330 ignition[901]: GET result: OK
Mar 7 01:30:29.804429 ignition[901]: config has been read from IMDS userdata
Mar 7 01:30:29.804472 ignition[901]: parsing config with SHA512: 28ad04719edd5619ee215ec54f17d8153d397a9c861fe92d7f200cedccbc41128695d08253edcd319c754e205202767e432b76447997bc052fcc92bdf9555a1e
Mar 7 01:30:29.808561 unknown[901]: fetched base config from "system"
Mar 7 01:30:29.810267 ignition[901]: fetch: fetch complete
Mar 7 01:30:29.808576 unknown[901]: fetched base config from "system"
Mar 7 01:30:29.810273 ignition[901]: fetch: fetch passed
Mar 7 01:30:29.808582 unknown[901]: fetched user config from "azure"
Mar 7 01:30:29.810336 ignition[901]: Ignition finished successfully
Mar 7 01:30:29.812335 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 7 01:30:29.835710 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 7 01:30:29.850674 ignition[907]: Ignition 2.19.0
Mar 7 01:30:29.850683 ignition[907]: Stage: kargs
Mar 7 01:30:29.856590 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 7 01:30:29.850872 ignition[907]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:29.850885 ignition[907]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:29.870716 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 7 01:30:29.851883 ignition[907]: kargs: kargs passed
Mar 7 01:30:29.851953 ignition[907]: Ignition finished successfully
Mar 7 01:30:29.889308 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 7 01:30:29.885717 ignition[913]: Ignition 2.19.0
Mar 7 01:30:29.894175 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 7 01:30:29.885724 ignition[913]: Stage: disks
Mar 7 01:30:29.902494 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 7 01:30:29.885888 ignition[913]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:29.912583 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 01:30:29.885897 ignition[913]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:29.919926 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 01:30:29.886780 ignition[913]: disks: disks passed
Mar 7 01:30:29.929294 systemd[1]: Reached target basic.target - Basic System.
Mar 7 01:30:29.886826 ignition[913]: Ignition finished successfully
Mar 7 01:30:29.949748 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 7 01:30:30.001926 systemd-fsck[921]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Mar 7 01:30:30.009801 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 7 01:30:30.027710 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 7 01:30:30.085558 kernel: EXT4-fs (sda9): mounted filesystem 596a8ea8-9d3d-4d06-a56e-9d3ebd3cb76d r/w with ordered data mode. Quota mode: none.
Mar 7 01:30:30.085917 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 7 01:30:30.090388 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 7 01:30:30.115618 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 01:30:30.126128 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 7 01:30:30.132730 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 7 01:30:30.148566 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 7 01:30:30.148610 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 01:30:30.185137 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: VF slot 1 added
Mar 7 01:30:30.185314 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (932)
Mar 7 01:30:30.155888 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 7 01:30:30.200523 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:30.200616 kernel: hv_vmbus: registering driver hv_pci
Mar 7 01:30:30.200636 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 01:30:30.199291 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 7 01:30:30.224271 kernel: hv_pci cd5bce2c-8d5e-41d5-a78a-0a161c517b26: PCI VMBus probing: Using version 0x10004
Mar 7 01:30:30.224456 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:30:30.224467 kernel: hv_pci cd5bce2c-8d5e-41d5-a78a-0a161c517b26: PCI host bridge to bus 8d5e:00
Mar 7 01:30:30.234727 kernel: pci_bus 8d5e:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Mar 7 01:30:30.239483 kernel: pci_bus 8d5e:00: No busn resource found for root bus, will use [bus 00-ff]
Mar 7 01:30:30.245147 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:30:30.245189 kernel: pci 8d5e:00:02.0: [15b3:1018] type 00 class 0x020000
Mar 7 01:30:30.249803 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 01:30:30.266592 kernel: pci 8d5e:00:02.0: reg 0x10: [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 7 01:30:30.271609 kernel: pci 8d5e:00:02.0: enabling Extended Tags
Mar 7 01:30:30.290671 kernel: pci 8d5e:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 8d5e:00:02.0 (capable of 126.016 Gb/s with 8.0 GT/s PCIe x16 link)
Mar 7 01:30:30.301223 kernel: pci_bus 8d5e:00: busn_res: [bus 00-ff] end is updated to 00
Mar 7 01:30:30.301442 kernel: pci 8d5e:00:02.0: BAR 0: assigned [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 7 01:30:30.339683 coreos-metadata[934]: Mar 07 01:30:30.339 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 7 01:30:30.351733 coreos-metadata[934]: Mar 07 01:30:30.351 INFO Fetch successful
Mar 7 01:30:30.351733 coreos-metadata[934]: Mar 07 01:30:30.351 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Mar 7 01:30:30.371074 kernel: mlx5_core 8d5e:00:02.0: enabling device (0000 -> 0002)
Mar 7 01:30:30.371751 kernel: mlx5_core 8d5e:00:02.0: firmware version: 16.30.5026
Mar 7 01:30:30.371852 coreos-metadata[934]: Mar 07 01:30:30.370 INFO Fetch successful
Mar 7 01:30:30.378712 coreos-metadata[934]: Mar 07 01:30:30.374 INFO wrote hostname ci-4081.3.6-n-3151c5d0e2 to /sysroot/etc/hostname
Mar 7 01:30:30.385090 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 01:30:30.461383 initrd-setup-root[964]: cut: /sysroot/etc/passwd: No such file or directory
Mar 7 01:30:30.481984 initrd-setup-root[971]: cut: /sysroot/etc/group: No such file or directory
Mar 7 01:30:30.495193 initrd-setup-root[978]: cut: /sysroot/etc/shadow: No such file or directory
Mar 7 01:30:30.506247 initrd-setup-root[989]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 7 01:30:30.578349 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: VF registering: eth1
Mar 7 01:30:30.578566 kernel: mlx5_core 8d5e:00:02.0 eth1: joined to eth0
Mar 7 01:30:30.586655 kernel: mlx5_core 8d5e:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Mar 7 01:30:30.597647 kernel: mlx5_core 8d5e:00:02.0 enP36190s1: renamed from eth1
Mar 7 01:30:30.606757 systemd-networkd[892]: eth1: Interface name change detected, renamed to enP36190s1.
Mar 7 01:30:30.652740 systemd-networkd[892]: eth0: Gained IPv6LL
Mar 7 01:30:30.725202 systemd-networkd[892]: enP36190s1: Link UP
Mar 7 01:30:30.728756 kernel: mlx5_core 8d5e:00:02.0 enP36190s1: Link up
Mar 7 01:30:30.769513 systemd-networkd[892]: enP36190s1: Gained carrier
Mar 7 01:30:30.773505 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: Data path switched to VF: enP36190s1
Mar 7 01:30:30.840134 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 7 01:30:30.860679 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 7 01:30:30.870746 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 7 01:30:30.885970 kernel: BTRFS info (device sda6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:30.881504 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 7 01:30:30.908970 ignition[1061]: INFO : Ignition 2.19.0
Mar 7 01:30:30.908970 ignition[1061]: INFO : Stage: mount
Mar 7 01:30:30.908970 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:30.908970 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:30.939778 ignition[1061]: INFO : mount: mount passed
Mar 7 01:30:30.939778 ignition[1061]: INFO : Ignition finished successfully
Mar 7 01:30:30.914809 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 7 01:30:30.924751 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 7 01:30:30.938384 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 7 01:30:30.965886 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 01:30:30.984696 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 scanned by mount (1073)
Mar 7 01:30:30.984744 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 01:30:30.995032 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 01:30:30.998473 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:30:31.006500 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:30:31.006713 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 01:30:31.029423 ignition[1089]: INFO : Ignition 2.19.0
Mar 7 01:30:31.029423 ignition[1089]: INFO : Stage: files
Mar 7 01:30:31.035898 ignition[1089]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:30:31.035898 ignition[1089]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 7 01:30:31.035898 ignition[1089]: DEBUG : files: compiled without relabeling support, skipping
Mar 7 01:30:31.035898 ignition[1089]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 7 01:30:31.035898 ignition[1089]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 01:30:31.065137 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Mar 7 01:30:31.058890 unknown[1089]: wrote ssh authorized keys file for user: core
Mar 7 01:30:38.297063 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 7 01:30:38.459471 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 01:30:38.468152 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-arm64.raw: attempt #1
Mar 7 01:30:39.098052 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 7 01:30:39.514330 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 01:30:39.514330 ignition[1089]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 7 01:30:39.528692 ignition[1089]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 7 01:30:39.567316 ignition[1089]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 01:30:39.567316 ignition[1089]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 01:30:39.567316 ignition[1089]: INFO : files: files passed
Mar 7 01:30:39.567316 ignition[1089]: INFO : Ignition finished successfully
Mar 7 01:30:39.540823 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 7 01:30:39.567849 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 7 01:30:39.581752 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 7 01:30:39.599877 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 7 01:30:39.628657 initrd-setup-root-after-ignition[1118]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:30:39.628657 initrd-setup-root-after-ignition[1118]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:30:39.602271 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 7 01:30:39.656057 initrd-setup-root-after-ignition[1122]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:30:39.625871 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 01:30:39.633952 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 7 01:30:39.668782 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 7 01:30:39.699866 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 7 01:30:39.699985 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 7 01:30:39.709807 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 7 01:30:39.719454 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 7 01:30:39.728314 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 7 01:30:39.742831 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 7 01:30:39.760949 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 01:30:39.773792 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 7 01:30:39.792499 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 7 01:30:39.797391 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 01:30:39.806809 systemd[1]: Stopped target timers.target - Timer Units.
Mar 7 01:30:39.815271 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 7 01:30:39.815334 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 01:30:39.827551 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 7 01:30:39.836682 systemd[1]: Stopped target basic.target - Basic System.
Mar 7 01:30:39.844596 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 7 01:30:39.852659 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 01:30:39.861873 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 7 01:30:39.871178 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 7 01:30:39.879868 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 01:30:39.888966 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 7 01:30:39.899470 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 7 01:30:39.908164 systemd[1]: Stopped target swap.target - Swaps.
Mar 7 01:30:39.915497 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 7 01:30:39.915568 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 01:30:39.927933 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 7 01:30:39.936613 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:30:39.946314 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 7 01:30:39.946349 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:30:39.956703 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 7 01:30:39.956765 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 7 01:30:39.970868 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 7 01:30:39.970918 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 7 01:30:39.979752 systemd[1]: ignition-files.service: Deactivated successfully. Mar 7 01:30:39.979790 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 7 01:30:39.988290 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 7 01:30:39.988323 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 7 01:30:40.011727 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 7 01:30:40.046647 ignition[1143]: INFO : Ignition 2.19.0 Mar 7 01:30:40.046647 ignition[1143]: INFO : Stage: umount Mar 7 01:30:40.046647 ignition[1143]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 7 01:30:40.046647 ignition[1143]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 7 01:30:40.046647 ignition[1143]: INFO : umount: umount passed Mar 7 01:30:40.046647 ignition[1143]: INFO : Ignition finished successfully Mar 7 01:30:40.022121 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 7 01:30:40.022202 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 7 01:30:40.034678 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 7 01:30:40.041138 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 7 01:30:40.041197 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 7 01:30:40.050541 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 7 01:30:40.050604 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 7 01:30:40.063463 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 7 01:30:40.063575 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Mar 7 01:30:40.075272 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 7 01:30:40.075363 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 7 01:30:40.085509 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 7 01:30:40.085864 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 7 01:30:40.085901 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 7 01:30:40.092342 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 7 01:30:40.092385 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 7 01:30:40.105698 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 7 01:30:40.105745 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 7 01:30:40.115831 systemd[1]: Stopped target network.target - Network.
Mar 7 01:30:40.123572 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 7 01:30:40.123615 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 01:30:40.133012 systemd[1]: Stopped target paths.target - Path Units.
Mar 7 01:30:40.141524 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 7 01:30:40.145563 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:30:40.151227 systemd[1]: Stopped target slices.target - Slice Units.
Mar 7 01:30:40.159840 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 7 01:30:40.168189 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 7 01:30:40.168236 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 01:30:40.177706 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 7 01:30:40.177738 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 01:30:40.185809 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 7 01:30:40.185849 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 7 01:30:40.194245 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 7 01:30:40.194275 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 7 01:30:40.202399 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 7 01:30:40.210481 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 7 01:30:40.218846 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 7 01:30:40.218935 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 7 01:30:40.222572 systemd-networkd[892]: eth0: DHCPv6 lease lost
Mar 7 01:30:40.394389 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: Data path switched from VF: enP36190s1
Mar 7 01:30:40.228486 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 7 01:30:40.228601 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 7 01:30:40.240359 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 7 01:30:40.240575 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 01:30:40.247962 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 7 01:30:40.248019 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 7 01:30:40.278676 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 7 01:30:40.288804 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 7 01:30:40.288902 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 01:30:40.297882 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 01:30:40.307820 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 7 01:30:40.307914 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 7 01:30:40.337871 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 7 01:30:40.338339 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 01:30:40.348207 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 7 01:30:40.348277 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 7 01:30:40.356287 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 7 01:30:40.356332 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 01:30:40.365139 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 7 01:30:40.365185 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 01:30:40.377665 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 7 01:30:40.377714 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 7 01:30:40.394459 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 01:30:40.394520 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:30:40.415678 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 7 01:30:40.422602 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 7 01:30:40.577412 systemd-journald[218]: Received SIGTERM from PID 1 (systemd).
Mar 7 01:30:40.422670 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 7 01:30:40.433553 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 7 01:30:40.433606 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 7 01:30:40.442793 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 7 01:30:40.442837 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 01:30:40.452483 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 7 01:30:40.452525 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 01:30:40.462177 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:30:40.462221 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:40.471533 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 7 01:30:40.473684 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 7 01:30:40.481040 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 7 01:30:40.483198 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 7 01:30:40.491986 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 7 01:30:40.509818 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 7 01:30:40.523041 systemd[1]: Switching root.
Mar 7 01:30:40.648942 systemd-journald[218]: Journal stopped
Mar 7 01:30:43.214938 kernel: SELinux: policy capability network_peer_controls=1
Mar 7 01:30:43.214962 kernel: SELinux: policy capability open_perms=1
Mar 7 01:30:43.214972 kernel: SELinux: policy capability extended_socket_class=1
Mar 7 01:30:43.214979 kernel: SELinux: policy capability always_check_network=0
Mar 7 01:30:43.214990 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 7 01:30:43.214997 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 7 01:30:43.215007 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 7 01:30:43.215014 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 7 01:30:43.215022 kernel: audit: type=1403 audit(1772847041.541:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 7 01:30:43.215032 systemd[1]: Successfully loaded SELinux policy in 68.709ms.
Mar 7 01:30:43.215044 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 9.824ms.
Mar 7 01:30:43.215054 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 01:30:43.215062 systemd[1]: Detected virtualization microsoft.
Mar 7 01:30:43.215071 systemd[1]: Detected architecture arm64.
Mar 7 01:30:43.215082 systemd[1]: Detected first boot.
Mar 7 01:30:43.215093 systemd[1]: Hostname set to .
Mar 7 01:30:43.215101 systemd[1]: Initializing machine ID from random generator.
Mar 7 01:30:43.215110 zram_generator::config[1183]: No configuration found.
Mar 7 01:30:43.215120 systemd[1]: Populated /etc with preset unit settings.
Mar 7 01:30:43.215128 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 7 01:30:43.215137 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 7 01:30:43.215146 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 7 01:30:43.215158 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 7 01:30:43.215167 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 7 01:30:43.215176 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 7 01:30:43.215185 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 7 01:30:43.215194 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 7 01:30:43.215204 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 7 01:30:43.215213 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 7 01:30:43.215223 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 7 01:30:43.215233 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:30:43.215242 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:30:43.215251 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 7 01:30:43.215260 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 7 01:30:43.215269 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 7 01:30:43.215279 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 01:30:43.215289 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 7 01:30:43.215299 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:30:43.215308 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 7 01:30:43.215317 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 7 01:30:43.215329 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 7 01:30:43.215339 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 7 01:30:43.215348 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 01:30:43.215357 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 01:30:43.215367 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 01:30:43.215378 systemd[1]: Reached target swap.target - Swaps.
Mar 7 01:30:43.215387 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 7 01:30:43.215397 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 7 01:30:43.215406 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 01:30:43.215415 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 01:30:43.215425 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 01:30:43.215436 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 7 01:30:43.215446 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 7 01:30:43.215455 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 7 01:30:43.215465 systemd[1]: Mounting media.mount - External Media Directory...
Mar 7 01:30:43.215474 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 7 01:30:43.215484 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 7 01:30:43.215494 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 7 01:30:43.215505 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 7 01:30:43.215515 systemd[1]: Reached target machines.target - Containers.
Mar 7 01:30:43.215524 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 7 01:30:43.215534 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 01:30:43.215554 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 01:30:43.215565 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 7 01:30:43.215574 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 01:30:43.215584 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 7 01:30:43.215595 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 01:30:43.215605 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 7 01:30:43.215614 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 01:30:43.215623 kernel: fuse: init (API version 7.39)
Mar 7 01:30:43.215632 kernel: ACPI: bus type drm_connector registered
Mar 7 01:30:43.215640 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 7 01:30:43.215650 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 7 01:30:43.215659 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 7 01:30:43.215669 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 7 01:30:43.215680 kernel: loop: module loaded
Mar 7 01:30:43.215688 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 7 01:30:43.215699 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 01:30:43.215708 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 01:30:43.215717 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 7 01:30:43.215741 systemd-journald[1286]: Collecting audit messages is disabled.
Mar 7 01:30:43.215763 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 7 01:30:43.215773 systemd-journald[1286]: Journal started
Mar 7 01:30:43.215793 systemd-journald[1286]: Runtime Journal (/run/log/journal/6b772b759d664838a92c0ce24a81219b) is 8.0M, max 78.5M, 70.5M free.
Mar 7 01:30:42.495349 systemd[1]: Queued start job for default target multi-user.target.
Mar 7 01:30:42.540213 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 7 01:30:42.540573 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 7 01:30:42.540901 systemd[1]: systemd-journald.service: Consumed 2.482s CPU time.
Mar 7 01:30:43.242661 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 01:30:43.250636 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 7 01:30:43.250722 systemd[1]: Stopped verity-setup.service.
Mar 7 01:30:43.264562 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 01:30:43.264813 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 7 01:30:43.269373 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 7 01:30:43.274163 systemd[1]: Mounted media.mount - External Media Directory.
Mar 7 01:30:43.278619 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 7 01:30:43.283986 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 7 01:30:43.289089 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 7 01:30:43.293311 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 7 01:30:43.298690 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 01:30:43.304446 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 7 01:30:43.307559 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 7 01:30:43.313277 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 7 01:30:43.313431 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 7 01:30:43.318430 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 7 01:30:43.318566 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 7 01:30:43.323224 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 7 01:30:43.323354 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 7 01:30:43.329024 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 7 01:30:43.329143 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 7 01:30:43.333854 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 7 01:30:43.333970 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 7 01:30:43.338678 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 01:30:43.343981 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 7 01:30:43.349596 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 7 01:30:43.355225 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 01:30:43.368594 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 7 01:30:43.381642 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 7 01:30:43.387317 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 7 01:30:43.392266 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 7 01:30:43.392299 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 01:30:43.397694 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 7 01:30:43.412691 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 7 01:30:43.418883 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 7 01:30:43.423372 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 01:30:43.435725 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 7 01:30:43.442781 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 7 01:30:43.449673 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 7 01:30:43.451853 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 7 01:30:43.460825 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 7 01:30:43.462826 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 01:30:43.471809 systemd-journald[1286]: Time spent on flushing to /var/log/journal/6b772b759d664838a92c0ce24a81219b is 50.161ms for 890 entries.
Mar 7 01:30:43.471809 systemd-journald[1286]: System Journal (/var/log/journal/6b772b759d664838a92c0ce24a81219b) is 8.0M, max 2.6G, 2.6G free.
Mar 7 01:30:43.553148 systemd-journald[1286]: Received client request to flush runtime journal.
Mar 7 01:30:43.553187 kernel: loop0: detected capacity change from 0 to 31320
Mar 7 01:30:43.484833 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 7 01:30:43.496025 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 7 01:30:43.507935 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 7 01:30:43.521425 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 7 01:30:43.526818 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 7 01:30:43.536578 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 7 01:30:43.542073 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 7 01:30:43.547660 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 01:30:43.554282 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 7 01:30:43.566142 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 7 01:30:43.576918 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 7 01:30:43.584801 udevadm[1322]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 7 01:30:43.596131 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 7 01:30:43.606771 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 01:30:43.631862 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 7 01:30:43.633459 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 7 01:30:43.661561 systemd-tmpfiles[1332]: ACLs are not supported, ignoring.
Mar 7 01:30:43.661901 systemd-tmpfiles[1332]: ACLs are not supported, ignoring.
Mar 7 01:30:43.667221 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 01:30:43.686572 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 7 01:30:43.724563 kernel: loop1: detected capacity change from 0 to 114432
Mar 7 01:30:43.828568 kernel: loop2: detected capacity change from 0 to 114328
Mar 7 01:30:43.931713 kernel: loop3: detected capacity change from 0 to 197488
Mar 7 01:30:43.983567 kernel: loop4: detected capacity change from 0 to 31320
Mar 7 01:30:43.996578 kernel: loop5: detected capacity change from 0 to 114432
Mar 7 01:30:44.031594 kernel: loop6: detected capacity change from 0 to 114328
Mar 7 01:30:44.043574 kernel: loop7: detected capacity change from 0 to 197488
Mar 7 01:30:44.056769 (sd-merge)[1342]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Mar 7 01:30:44.057205 (sd-merge)[1342]: Merged extensions into '/usr'.
Mar 7 01:30:44.062170 systemd[1]: Reloading requested from client PID 1317 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 7 01:30:44.062283 systemd[1]: Reloading...
Mar 7 01:30:44.129575 zram_generator::config[1365]: No configuration found.
Mar 7 01:30:44.253616 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 01:30:44.309417 systemd[1]: Reloading finished in 246 ms.
Mar 7 01:30:44.340979 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 7 01:30:44.346445 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 7 01:30:44.356717 systemd[1]: Starting ensure-sysext.service...
Mar 7 01:30:44.362753 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 01:30:44.373729 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 01:30:44.387657 systemd[1]: Reloading requested from client PID 1424 ('systemctl') (unit ensure-sysext.service)...
Mar 7 01:30:44.387678 systemd[1]: Reloading...
Mar 7 01:30:44.401879 systemd-tmpfiles[1425]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 7 01:30:44.402156 systemd-tmpfiles[1425]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 7 01:30:44.402850 systemd-tmpfiles[1425]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 7 01:30:44.403093 systemd-tmpfiles[1425]: ACLs are not supported, ignoring.
Mar 7 01:30:44.403136 systemd-tmpfiles[1425]: ACLs are not supported, ignoring.
Mar 7 01:30:44.406797 systemd-udevd[1426]: Using default interface naming scheme 'v255'.
Mar 7 01:30:44.410062 systemd-tmpfiles[1425]: Detected autofs mount point /boot during canonicalization of boot.
Mar 7 01:30:44.410073 systemd-tmpfiles[1425]: Skipping /boot
Mar 7 01:30:44.419729 systemd-tmpfiles[1425]: Detected autofs mount point /boot during canonicalization of boot.
Mar 7 01:30:44.419743 systemd-tmpfiles[1425]: Skipping /boot
Mar 7 01:30:44.477599 zram_generator::config[1455]: No configuration found.
Mar 7 01:30:44.646873 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#237 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 7 01:30:44.670494 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 01:30:44.676565 kernel: mousedev: PS/2 mouse device common for all mice
Mar 7 01:30:44.713689 kernel: hv_vmbus: registering driver hyperv_fb
Mar 7 01:30:44.713784 kernel: hv_vmbus: registering driver hv_balloon
Mar 7 01:30:44.721816 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 7 01:30:44.721895 kernel: hv_balloon: Memory hot add disabled on ARM64
Mar 7 01:30:44.748574 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 7 01:30:44.756428 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 7 01:30:44.772770 kernel: Console: switching to colour dummy device 80x25
Mar 7 01:30:44.800434 kernel: Console: switching to colour frame buffer device 128x48
Mar 7 01:30:44.802746 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 7 01:30:44.802919 systemd[1]: Reloading finished in 414 ms.
Mar 7 01:30:44.832335 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1483)
Mar 7 01:30:44.827088 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 01:30:44.845608 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 01:30:44.898663 systemd[1]: Finished ensure-sysext.service.
Mar 7 01:30:44.914090 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 7 01:30:44.928789 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 7 01:30:44.941953 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 7 01:30:44.951794 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 7 01:30:44.959505 ldconfig[1312]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 7 01:30:44.960161 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 01:30:44.966750 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 7 01:30:44.975009 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 01:30:44.982459 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 7 01:30:44.995965 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 01:30:45.005755 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 01:30:45.006150 lvm[1597]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 7 01:30:45.012996 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 01:30:45.014108 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 7 01:30:45.021386 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 7 01:30:45.029577 augenrules[1610]: No rules
Mar 7 01:30:45.043471 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 01:30:45.051738 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 7 01:30:45.059890 systemd[1]: Reached target time-set.target - System Time Set.
Mar 7 01:30:45.076330 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 7 01:30:45.087734 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:30:45.094280 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 7 01:30:45.101111 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 7 01:30:45.106517 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 7 01:30:45.112692 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 7 01:30:45.113033 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 7 01:30:45.118289 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 7 01:30:45.118538 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 7 01:30:45.123431 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 7 01:30:45.123884 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 7 01:30:45.129457 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 7 01:30:45.129726 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 7 01:30:45.134720 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 7 01:30:45.140393 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 7 01:30:45.157578 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 7 01:30:45.166382 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 01:30:45.175697 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 7 01:30:45.180689 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 7 01:30:45.180825 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 7 01:30:45.182807 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 7 01:30:45.189533 lvm[1634]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 7 01:30:45.201857 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 7 01:30:45.208668 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 7 01:30:45.227121 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 7 01:30:45.235422 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 7 01:30:45.281946 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:30:45.299621 systemd-networkd[1617]: lo: Link UP
Mar 7 01:30:45.299913 systemd-networkd[1617]: lo: Gained carrier
Mar 7 01:30:45.301187 systemd-resolved[1618]: Positive Trust Anchors:
Mar 7 01:30:45.301205 systemd-resolved[1618]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 7 01:30:45.301237 systemd-resolved[1618]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 7 01:30:45.302236 systemd-networkd[1617]: Enumeration completed
Mar 7 01:30:45.302428 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 01:30:45.302763 systemd-networkd[1617]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:30:45.302833 systemd-networkd[1617]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 01:30:45.313118 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 7 01:30:45.313827 systemd-resolved[1618]: Using system hostname 'ci-4081.3.6-n-3151c5d0e2'.
Mar 7 01:30:45.321678 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 7 01:30:45.327884 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 7 01:30:45.348574 kernel: mlx5_core 8d5e:00:02.0 enP36190s1: Link up
Mar 7 01:30:45.375699 kernel: hv_netvsc 7ced8dc6-d8cf-7ced-8dc6-d8cf7ced8dc6 eth0: Data path switched to VF: enP36190s1
Mar 7 01:30:45.375815 systemd-networkd[1617]: enP36190s1: Link UP
Mar 7 01:30:45.375896 systemd-networkd[1617]: eth0: Link UP
Mar 7 01:30:45.375900 systemd-networkd[1617]: eth0: Gained carrier
Mar 7 01:30:45.375913 systemd-networkd[1617]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:30:45.377506 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 7 01:30:45.382254 systemd[1]: Reached target network.target - Network.
Mar 7 01:30:45.386070 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 7 01:30:45.391069 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 01:30:45.391937 systemd-networkd[1617]: enP36190s1: Gained carrier
Mar 7 01:30:45.395954 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 7 01:30:45.401358 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 7 01:30:45.406996 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 7 01:30:45.411489 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 7 01:30:45.416776 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 7 01:30:45.422337 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 7 01:30:45.422370 systemd[1]: Reached target paths.target - Path Units. Mar 7 01:30:45.426271 systemd[1]: Reached target timers.target - Timer Units. Mar 7 01:30:45.426602 systemd-networkd[1617]: eth0: DHCPv4 address 10.200.20.32/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 7 01:30:45.431347 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 7 01:30:45.437963 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 7 01:30:45.447484 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 7 01:30:45.453014 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 7 01:30:45.457764 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 01:30:45.461969 systemd[1]: Reached target basic.target - Basic System. Mar 7 01:30:45.466137 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 7 01:30:45.466165 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 7 01:30:45.476633 systemd[1]: Starting chronyd.service - NTP client/server... Mar 7 01:30:45.482678 systemd[1]: Starting containerd.service - containerd container runtime... Mar 7 01:30:45.492047 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... 
Mar 7 01:30:45.499767 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 7 01:30:45.507522 (chronyd)[1653]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS Mar 7 01:30:45.508739 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 7 01:30:45.516659 jq[1659]: false Mar 7 01:30:45.520117 chronyd[1662]: chronyd version 4.5 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG) Mar 7 01:30:45.521896 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 7 01:30:45.527585 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 7 01:30:45.527756 systemd[1]: hv_fcopy_daemon.service - Hyper-V FCOPY daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_fcopy). Mar 7 01:30:45.533672 chronyd[1662]: Timezone right/UTC failed leap second check, ignoring Mar 7 01:30:45.533865 chronyd[1662]: Loaded seccomp filter (level 2) Mar 7 01:30:45.534399 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Mar 7 01:30:45.542460 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Mar 7 01:30:45.545751 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... 
Mar 7 01:30:45.550037 extend-filesystems[1660]: Found loop4 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found loop5 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found loop6 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found loop7 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found sda Mar 7 01:30:45.550037 extend-filesystems[1660]: Found sda1 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found sda2 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found sda3 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found usr Mar 7 01:30:45.550037 extend-filesystems[1660]: Found sda4 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found sda6 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found sda7 Mar 7 01:30:45.550037 extend-filesystems[1660]: Found sda9 Mar 7 01:30:45.550037 extend-filesystems[1660]: Checking size of /dev/sda9 Mar 7 01:30:45.729855 kernel: hv_utils: KVP IC version 4.0 Mar 7 01:30:45.729893 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1463) Mar 7 01:30:45.729937 coreos-metadata[1655]: Mar 07 01:30:45.659 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 7 01:30:45.729937 coreos-metadata[1655]: Mar 07 01:30:45.669 INFO Fetch successful Mar 7 01:30:45.729937 coreos-metadata[1655]: Mar 07 01:30:45.669 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Mar 7 01:30:45.729937 coreos-metadata[1655]: Mar 07 01:30:45.678 INFO Fetch successful Mar 7 01:30:45.729937 coreos-metadata[1655]: Mar 07 01:30:45.678 INFO Fetching http://168.63.129.16/machine/e38e8ebf-099a-4505-91bf-15a594c13407/1a001ab3%2D4828%2D494b%2D8a16%2Df97871197968.%5Fci%2D4081.3.6%2Dn%2D3151c5d0e2?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Mar 7 01:30:45.729937 coreos-metadata[1655]: Mar 07 01:30:45.680 INFO Fetch successful Mar 7 01:30:45.729937 coreos-metadata[1655]: Mar 07 01:30:45.680 INFO Fetching 
http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Mar 7 01:30:45.729937 coreos-metadata[1655]: Mar 07 01:30:45.695 INFO Fetch successful Mar 7 01:30:45.552336 KVP[1663]: KVP starting; pid is:1663 Mar 7 01:30:45.730350 extend-filesystems[1660]: Old size kept for /dev/sda9 Mar 7 01:30:45.730350 extend-filesystems[1660]: Found sr0 Mar 7 01:30:45.556095 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 7 01:30:45.560154 dbus-daemon[1656]: [system] SELinux support is enabled Mar 7 01:30:45.568245 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 7 01:30:45.583553 KVP[1663]: KVP LIC Version: 3.1 Mar 7 01:30:45.594858 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 7 01:30:45.644823 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 7 01:30:45.654020 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 7 01:30:45.671785 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 7 01:30:45.777911 update_engine[1698]: I20260307 01:30:45.747509 1698 main.cc:92] Flatcar Update Engine starting Mar 7 01:30:45.777911 update_engine[1698]: I20260307 01:30:45.748944 1698 update_check_scheduler.cc:74] Next update check in 6m41s Mar 7 01:30:45.673446 systemd[1]: Starting update-engine.service - Update Engine... Mar 7 01:30:45.778274 jq[1701]: true Mar 7 01:30:45.686818 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 7 01:30:45.698095 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 7 01:30:45.710504 systemd[1]: Started chronyd.service - NTP client/server. Mar 7 01:30:45.727242 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Mar 7 01:30:45.728617 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 7 01:30:45.728917 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 7 01:30:45.729067 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 7 01:30:45.748893 systemd[1]: motdgen.service: Deactivated successfully. Mar 7 01:30:45.749067 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 7 01:30:45.776940 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 7 01:30:45.777123 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 7 01:30:45.796587 systemd-logind[1680]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 7 01:30:45.797081 systemd-logind[1680]: New seat seat0. Mar 7 01:30:45.799782 systemd[1]: Started systemd-logind.service - User Login Management. Mar 7 01:30:45.815215 jq[1717]: true Mar 7 01:30:45.835301 dbus-daemon[1656]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 7 01:30:45.833282 (ntainerd)[1718]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 7 01:30:45.847679 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 7 01:30:45.857178 systemd[1]: Started update-engine.service - Update Engine. Mar 7 01:30:45.865706 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 7 01:30:45.865910 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 7 01:30:45.866032 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Mar 7 01:30:45.873532 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 7 01:30:45.873653 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 7 01:30:45.886210 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 7 01:30:45.901816 bash[1747]: Updated "/home/core/.ssh/authorized_keys" Mar 7 01:30:45.902814 tar[1716]: linux-arm64/LICENSE Mar 7 01:30:45.902814 tar[1716]: linux-arm64/helm Mar 7 01:30:45.905018 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 7 01:30:45.916424 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Mar 7 01:30:46.008221 locksmithd[1748]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 7 01:30:46.086760 sshd_keygen[1690]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 7 01:30:46.087788 containerd[1718]: time="2026-03-07T01:30:46.087718740Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Mar 7 01:30:46.117742 containerd[1718]: time="2026-03-07T01:30:46.117696420Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.121568020Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.121604580Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." 
type=io.containerd.event.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.121620500Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.121766220Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.121781700Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.121839140Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.121850700Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.122015500Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.122030260Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.122042580Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122287 containerd[1718]: time="2026-03-07T01:30:46.122051660Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." 
type=io.containerd.snapshotter.v1 Mar 7 01:30:46.122563 containerd[1718]: time="2026-03-07T01:30:46.122116620Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 7 01:30:46.125111 containerd[1718]: time="2026-03-07T01:30:46.124677860Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 7 01:30:46.125111 containerd[1718]: time="2026-03-07T01:30:46.124820700Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 01:30:46.125111 containerd[1718]: time="2026-03-07T01:30:46.124834820Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 7 01:30:46.125111 containerd[1718]: time="2026-03-07T01:30:46.124926620Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 7 01:30:46.125111 containerd[1718]: time="2026-03-07T01:30:46.124968220Z" level=info msg="metadata content store policy set" policy=shared Mar 7 01:30:46.127843 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 7 01:30:46.142899 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 7 01:30:46.148051 containerd[1718]: time="2026-03-07T01:30:46.147819380Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 7 01:30:46.148051 containerd[1718]: time="2026-03-07T01:30:46.147884660Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 7 01:30:46.148051 containerd[1718]: time="2026-03-07T01:30:46.147906660Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." 
type=io.containerd.lease.v1 Mar 7 01:30:46.148051 containerd[1718]: time="2026-03-07T01:30:46.147922620Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 7 01:30:46.148051 containerd[1718]: time="2026-03-07T01:30:46.147936460Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 7 01:30:46.148819 containerd[1718]: time="2026-03-07T01:30:46.148539060Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 7 01:30:46.149200 containerd[1718]: time="2026-03-07T01:30:46.149066020Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 7 01:30:46.149391 containerd[1718]: time="2026-03-07T01:30:46.149336180Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 7 01:30:46.149391 containerd[1718]: time="2026-03-07T01:30:46.149360300Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 7 01:30:46.149391 containerd[1718]: time="2026-03-07T01:30:46.149372900Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 7 01:30:46.149553 containerd[1718]: time="2026-03-07T01:30:46.149486100Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 7 01:30:46.149553 containerd[1718]: time="2026-03-07T01:30:46.149504900Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 7 01:30:46.149553 containerd[1718]: time="2026-03-07T01:30:46.149519140Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." 
type=io.containerd.service.v1 Mar 7 01:30:46.149553 containerd[1718]: time="2026-03-07T01:30:46.149532940Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 7 01:30:46.149864 containerd[1718]: time="2026-03-07T01:30:46.149723700Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 7 01:30:46.149864 containerd[1718]: time="2026-03-07T01:30:46.149752100Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 7 01:30:46.149864 containerd[1718]: time="2026-03-07T01:30:46.149767940Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 7 01:30:46.149864 containerd[1718]: time="2026-03-07T01:30:46.149780460Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 7 01:30:46.149864 containerd[1718]: time="2026-03-07T01:30:46.149808940Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.149864 containerd[1718]: time="2026-03-07T01:30:46.149828820Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.149864 containerd[1718]: time="2026-03-07T01:30:46.149840900Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151360 containerd[1718]: time="2026-03-07T01:30:46.151003540Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151360 containerd[1718]: time="2026-03-07T01:30:46.151030020Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151360 containerd[1718]: time="2026-03-07T01:30:46.151052580Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." 
type=io.containerd.grpc.v1 Mar 7 01:30:46.151360 containerd[1718]: time="2026-03-07T01:30:46.151229460Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151360 containerd[1718]: time="2026-03-07T01:30:46.151287980Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151360 containerd[1718]: time="2026-03-07T01:30:46.151308380Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151360 containerd[1718]: time="2026-03-07T01:30:46.151325220Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151360 containerd[1718]: time="2026-03-07T01:30:46.151338140Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151842 containerd[1718]: time="2026-03-07T01:30:46.151595780Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151842 containerd[1718]: time="2026-03-07T01:30:46.151615420Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151842 containerd[1718]: time="2026-03-07T01:30:46.151631980Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 7 01:30:46.151842 containerd[1718]: time="2026-03-07T01:30:46.151758420Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151842 containerd[1718]: time="2026-03-07T01:30:46.151776700Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.151842 containerd[1718]: time="2026-03-07T01:30:46.151788060Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." 
type=io.containerd.internal.v1 Mar 7 01:30:46.152210 containerd[1718]: time="2026-03-07T01:30:46.152090620Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 7 01:30:46.152210 containerd[1718]: time="2026-03-07T01:30:46.152150580Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 7 01:30:46.152210 containerd[1718]: time="2026-03-07T01:30:46.152163900Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 7 01:30:46.152210 containerd[1718]: time="2026-03-07T01:30:46.152175860Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 7 01:30:46.152210 containerd[1718]: time="2026-03-07T01:30:46.152185060Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 7 01:30:46.152210 containerd[1718]: time="2026-03-07T01:30:46.152196420Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 7 01:30:46.152578 containerd[1718]: time="2026-03-07T01:30:46.152362020Z" level=info msg="NRI interface is disabled by configuration." Mar 7 01:30:46.152578 containerd[1718]: time="2026-03-07T01:30:46.152380420Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 7 01:30:46.152971 containerd[1718]: time="2026-03-07T01:30:46.152860860Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 7 01:30:46.152971 containerd[1718]: time="2026-03-07T01:30:46.152932580Z" level=info msg="Connect containerd service" Mar 7 01:30:46.153278 containerd[1718]: time="2026-03-07T01:30:46.153138420Z" level=info msg="using legacy CRI server" Mar 7 01:30:46.153278 containerd[1718]: time="2026-03-07T01:30:46.153154420Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 7 01:30:46.153428 containerd[1718]: time="2026-03-07T01:30:46.153378100Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 7 01:30:46.154327 containerd[1718]: time="2026-03-07T01:30:46.154304580Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 7 01:30:46.154518 containerd[1718]: time="2026-03-07T01:30:46.154478540Z" level=info msg="Start subscribing containerd event" Mar 7 01:30:46.154674 containerd[1718]: time="2026-03-07T01:30:46.154534580Z" level=info msg="Start recovering state" Mar 7 01:30:46.154674 containerd[1718]: time="2026-03-07T01:30:46.154617620Z" level=info msg="Start event monitor" Mar 7 01:30:46.154674 containerd[1718]: time="2026-03-07T01:30:46.154636140Z" level=info msg="Start snapshots 
syncer" Mar 7 01:30:46.154674 containerd[1718]: time="2026-03-07T01:30:46.154644980Z" level=info msg="Start cni network conf syncer for default" Mar 7 01:30:46.154674 containerd[1718]: time="2026-03-07T01:30:46.154652260Z" level=info msg="Start streaming server" Mar 7 01:30:46.154886 containerd[1718]: time="2026-03-07T01:30:46.154813060Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 7 01:30:46.154958 containerd[1718]: time="2026-03-07T01:30:46.154945340Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 7 01:30:46.155135 systemd[1]: Started containerd.service - containerd container runtime. Mar 7 01:30:46.160467 containerd[1718]: time="2026-03-07T01:30:46.160242220Z" level=info msg="containerd successfully booted in 0.073797s" Mar 7 01:30:46.162086 systemd[1]: issuegen.service: Deactivated successfully. Mar 7 01:30:46.163649 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 7 01:30:46.177436 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 7 01:30:46.191566 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 7 01:30:46.203979 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 7 01:30:46.210860 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 7 01:30:46.220087 systemd[1]: Reached target getty.target - Login Prompts. Mar 7 01:30:46.389205 tar[1716]: linux-arm64/README.md Mar 7 01:30:46.400221 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 7 01:30:46.844679 systemd-networkd[1617]: eth0: Gained IPv6LL Mar 7 01:30:46.846682 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 7 01:30:46.853301 systemd[1]: Reached target network-online.target - Network is Online. Mar 7 01:30:46.868933 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 7 01:30:46.875813 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 7 01:30:46.885414 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Mar 7 01:30:46.911008 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Mar 7 01:30:46.917572 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 7 01:30:47.496722 waagent[1799]: 2026-03-07T01:30:47.495892Z INFO Daemon Daemon Azure Linux Agent Version: 2.9.1.1 Mar 7 01:30:47.500522 waagent[1799]: 2026-03-07T01:30:47.500458Z INFO Daemon Daemon OS: flatcar 4081.3.6 Mar 7 01:30:47.504259 waagent[1799]: 2026-03-07T01:30:47.504210Z INFO Daemon Daemon Python: 3.11.9 Mar 7 01:30:47.510388 waagent[1799]: 2026-03-07T01:30:47.509727Z INFO Daemon Daemon Run daemon Mar 7 01:30:47.513346 waagent[1799]: 2026-03-07T01:30:47.513300Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4081.3.6' Mar 7 01:30:47.521456 waagent[1799]: 2026-03-07T01:30:47.521044Z INFO Daemon Daemon Using waagent for provisioning Mar 7 01:30:47.525518 waagent[1799]: 2026-03-07T01:30:47.525468Z INFO Daemon Daemon Activate resource disk Mar 7 01:30:47.529837 waagent[1799]: 2026-03-07T01:30:47.529137Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Mar 7 01:30:47.538575 waagent[1799]: 2026-03-07T01:30:47.538491Z INFO Daemon Daemon Found device: None Mar 7 01:30:47.544564 waagent[1799]: 2026-03-07T01:30:47.542862Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Mar 7 01:30:47.549702 waagent[1799]: 2026-03-07T01:30:47.549646Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Mar 7 01:30:47.562630 waagent[1799]: 2026-03-07T01:30:47.561693Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 7 01:30:47.567642 waagent[1799]: 2026-03-07T01:30:47.567379Z 
INFO Daemon Daemon Running default provisioning handler Mar 7 01:30:47.572645 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:30:47.579101 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 7 01:30:47.581007 (kubelet)[1807]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 01:30:47.584636 systemd[1]: Startup finished in 616ms (kernel) + 15.676s (initrd) + 6.110s (userspace) = 22.403s. Mar 7 01:30:47.607471 waagent[1799]: 2026-03-07T01:30:47.598930Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. Mar 7 01:30:47.610257 waagent[1799]: 2026-03-07T01:30:47.609483Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 7 01:30:47.618603 waagent[1799]: 2026-03-07T01:30:47.618515Z INFO Daemon Daemon cloud-init is enabled: False Mar 7 01:30:47.625065 waagent[1799]: 2026-03-07T01:30:47.624996Z INFO Daemon Daemon Copying ovf-env.xml Mar 7 01:30:47.690330 waagent[1799]: 2026-03-07T01:30:47.690238Z INFO Daemon Daemon Successfully mounted dvd Mar 7 01:30:47.720496 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Mar 7 01:30:47.721900 login[1778]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:47.725216 login[1779]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:47.726454 waagent[1799]: 2026-03-07T01:30:47.726355Z INFO Daemon Daemon Detect protocol endpoint Mar 7 01:30:47.731342 waagent[1799]: 2026-03-07T01:30:47.731210Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 7 01:30:47.739065 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
Mar 7 01:30:47.742838 waagent[1799]: 2026-03-07T01:30:47.742753Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Mar 7 01:30:47.748204 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 7 01:30:47.752916 waagent[1799]: 2026-03-07T01:30:47.748595Z INFO Daemon Daemon Test for route to 168.63.129.16 Mar 7 01:30:47.754675 waagent[1799]: 2026-03-07T01:30:47.754613Z INFO Daemon Daemon Route to 168.63.129.16 exists Mar 7 01:30:47.758913 systemd-logind[1680]: New session 2 of user core. Mar 7 01:30:47.759751 waagent[1799]: 2026-03-07T01:30:47.759449Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Mar 7 01:30:47.771299 systemd-logind[1680]: New session 1 of user core. Mar 7 01:30:47.781704 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 7 01:30:47.789288 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 7 01:30:47.796504 waagent[1799]: 2026-03-07T01:30:47.796457Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Mar 7 01:30:47.802194 (systemd)[1826]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 7 01:30:47.804690 waagent[1799]: 2026-03-07T01:30:47.804116Z INFO Daemon Daemon Wire protocol version:2012-11-30 Mar 7 01:30:47.809596 waagent[1799]: 2026-03-07T01:30:47.808832Z INFO Daemon Daemon Server preferred version:2015-04-05 Mar 7 01:30:47.926481 waagent[1799]: 2026-03-07T01:30:47.926390Z INFO Daemon Daemon Initializing goal state during protocol detection Mar 7 01:30:47.934550 waagent[1799]: 2026-03-07T01:30:47.932835Z INFO Daemon Daemon Forcing an update of the goal state. Mar 7 01:30:47.943697 waagent[1799]: 2026-03-07T01:30:47.943646Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 7 01:30:47.953795 systemd[1826]: Queued start job for default target default.target. 
Mar 7 01:30:47.960004 systemd[1826]: Created slice app.slice - User Application Slice. Mar 7 01:30:47.960034 systemd[1826]: Reached target paths.target - Paths. Mar 7 01:30:47.960047 systemd[1826]: Reached target timers.target - Timers. Mar 7 01:30:47.962706 systemd[1826]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 7 01:30:47.970041 waagent[1799]: 2026-03-07T01:30:47.969989Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.179 Mar 7 01:30:47.973027 systemd[1826]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 7 01:30:47.973893 systemd[1826]: Reached target sockets.target - Sockets. Mar 7 01:30:47.974002 systemd[1826]: Reached target basic.target - Basic System. Mar 7 01:30:47.974043 systemd[1826]: Reached target default.target - Main User Target. Mar 7 01:30:47.974068 systemd[1826]: Startup finished in 162ms. Mar 7 01:30:47.974618 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 7 01:30:47.977411 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 7 01:30:47.978123 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 7 01:30:47.980624 waagent[1799]: 2026-03-07T01:30:47.980047Z INFO Daemon Mar 7 01:30:47.992428 waagent[1799]: 2026-03-07T01:30:47.992359Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 23fa307a-9465-4300-ad3a-8e7371ce09c6 eTag: 788234576005281845 source: Fabric] Mar 7 01:30:48.003785 waagent[1799]: 2026-03-07T01:30:48.003688Z INFO Daemon The vmSettings originated via Fabric; will ignore them. 
Mar 7 01:30:48.011857 waagent[1799]: 2026-03-07T01:30:48.011801Z INFO Daemon Mar 7 01:30:48.014365 waagent[1799]: 2026-03-07T01:30:48.014317Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Mar 7 01:30:48.027611 waagent[1799]: 2026-03-07T01:30:48.026593Z INFO Daemon Daemon Downloading artifacts profile blob Mar 7 01:30:48.113882 waagent[1799]: 2026-03-07T01:30:48.112767Z INFO Daemon Downloaded certificate {'thumbprint': '2183B234E3DC53FE97E15936CE8638E960F0910C', 'hasPrivateKey': True} Mar 7 01:30:48.121675 waagent[1799]: 2026-03-07T01:30:48.121515Z INFO Daemon Fetch goal state completed Mar 7 01:30:48.128147 kubelet[1807]: E0307 01:30:48.128102 1807 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 01:30:48.130873 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 01:30:48.131015 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 01:30:48.132137 waagent[1799]: 2026-03-07T01:30:48.132093Z INFO Daemon Daemon Starting provisioning Mar 7 01:30:48.136904 waagent[1799]: 2026-03-07T01:30:48.136835Z INFO Daemon Daemon Handle ovf-env.xml. 
Mar 7 01:30:48.140906 waagent[1799]: 2026-03-07T01:30:48.140853Z INFO Daemon Daemon Set hostname [ci-4081.3.6-n-3151c5d0e2] Mar 7 01:30:48.151358 waagent[1799]: 2026-03-07T01:30:48.151301Z INFO Daemon Daemon Publish hostname [ci-4081.3.6-n-3151c5d0e2] Mar 7 01:30:48.156320 waagent[1799]: 2026-03-07T01:30:48.156270Z INFO Daemon Daemon Examine /proc/net/route for primary interface Mar 7 01:30:48.161591 waagent[1799]: 2026-03-07T01:30:48.161514Z INFO Daemon Daemon Primary interface is [eth0] Mar 7 01:30:48.176801 systemd-networkd[1617]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:30:48.176808 systemd-networkd[1617]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 01:30:48.176840 systemd-networkd[1617]: eth0: DHCP lease lost Mar 7 01:30:48.178149 waagent[1799]: 2026-03-07T01:30:48.178077Z INFO Daemon Daemon Create user account if not exists Mar 7 01:30:48.183088 waagent[1799]: 2026-03-07T01:30:48.183042Z INFO Daemon Daemon User core already exists, skip useradd Mar 7 01:30:48.187887 waagent[1799]: 2026-03-07T01:30:48.187844Z INFO Daemon Daemon Configure sudoer Mar 7 01:30:48.191830 waagent[1799]: 2026-03-07T01:30:48.191784Z INFO Daemon Daemon Configure sshd Mar 7 01:30:48.195413 waagent[1799]: 2026-03-07T01:30:48.195368Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Mar 7 01:30:48.195594 systemd-networkd[1617]: eth0: DHCPv6 lease lost Mar 7 01:30:48.205350 waagent[1799]: 2026-03-07T01:30:48.205293Z INFO Daemon Daemon Deploy ssh public key. 
Mar 7 01:30:48.218605 systemd-networkd[1617]: eth0: DHCPv4 address 10.200.20.32/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 7 01:30:49.280989 waagent[1799]: 2026-03-07T01:30:49.280942Z INFO Daemon Daemon Provisioning complete Mar 7 01:30:49.296430 waagent[1799]: 2026-03-07T01:30:49.296383Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Mar 7 01:30:49.301149 waagent[1799]: 2026-03-07T01:30:49.301102Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Mar 7 01:30:49.308727 waagent[1799]: 2026-03-07T01:30:49.308687Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.9.1.1 is the most current agent Mar 7 01:30:49.436244 waagent[1870]: 2026-03-07T01:30:49.435624Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.9.1.1) Mar 7 01:30:49.436244 waagent[1870]: 2026-03-07T01:30:49.435767Z INFO ExtHandler ExtHandler OS: flatcar 4081.3.6 Mar 7 01:30:49.436244 waagent[1870]: 2026-03-07T01:30:49.435818Z INFO ExtHandler ExtHandler Python: 3.11.9 Mar 7 01:30:49.448504 waagent[1870]: 2026-03-07T01:30:49.448430Z INFO ExtHandler ExtHandler Distro: flatcar-4081.3.6; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.9; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1; Mar 7 01:30:49.448861 waagent[1870]: 2026-03-07T01:30:49.448819Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 7 01:30:49.449042 waagent[1870]: 2026-03-07T01:30:49.449007Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 7 01:30:49.456417 waagent[1870]: 2026-03-07T01:30:49.456358Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 7 01:30:49.461801 waagent[1870]: 2026-03-07T01:30:49.461757Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.179 Mar 7 01:30:49.463570 waagent[1870]: 2026-03-07T01:30:49.462365Z INFO ExtHandler Mar 7 01:30:49.463570 waagent[1870]: 2026-03-07T01:30:49.462441Z INFO 
ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: c1783e09-b2a2-4b7c-9def-6ad7158c4b2d eTag: 788234576005281845 source: Fabric] Mar 7 01:30:49.463570 waagent[1870]: 2026-03-07T01:30:49.462722Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Mar 7 01:30:49.463570 waagent[1870]: 2026-03-07T01:30:49.463259Z INFO ExtHandler Mar 7 01:30:49.463570 waagent[1870]: 2026-03-07T01:30:49.463328Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Mar 7 01:30:49.466788 waagent[1870]: 2026-03-07T01:30:49.466757Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 7 01:30:49.541620 waagent[1870]: 2026-03-07T01:30:49.541483Z INFO ExtHandler Downloaded certificate {'thumbprint': '2183B234E3DC53FE97E15936CE8638E960F0910C', 'hasPrivateKey': True} Mar 7 01:30:49.542276 waagent[1870]: 2026-03-07T01:30:49.542236Z INFO ExtHandler Fetch goal state completed Mar 7 01:30:49.556504 waagent[1870]: 2026-03-07T01:30:49.556450Z INFO ExtHandler ExtHandler WALinuxAgent-2.9.1.1 running as process 1870 Mar 7 01:30:49.556764 waagent[1870]: 2026-03-07T01:30:49.556729Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Mar 7 01:30:49.558374 waagent[1870]: 2026-03-07T01:30:49.558334Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['flatcar', '4081.3.6', '', 'Flatcar Container Linux by Kinvolk'] Mar 7 01:30:49.558871 waagent[1870]: 2026-03-07T01:30:49.558833Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Mar 7 01:30:49.569405 waagent[1870]: 2026-03-07T01:30:49.569364Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 7 01:30:49.569615 waagent[1870]: 2026-03-07T01:30:49.569578Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 7 01:30:49.575598 waagent[1870]: 
2026-03-07T01:30:49.575534Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Mar 7 01:30:49.581718 systemd[1]: Reloading requested from client PID 1883 ('systemctl') (unit waagent.service)... Mar 7 01:30:49.581733 systemd[1]: Reloading... Mar 7 01:30:49.663571 zram_generator::config[1920]: No configuration found. Mar 7 01:30:49.761054 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:30:49.839596 systemd[1]: Reloading finished in 257 ms. Mar 7 01:30:49.864580 waagent[1870]: 2026-03-07T01:30:49.863735Z INFO ExtHandler ExtHandler Executing systemctl daemon-reload for setting up waagent-network-setup.service Mar 7 01:30:49.870537 systemd[1]: Reloading requested from client PID 1971 ('systemctl') (unit waagent.service)... Mar 7 01:30:49.870558 systemd[1]: Reloading... Mar 7 01:30:49.949575 zram_generator::config[2003]: No configuration found. Mar 7 01:30:50.063533 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:30:50.138463 systemd[1]: Reloading finished in 267 ms. Mar 7 01:30:50.165026 waagent[1870]: 2026-03-07T01:30:50.164217Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Mar 7 01:30:50.165026 waagent[1870]: 2026-03-07T01:30:50.164390Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Mar 7 01:30:50.278673 waagent[1870]: 2026-03-07T01:30:50.278593Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. 
Mar 7 01:30:50.279217 waagent[1870]: 2026-03-07T01:30:50.279176Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: configuration enabled [True], cgroups enabled [False], python supported: [True] Mar 7 01:30:50.279967 waagent[1870]: 2026-03-07T01:30:50.279895Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 7 01:30:50.280306 waagent[1870]: 2026-03-07T01:30:50.280229Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Mar 7 01:30:50.281210 waagent[1870]: 2026-03-07T01:30:50.280503Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 7 01:30:50.281210 waagent[1870]: 2026-03-07T01:30:50.280608Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 7 01:30:50.281210 waagent[1870]: 2026-03-07T01:30:50.280802Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Mar 7 01:30:50.281210 waagent[1870]: 2026-03-07T01:30:50.280965Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 7 01:30:50.281210 waagent[1870]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 7 01:30:50.281210 waagent[1870]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Mar 7 01:30:50.281210 waagent[1870]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 7 01:30:50.281210 waagent[1870]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 7 01:30:50.281210 waagent[1870]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 7 01:30:50.281210 waagent[1870]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 7 01:30:50.281605 waagent[1870]: 2026-03-07T01:30:50.281520Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 7 01:30:50.281690 waagent[1870]: 2026-03-07T01:30:50.281644Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Mar 7 01:30:50.281817 waagent[1870]: 2026-03-07T01:30:50.281781Z INFO ExtHandler 
ExtHandler Start Extension Telemetry service. Mar 7 01:30:50.282001 waagent[1870]: 2026-03-07T01:30:50.281961Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 7 01:30:50.282293 waagent[1870]: 2026-03-07T01:30:50.282215Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 7 01:30:50.282337 waagent[1870]: 2026-03-07T01:30:50.282285Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Mar 7 01:30:50.282672 waagent[1870]: 2026-03-07T01:30:50.282615Z INFO EnvHandler ExtHandler Configure routes Mar 7 01:30:50.283057 waagent[1870]: 2026-03-07T01:30:50.283015Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 7 01:30:50.283154 waagent[1870]: 2026-03-07T01:30:50.283125Z INFO EnvHandler ExtHandler Gateway:None Mar 7 01:30:50.284649 waagent[1870]: 2026-03-07T01:30:50.284614Z INFO EnvHandler ExtHandler Routes:None Mar 7 01:30:50.291777 waagent[1870]: 2026-03-07T01:30:50.291724Z INFO ExtHandler ExtHandler Mar 7 01:30:50.291872 waagent[1870]: 2026-03-07T01:30:50.291836Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 132a23c9-714b-4296-99b8-1cf57de1e4e8 correlation ab76c02d-ba77-4399-9411-bdfe49e18fc4 created: 2026-03-07T01:30:07.070335Z] Mar 7 01:30:50.292250 waagent[1870]: 2026-03-07T01:30:50.292196Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 7 01:30:50.292830 waagent[1870]: 2026-03-07T01:30:50.292786Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms] Mar 7 01:30:50.301983 waagent[1870]: 2026-03-07T01:30:50.301623Z INFO MonitorHandler ExtHandler Network interfaces: Mar 7 01:30:50.301983 waagent[1870]: Executing ['ip', '-a', '-o', 'link']: Mar 7 01:30:50.301983 waagent[1870]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 7 01:30:50.301983 waagent[1870]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:c6:d8:cf brd ff:ff:ff:ff:ff:ff Mar 7 01:30:50.301983 waagent[1870]: 3: enP36190s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:c6:d8:cf brd ff:ff:ff:ff:ff:ff\ altname enP36190p0s2 Mar 7 01:30:50.301983 waagent[1870]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 7 01:30:50.301983 waagent[1870]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 7 01:30:50.301983 waagent[1870]: 2: eth0 inet 10.200.20.32/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 7 01:30:50.301983 waagent[1870]: Executing ['ip', '-6', '-a', '-o', 'address']: Mar 7 01:30:50.301983 waagent[1870]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Mar 7 01:30:50.301983 waagent[1870]: 2: eth0 inet6 fe80::7eed:8dff:fec6:d8cf/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Mar 7 01:30:50.328255 waagent[1870]: 2026-03-07T01:30:50.328206Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.9.1.1 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: E3521F23-6EE0-4C9E-9628-F74C7CE4783C;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 0] Mar 7 01:30:50.345569 waagent[1870]: 2026-03-07T01:30:50.345500Z INFO EnvHandler ExtHandler Successfully added 
Azure fabric firewall rules. Current Firewall rules: Mar 7 01:30:50.345569 waagent[1870]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 7 01:30:50.345569 waagent[1870]: pkts bytes target prot opt in out source destination Mar 7 01:30:50.345569 waagent[1870]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 7 01:30:50.345569 waagent[1870]: pkts bytes target prot opt in out source destination Mar 7 01:30:50.345569 waagent[1870]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 7 01:30:50.345569 waagent[1870]: pkts bytes target prot opt in out source destination Mar 7 01:30:50.345569 waagent[1870]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 7 01:30:50.345569 waagent[1870]: 3 534 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 7 01:30:50.345569 waagent[1870]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 7 01:30:50.348277 waagent[1870]: 2026-03-07T01:30:50.348222Z INFO EnvHandler ExtHandler Current Firewall rules: Mar 7 01:30:50.348277 waagent[1870]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 7 01:30:50.348277 waagent[1870]: pkts bytes target prot opt in out source destination Mar 7 01:30:50.348277 waagent[1870]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 7 01:30:50.348277 waagent[1870]: pkts bytes target prot opt in out source destination Mar 7 01:30:50.348277 waagent[1870]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 7 01:30:50.348277 waagent[1870]: pkts bytes target prot opt in out source destination Mar 7 01:30:50.348277 waagent[1870]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 7 01:30:50.348277 waagent[1870]: 3 534 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 7 01:30:50.348277 waagent[1870]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 7 01:30:50.348509 waagent[1870]: 2026-03-07T01:30:50.348474Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Mar 7 01:30:53.439498 systemd[1]: Created slice 
system-sshd.slice - Slice /system/sshd. Mar 7 01:30:53.440600 systemd[1]: Started sshd@0-10.200.20.32:22-10.200.16.10:36540.service - OpenSSH per-connection server daemon (10.200.16.10:36540). Mar 7 01:30:53.941509 sshd[2091]: Accepted publickey for core from 10.200.16.10 port 36540 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:30:53.942805 sshd[2091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:53.946339 systemd-logind[1680]: New session 3 of user core. Mar 7 01:30:53.953668 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 7 01:30:54.379683 systemd[1]: Started sshd@1-10.200.20.32:22-10.200.16.10:36554.service - OpenSSH per-connection server daemon (10.200.16.10:36554). Mar 7 01:30:54.872312 sshd[2096]: Accepted publickey for core from 10.200.16.10 port 36554 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:30:54.873120 sshd[2096]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:54.876709 systemd-logind[1680]: New session 4 of user core. Mar 7 01:30:54.883677 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 7 01:30:55.224954 sshd[2096]: pam_unix(sshd:session): session closed for user core Mar 7 01:30:55.228643 systemd[1]: sshd@1-10.200.20.32:22-10.200.16.10:36554.service: Deactivated successfully. Mar 7 01:30:55.230275 systemd[1]: session-4.scope: Deactivated successfully. Mar 7 01:30:55.231064 systemd-logind[1680]: Session 4 logged out. Waiting for processes to exit. Mar 7 01:30:55.232031 systemd-logind[1680]: Removed session 4. Mar 7 01:30:55.313250 systemd[1]: Started sshd@2-10.200.20.32:22-10.200.16.10:36560.service - OpenSSH per-connection server daemon (10.200.16.10:36560). 
Mar 7 01:30:55.805405 sshd[2103]: Accepted publickey for core from 10.200.16.10 port 36560 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:30:55.806216 sshd[2103]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:55.810630 systemd-logind[1680]: New session 5 of user core. Mar 7 01:30:55.817745 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 7 01:30:56.152696 sshd[2103]: pam_unix(sshd:session): session closed for user core Mar 7 01:30:56.156614 systemd[1]: sshd@2-10.200.20.32:22-10.200.16.10:36560.service: Deactivated successfully. Mar 7 01:30:56.158221 systemd[1]: session-5.scope: Deactivated successfully. Mar 7 01:30:56.159005 systemd-logind[1680]: Session 5 logged out. Waiting for processes to exit. Mar 7 01:30:56.159829 systemd-logind[1680]: Removed session 5. Mar 7 01:30:56.239340 systemd[1]: Started sshd@3-10.200.20.32:22-10.200.16.10:36570.service - OpenSSH per-connection server daemon (10.200.16.10:36570). Mar 7 01:30:56.726857 sshd[2110]: Accepted publickey for core from 10.200.16.10 port 36570 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:30:56.728176 sshd[2110]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:56.732789 systemd-logind[1680]: New session 6 of user core. Mar 7 01:30:56.738718 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 7 01:30:57.077322 sshd[2110]: pam_unix(sshd:session): session closed for user core Mar 7 01:30:57.080907 systemd[1]: sshd@3-10.200.20.32:22-10.200.16.10:36570.service: Deactivated successfully. Mar 7 01:30:57.082707 systemd[1]: session-6.scope: Deactivated successfully. Mar 7 01:30:57.085001 systemd-logind[1680]: Session 6 logged out. Waiting for processes to exit. Mar 7 01:30:57.085854 systemd-logind[1680]: Removed session 6. 
Mar 7 01:30:57.164880 systemd[1]: Started sshd@4-10.200.20.32:22-10.200.16.10:36584.service - OpenSSH per-connection server daemon (10.200.16.10:36584). Mar 7 01:30:57.654885 sshd[2117]: Accepted publickey for core from 10.200.16.10 port 36584 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:30:57.655726 sshd[2117]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:57.659808 systemd-logind[1680]: New session 7 of user core. Mar 7 01:30:57.662689 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 7 01:30:57.955772 sudo[2120]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 7 01:30:57.956037 sudo[2120]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 01:30:57.971284 sudo[2120]: pam_unix(sudo:session): session closed for user root Mar 7 01:30:58.049072 sshd[2117]: pam_unix(sshd:session): session closed for user core Mar 7 01:30:58.052817 systemd[1]: sshd@4-10.200.20.32:22-10.200.16.10:36584.service: Deactivated successfully. Mar 7 01:30:58.054338 systemd[1]: session-7.scope: Deactivated successfully. Mar 7 01:30:58.054945 systemd-logind[1680]: Session 7 logged out. Waiting for processes to exit. Mar 7 01:30:58.056121 systemd-logind[1680]: Removed session 7. Mar 7 01:30:58.135220 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 7 01:30:58.136681 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:30:58.139845 systemd[1]: Started sshd@5-10.200.20.32:22-10.200.16.10:36596.service - OpenSSH per-connection server daemon (10.200.16.10:36596). Mar 7 01:30:58.261880 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 7 01:30:58.271804 (kubelet)[2135]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 01:30:58.321031 kubelet[2135]: E0307 01:30:58.320973 2135 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 01:30:58.324412 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 01:30:58.324671 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 01:30:58.629761 sshd[2126]: Accepted publickey for core from 10.200.16.10 port 36596 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:30:58.631449 sshd[2126]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:58.636118 systemd-logind[1680]: New session 8 of user core. Mar 7 01:30:58.637712 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 7 01:30:58.905645 sudo[2144]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 7 01:30:58.906333 sudo[2144]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 01:30:58.909277 sudo[2144]: pam_unix(sudo:session): session closed for user root Mar 7 01:30:58.913984 sudo[2143]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Mar 7 01:30:58.914252 sudo[2143]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 01:30:58.926786 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Mar 7 01:30:58.928322 auditctl[2147]: No rules Mar 7 01:30:58.929407 systemd[1]: audit-rules.service: Deactivated successfully. 
Mar 7 01:30:58.929641 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Mar 7 01:30:58.932167 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Mar 7 01:30:58.956731 augenrules[2165]: No rules Mar 7 01:30:58.958205 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Mar 7 01:30:58.960963 sudo[2143]: pam_unix(sudo:session): session closed for user root Mar 7 01:30:59.038574 sshd[2126]: pam_unix(sshd:session): session closed for user core Mar 7 01:30:59.042382 systemd[1]: sshd@5-10.200.20.32:22-10.200.16.10:36596.service: Deactivated successfully. Mar 7 01:30:59.044504 systemd[1]: session-8.scope: Deactivated successfully. Mar 7 01:30:59.045247 systemd-logind[1680]: Session 8 logged out. Waiting for processes to exit. Mar 7 01:30:59.046013 systemd-logind[1680]: Removed session 8. Mar 7 01:30:59.135978 systemd[1]: Started sshd@6-10.200.20.32:22-10.200.16.10:36604.service - OpenSSH per-connection server daemon (10.200.16.10:36604). Mar 7 01:30:59.632388 sshd[2173]: Accepted publickey for core from 10.200.16.10 port 36604 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:30:59.633158 sshd[2173]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:30:59.637592 systemd-logind[1680]: New session 9 of user core. Mar 7 01:30:59.643677 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 7 01:30:59.905228 sudo[2176]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 7 01:30:59.906072 sudo[2176]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 01:31:00.768754 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Mar 7 01:31:00.768894 (dockerd)[2192]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 7 01:31:01.003890 dockerd[2192]: time="2026-03-07T01:31:01.003834700Z" level=info msg="Starting up" Mar 7 01:31:01.110230 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport660842714-merged.mount: Deactivated successfully. Mar 7 01:31:01.195958 dockerd[2192]: time="2026-03-07T01:31:01.195911900Z" level=info msg="Loading containers: start." Mar 7 01:31:01.293566 kernel: Initializing XFRM netlink socket Mar 7 01:31:01.359412 systemd-networkd[1617]: docker0: Link UP Mar 7 01:31:01.381178 dockerd[2192]: time="2026-03-07T01:31:01.380531740Z" level=info msg="Loading containers: done." Mar 7 01:31:01.391892 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2603793534-merged.mount: Deactivated successfully. Mar 7 01:31:01.399049 dockerd[2192]: time="2026-03-07T01:31:01.398940180Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 7 01:31:01.399220 dockerd[2192]: time="2026-03-07T01:31:01.399170660Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Mar 7 01:31:01.399525 dockerd[2192]: time="2026-03-07T01:31:01.399344460Z" level=info msg="Daemon has completed initialization" Mar 7 01:31:01.457770 dockerd[2192]: time="2026-03-07T01:31:01.457696740Z" level=info msg="API listen on /run/docker.sock" Mar 7 01:31:01.458191 systemd[1]: Started docker.service - Docker Application Container Engine. 
Mar 7 01:31:01.807628 containerd[1718]: time="2026-03-07T01:31:01.807591980Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\"" Mar 7 01:31:02.767793 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount773279612.mount: Deactivated successfully. Mar 7 01:31:04.406574 containerd[1718]: time="2026-03-07T01:31:04.406215460Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:04.408955 containerd[1718]: time="2026-03-07T01:31:04.408920980Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.2: active requests=0, bytes read=24701796" Mar 7 01:31:04.411715 containerd[1718]: time="2026-03-07T01:31:04.411675940Z" level=info msg="ImageCreate event name:\"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:04.416437 containerd[1718]: time="2026-03-07T01:31:04.416355180Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:04.417664 containerd[1718]: time="2026-03-07T01:31:04.417438780Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.2\" with image id \"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\", size \"24698395\" in 2.60981012s" Mar 7 01:31:04.417664 containerd[1718]: time="2026-03-07T01:31:04.417471780Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\" returns image reference \"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\"" Mar 7 01:31:04.418200 containerd[1718]: time="2026-03-07T01:31:04.418083620Z" 
level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\"" Mar 7 01:31:06.003580 containerd[1718]: time="2026-03-07T01:31:06.002955580Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:06.007166 containerd[1718]: time="2026-03-07T01:31:06.007136100Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.2: active requests=0, bytes read=19063039" Mar 7 01:31:06.009874 containerd[1718]: time="2026-03-07T01:31:06.009845620Z" level=info msg="ImageCreate event name:\"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:06.014992 containerd[1718]: time="2026-03-07T01:31:06.014963380Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:06.016097 containerd[1718]: time="2026-03-07T01:31:06.016070740Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.2\" with image id \"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\", size \"20675140\" in 1.59773056s" Mar 7 01:31:06.016132 containerd[1718]: time="2026-03-07T01:31:06.016102380Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\" returns image reference \"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\"" Mar 7 01:31:06.017031 containerd[1718]: time="2026-03-07T01:31:06.017004980Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\"" Mar 7 01:31:07.273969 containerd[1718]: 
time="2026-03-07T01:31:07.273908780Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:07.276561 containerd[1718]: time="2026-03-07T01:31:07.276526020Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.2: active requests=0, bytes read=13797901" Mar 7 01:31:07.280179 containerd[1718]: time="2026-03-07T01:31:07.280153500Z" level=info msg="ImageCreate event name:\"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:07.285450 containerd[1718]: time="2026-03-07T01:31:07.285411860Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:07.287554 containerd[1718]: time="2026-03-07T01:31:07.287519660Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.2\" with image id \"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\", size \"15410020\" in 1.27048024s" Mar 7 01:31:07.287587 containerd[1718]: time="2026-03-07T01:31:07.287565700Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\" returns image reference \"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\"" Mar 7 01:31:07.287975 containerd[1718]: time="2026-03-07T01:31:07.287953380Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\"" Mar 7 01:31:08.430119 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 7 01:31:08.437319 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 7 01:31:08.541854 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount559557785.mount: Deactivated successfully. Mar 7 01:31:08.562978 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:31:08.567345 (kubelet)[2406]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 01:31:08.639029 kubelet[2406]: E0307 01:31:08.638966 2406 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 01:31:08.641755 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 01:31:08.642011 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 01:31:09.127362 containerd[1718]: time="2026-03-07T01:31:09.126718260Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:09.129759 containerd[1718]: time="2026-03-07T01:31:09.129731580Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.2: active requests=0, bytes read=22329583" Mar 7 01:31:09.134073 containerd[1718]: time="2026-03-07T01:31:09.133212540Z" level=info msg="ImageCreate event name:\"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:09.137139 containerd[1718]: time="2026-03-07T01:31:09.137111540Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:09.137722 containerd[1718]: time="2026-03-07T01:31:09.137627100Z" level=info msg="Pulled 
image \"registry.k8s.io/kube-proxy:v1.35.2\" with image id \"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\", repo tag \"registry.k8s.io/kube-proxy:v1.35.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\", size \"22328602\" in 1.84964504s" Mar 7 01:31:09.138275 containerd[1718]: time="2026-03-07T01:31:09.138256460Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\" returns image reference \"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\"" Mar 7 01:31:09.138787 containerd[1718]: time="2026-03-07T01:31:09.138752340Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\"" Mar 7 01:31:09.320300 chronyd[1662]: Selected source PHC0 Mar 7 01:31:09.795026 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1204479987.mount: Deactivated successfully. Mar 7 01:31:11.108580 containerd[1718]: time="2026-03-07T01:31:11.108086879Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:11.110596 containerd[1718]: time="2026-03-07T01:31:11.110568477Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=21172211" Mar 7 01:31:11.114269 containerd[1718]: time="2026-03-07T01:31:11.113799553Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:11.118924 containerd[1718]: time="2026-03-07T01:31:11.118877308Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:11.120222 containerd[1718]: time="2026-03-07T01:31:11.120189827Z" level=info msg="Pulled image 
\"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"21168808\" in 1.981292887s" Mar 7 01:31:11.120324 containerd[1718]: time="2026-03-07T01:31:11.120308747Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\"" Mar 7 01:31:11.120910 containerd[1718]: time="2026-03-07T01:31:11.120881146Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 7 01:31:11.692518 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2873332886.mount: Deactivated successfully. Mar 7 01:31:11.710129 containerd[1718]: time="2026-03-07T01:31:11.710073659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:11.712983 containerd[1718]: time="2026-03-07T01:31:11.712800577Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709" Mar 7 01:31:11.716158 containerd[1718]: time="2026-03-07T01:31:11.715762613Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:11.719964 containerd[1718]: time="2026-03-07T01:31:11.719935929Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:11.720743 containerd[1718]: time="2026-03-07T01:31:11.720711768Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id 
\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 599.722102ms" Mar 7 01:31:11.720807 containerd[1718]: time="2026-03-07T01:31:11.720743808Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Mar 7 01:31:11.721182 containerd[1718]: time="2026-03-07T01:31:11.721138768Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\"" Mar 7 01:31:12.828304 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2387144439.mount: Deactivated successfully. Mar 7 01:31:13.856576 containerd[1718]: time="2026-03-07T01:31:13.856236318Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:13.858585 containerd[1718]: time="2026-03-07T01:31:13.858560396Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=21738165" Mar 7 01:31:13.862598 containerd[1718]: time="2026-03-07T01:31:13.862561312Z" level=info msg="ImageCreate event name:\"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:13.872605 containerd[1718]: time="2026-03-07T01:31:13.871531182Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:13.872605 containerd[1718]: time="2026-03-07T01:31:13.872472701Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest 
\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"21749640\" in 2.151303293s" Mar 7 01:31:13.872605 containerd[1718]: time="2026-03-07T01:31:13.872508541Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\"" Mar 7 01:31:16.358604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:31:16.368787 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:31:16.401585 systemd[1]: Reloading requested from client PID 2564 ('systemctl') (unit session-9.scope)... Mar 7 01:31:16.401746 systemd[1]: Reloading... Mar 7 01:31:16.504571 zram_generator::config[2600]: No configuration found. Mar 7 01:31:16.613510 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:31:16.691037 systemd[1]: Reloading finished in 288 ms. Mar 7 01:31:16.732092 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:31:16.736065 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:31:16.738107 systemd[1]: kubelet.service: Deactivated successfully. Mar 7 01:31:16.738305 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:31:16.742800 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:31:17.375532 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:31:17.380990 (kubelet)[2673]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 01:31:17.412954 kubelet[2673]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 01:31:17.898513 kubelet[2673]: I0307 01:31:17.898447 2673 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 7 01:31:17.898513 kubelet[2673]: I0307 01:31:17.898498 2673 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 01:31:18.265098 kubelet[2673]: I0307 01:31:18.265054 2673 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 7 01:31:18.265098 kubelet[2673]: I0307 01:31:18.265081 2673 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 7 01:31:18.265409 kubelet[2673]: I0307 01:31:18.265393 2673 server.go:951] "Client rotation is on, will bootstrap in background" Mar 7 01:31:19.029942 kubelet[2673]: I0307 01:31:19.029914 2673 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 01:31:19.030652 kubelet[2673]: E0307 01:31:19.030601 2673 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.32:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.32:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 7 01:31:19.036495 kubelet[2673]: E0307 01:31:19.036426 2673 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 01:31:19.036495 kubelet[2673]: I0307 01:31:19.036490 2673 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Mar 7 01:31:19.040456 kubelet[2673]: I0307 01:31:19.040421 2673 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 7 01:31:19.041562 kubelet[2673]: I0307 01:31:19.041384 2673 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 01:31:19.041562 kubelet[2673]: I0307 01:31:19.041420 2673 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081.3.6-n-3151c5d0e2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 01:31:19.041692 kubelet[2673]: I0307 01:31:19.041573 2673 topology_manager.go:143] "Creating topology manager with none policy" Mar 7 
01:31:19.041692 kubelet[2673]: I0307 01:31:19.041583 2673 container_manager_linux.go:308] "Creating device plugin manager" Mar 7 01:31:19.041692 kubelet[2673]: I0307 01:31:19.041668 2673 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Mar 7 01:31:19.217694 kubelet[2673]: I0307 01:31:19.217663 2673 state_mem.go:41] "Initialized" logger="CPUManager state memory" Mar 7 01:31:19.217857 kubelet[2673]: I0307 01:31:19.217837 2673 kubelet.go:482] "Attempting to sync node with API server" Mar 7 01:31:19.217857 kubelet[2673]: I0307 01:31:19.217853 2673 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 01:31:19.217999 kubelet[2673]: I0307 01:31:19.217869 2673 kubelet.go:394] "Adding apiserver pod source" Mar 7 01:31:19.217999 kubelet[2673]: I0307 01:31:19.217879 2673 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 01:31:19.221880 kubelet[2673]: I0307 01:31:19.221842 2673 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 01:31:19.223568 kubelet[2673]: I0307 01:31:19.222814 2673 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 01:31:19.223568 kubelet[2673]: I0307 01:31:19.222851 2673 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 7 01:31:19.223568 kubelet[2673]: W0307 01:31:19.222890 2673 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 7 01:31:19.225421 kubelet[2673]: I0307 01:31:19.225359 2673 server.go:1257] "Started kubelet" Mar 7 01:31:19.226717 kubelet[2673]: I0307 01:31:19.226665 2673 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 01:31:19.230197 kubelet[2673]: I0307 01:31:19.230165 2673 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Mar 7 01:31:19.231572 kubelet[2673]: I0307 01:31:19.231376 2673 server.go:317] "Adding debug handlers to kubelet server" Mar 7 01:31:19.232164 kubelet[2673]: I0307 01:31:19.226810 2673 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 01:31:19.232229 kubelet[2673]: I0307 01:31:19.232176 2673 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 7 01:31:19.232646 kubelet[2673]: I0307 01:31:19.232341 2673 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 01:31:19.233081 kubelet[2673]: I0307 01:31:19.233059 2673 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 01:31:19.233511 kubelet[2673]: E0307 01:31:19.232457 2673 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.32:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.32:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081.3.6-n-3151c5d0e2.189a6b068e506ea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081.3.6-n-3151c5d0e2,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081.3.6-n-3151c5d0e2,},FirstTimestamp:2026-03-07 01:31:19.225323169 +0000 UTC m=+1.841499753,LastTimestamp:2026-03-07 01:31:19.225323169 +0000 UTC m=+1.841499753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081.3.6-n-3151c5d0e2,}" Mar 7 01:31:19.235103 kubelet[2673]: E0307 01:31:19.233901 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:19.235103 kubelet[2673]: I0307 01:31:19.233953 2673 volume_manager.go:311] "Starting Kubelet Volume Manager" Mar 7 01:31:19.235103 kubelet[2673]: I0307 01:31:19.234096 2673 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 7 01:31:19.235103 kubelet[2673]: I0307 01:31:19.234144 2673 reconciler.go:29] "Reconciler: start to sync state" Mar 7 01:31:19.236527 kubelet[2673]: I0307 01:31:19.236494 2673 factory.go:223] Registration of the systemd container factory successfully Mar 7 01:31:19.236650 kubelet[2673]: I0307 01:31:19.236602 2673 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 01:31:19.237909 kubelet[2673]: E0307 01:31:19.237889 2673 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 01:31:19.238103 kubelet[2673]: E0307 01:31:19.238081 2673 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081.3.6-n-3151c5d0e2?timeout=10s\": dial tcp 10.200.20.32:6443: connect: connection refused" interval="200ms" Mar 7 01:31:19.239003 kubelet[2673]: I0307 01:31:19.238986 2673 factory.go:223] Registration of the containerd container factory successfully Mar 7 01:31:19.256988 kubelet[2673]: I0307 01:31:19.256952 2673 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Mar 7 01:31:19.258319 kubelet[2673]: I0307 01:31:19.258299 2673 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 7 01:31:19.258417 kubelet[2673]: I0307 01:31:19.258408 2673 status_manager.go:249] "Starting to sync pod status with apiserver" Mar 7 01:31:19.258480 kubelet[2673]: I0307 01:31:19.258472 2673 kubelet.go:2501] "Starting kubelet main sync loop" Mar 7 01:31:19.258630 kubelet[2673]: E0307 01:31:19.258613 2673 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 01:31:19.286964 kubelet[2673]: I0307 01:31:19.286653 2673 cpu_manager.go:225] "Starting" policy="none" Mar 7 01:31:19.286964 kubelet[2673]: I0307 01:31:19.286669 2673 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 7 01:31:19.286964 kubelet[2673]: I0307 01:31:19.286689 2673 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Mar 7 01:31:19.292532 kubelet[2673]: I0307 01:31:19.292512 2673 policy_none.go:50] "Start" Mar 7 01:31:19.292806 kubelet[2673]: I0307 01:31:19.292672 2673 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 7 01:31:19.292806 kubelet[2673]: I0307 01:31:19.292689 2673 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 7 01:31:19.299526 kubelet[2673]: I0307 01:31:19.299011 2673 policy_none.go:44] "Start" Mar 7 01:31:19.303121 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 7 01:31:19.315657 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 7 01:31:19.319009 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 7 01:31:19.326577 kubelet[2673]: E0307 01:31:19.326354 2673 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 01:31:19.326577 kubelet[2673]: I0307 01:31:19.326571 2673 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 7 01:31:19.326716 kubelet[2673]: I0307 01:31:19.326582 2673 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 01:31:19.327212 kubelet[2673]: I0307 01:31:19.327136 2673 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 7 01:31:19.328597 kubelet[2673]: E0307 01:31:19.328574 2673 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 01:31:19.328654 kubelet[2673]: E0307 01:31:19.328611 2673 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:19.371007 systemd[1]: Created slice kubepods-burstable-pod43737e925deaad2f2c24321e567967ad.slice - libcontainer container kubepods-burstable-pod43737e925deaad2f2c24321e567967ad.slice. Mar 7 01:31:19.377429 kubelet[2673]: E0307 01:31:19.377246 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.382093 systemd[1]: Created slice kubepods-burstable-pod522f6aae96647f1a7eee523613005ced.slice - libcontainer container kubepods-burstable-pod522f6aae96647f1a7eee523613005ced.slice. 
Mar 7 01:31:19.392612 kubelet[2673]: E0307 01:31:19.392580 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.395260 systemd[1]: Created slice kubepods-burstable-poda6bcaa92c9c313b14bbb76d46218cdb0.slice - libcontainer container kubepods-burstable-poda6bcaa92c9c313b14bbb76d46218cdb0.slice. Mar 7 01:31:19.397672 kubelet[2673]: E0307 01:31:19.397621 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.428196 kubelet[2673]: I0307 01:31:19.428166 2673 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.428579 kubelet[2673]: E0307 01:31:19.428538 2673 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.200.20.32:6443/api/v1/nodes\": dial tcp 10.200.20.32:6443: connect: connection refused" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.434847 kubelet[2673]: I0307 01:31:19.434822 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-flexvolume-dir\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.434889 kubelet[2673]: I0307 01:31:19.434850 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a6bcaa92c9c313b14bbb76d46218cdb0-ca-certs\") pod \"kube-apiserver-ci-4081.3.6-n-3151c5d0e2\" (UID: \"a6bcaa92c9c313b14bbb76d46218cdb0\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.434889 kubelet[2673]: 
I0307 01:31:19.434868 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a6bcaa92c9c313b14bbb76d46218cdb0-k8s-certs\") pod \"kube-apiserver-ci-4081.3.6-n-3151c5d0e2\" (UID: \"a6bcaa92c9c313b14bbb76d46218cdb0\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.434889 kubelet[2673]: I0307 01:31:19.434883 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a6bcaa92c9c313b14bbb76d46218cdb0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081.3.6-n-3151c5d0e2\" (UID: \"a6bcaa92c9c313b14bbb76d46218cdb0\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.434958 kubelet[2673]: I0307 01:31:19.434898 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-ca-certs\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.434958 kubelet[2673]: I0307 01:31:19.434917 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-k8s-certs\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.434958 kubelet[2673]: I0307 01:31:19.434930 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-kubeconfig\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: 
\"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.434958 kubelet[2673]: I0307 01:31:19.434946 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.435042 kubelet[2673]: I0307 01:31:19.434963 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/522f6aae96647f1a7eee523613005ced-kubeconfig\") pod \"kube-scheduler-ci-4081.3.6-n-3151c5d0e2\" (UID: \"522f6aae96647f1a7eee523613005ced\") " pod="kube-system/kube-scheduler-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.438516 kubelet[2673]: E0307 01:31:19.438491 2673 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081.3.6-n-3151c5d0e2?timeout=10s\": dial tcp 10.200.20.32:6443: connect: connection refused" interval="400ms" Mar 7 01:31:19.630487 kubelet[2673]: I0307 01:31:19.630391 2673 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.631260 kubelet[2673]: E0307 01:31:19.630689 2673 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.200.20.32:6443/api/v1/nodes\": dial tcp 10.200.20.32:6443: connect: connection refused" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:19.684420 containerd[1718]: time="2026-03-07T01:31:19.684383980Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4081.3.6-n-3151c5d0e2,Uid:43737e925deaad2f2c24321e567967ad,Namespace:kube-system,Attempt:0,}" Mar 7 01:31:19.698307 containerd[1718]: time="2026-03-07T01:31:19.698266552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081.3.6-n-3151c5d0e2,Uid:522f6aae96647f1a7eee523613005ced,Namespace:kube-system,Attempt:0,}" Mar 7 01:31:19.703278 containerd[1718]: time="2026-03-07T01:31:19.703240622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081.3.6-n-3151c5d0e2,Uid:a6bcaa92c9c313b14bbb76d46218cdb0,Namespace:kube-system,Attempt:0,}" Mar 7 01:31:19.839635 kubelet[2673]: E0307 01:31:19.839590 2673 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081.3.6-n-3151c5d0e2?timeout=10s\": dial tcp 10.200.20.32:6443: connect: connection refused" interval="800ms" Mar 7 01:31:20.032923 kubelet[2673]: I0307 01:31:20.032896 2673 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:20.033250 kubelet[2673]: E0307 01:31:20.033162 2673 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.200.20.32:6443/api/v1/nodes\": dial tcp 10.200.20.32:6443: connect: connection refused" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:20.359815 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4046799745.mount: Deactivated successfully. 
Mar 7 01:31:20.377510 containerd[1718]: time="2026-03-07T01:31:20.377466896Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:31:20.386776 containerd[1718]: time="2026-03-07T01:31:20.386716565Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173" Mar 7 01:31:20.389275 containerd[1718]: time="2026-03-07T01:31:20.389225442Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:31:20.392567 containerd[1718]: time="2026-03-07T01:31:20.392310359Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:31:20.395324 containerd[1718]: time="2026-03-07T01:31:20.394783956Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:31:20.398444 containerd[1718]: time="2026-03-07T01:31:20.398407512Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 01:31:20.404405 containerd[1718]: time="2026-03-07T01:31:20.404363625Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 01:31:20.409799 containerd[1718]: time="2026-03-07T01:31:20.408722060Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:31:20.409799 
containerd[1718]: time="2026-03-07T01:31:20.409243539Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 710.895667ms" Mar 7 01:31:20.410244 containerd[1718]: time="2026-03-07T01:31:20.410216258Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 725.755759ms" Mar 7 01:31:20.411118 containerd[1718]: time="2026-03-07T01:31:20.411088457Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 707.783915ms" Mar 7 01:31:20.619387 containerd[1718]: time="2026-03-07T01:31:20.619220021Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:31:20.619387 containerd[1718]: time="2026-03-07T01:31:20.619272221Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:31:20.619387 containerd[1718]: time="2026-03-07T01:31:20.619292421Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:20.620393 containerd[1718]: time="2026-03-07T01:31:20.619380661Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:20.624285 containerd[1718]: time="2026-03-07T01:31:20.624208255Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:31:20.624285 containerd[1718]: time="2026-03-07T01:31:20.624257095Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:31:20.624399 containerd[1718]: time="2026-03-07T01:31:20.624281415Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:20.624399 containerd[1718]: time="2026-03-07T01:31:20.624347895Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:20.627401 containerd[1718]: time="2026-03-07T01:31:20.626038613Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:31:20.627401 containerd[1718]: time="2026-03-07T01:31:20.626089053Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:31:20.627401 containerd[1718]: time="2026-03-07T01:31:20.626109413Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:20.627401 containerd[1718]: time="2026-03-07T01:31:20.626173253Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:20.642367 kubelet[2673]: E0307 01:31:20.642315 2673 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081.3.6-n-3151c5d0e2?timeout=10s\": dial tcp 10.200.20.32:6443: connect: connection refused" interval="1.6s" Mar 7 01:31:20.645736 systemd[1]: Started cri-containerd-2924db6443abf7e02a3b8ed4127ac8b996f179e412b9fbcfb9d1d69de3ab62aa.scope - libcontainer container 2924db6443abf7e02a3b8ed4127ac8b996f179e412b9fbcfb9d1d69de3ab62aa. Mar 7 01:31:20.652318 systemd[1]: Started cri-containerd-43dd5e2b3a30f6f7c6a78285f530292642461a6b68d533097a6a8037205d4a4b.scope - libcontainer container 43dd5e2b3a30f6f7c6a78285f530292642461a6b68d533097a6a8037205d4a4b. Mar 7 01:31:20.654041 systemd[1]: Started cri-containerd-ecba67de8a1d1d870ffb3b0ece4eb6e30e1c673f814d4da937109e7328e8201a.scope - libcontainer container ecba67de8a1d1d870ffb3b0ece4eb6e30e1c673f814d4da937109e7328e8201a. 
Mar 7 01:31:20.690339 containerd[1718]: time="2026-03-07T01:31:20.689970980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081.3.6-n-3151c5d0e2,Uid:43737e925deaad2f2c24321e567967ad,Namespace:kube-system,Attempt:0,} returns sandbox id \"2924db6443abf7e02a3b8ed4127ac8b996f179e412b9fbcfb9d1d69de3ab62aa\"" Mar 7 01:31:20.703696 containerd[1718]: time="2026-03-07T01:31:20.703077325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081.3.6-n-3151c5d0e2,Uid:a6bcaa92c9c313b14bbb76d46218cdb0,Namespace:kube-system,Attempt:0,} returns sandbox id \"ecba67de8a1d1d870ffb3b0ece4eb6e30e1c673f814d4da937109e7328e8201a\"" Mar 7 01:31:20.704991 containerd[1718]: time="2026-03-07T01:31:20.704877163Z" level=info msg="CreateContainer within sandbox \"2924db6443abf7e02a3b8ed4127ac8b996f179e412b9fbcfb9d1d69de3ab62aa\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 7 01:31:20.708938 containerd[1718]: time="2026-03-07T01:31:20.708655559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081.3.6-n-3151c5d0e2,Uid:522f6aae96647f1a7eee523613005ced,Namespace:kube-system,Attempt:0,} returns sandbox id \"43dd5e2b3a30f6f7c6a78285f530292642461a6b68d533097a6a8037205d4a4b\"" Mar 7 01:31:20.713992 containerd[1718]: time="2026-03-07T01:31:20.713902553Z" level=info msg="CreateContainer within sandbox \"ecba67de8a1d1d870ffb3b0ece4eb6e30e1c673f814d4da937109e7328e8201a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 7 01:31:20.719026 containerd[1718]: time="2026-03-07T01:31:20.719000547Z" level=info msg="CreateContainer within sandbox \"43dd5e2b3a30f6f7c6a78285f530292642461a6b68d533097a6a8037205d4a4b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 7 01:31:20.772774 containerd[1718]: time="2026-03-07T01:31:20.772457286Z" level=info msg="CreateContainer within sandbox 
\"2924db6443abf7e02a3b8ed4127ac8b996f179e412b9fbcfb9d1d69de3ab62aa\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e3aaa2d75eed2b869491ddb5812e37d7e6ab945062593691532ab672e528011d\"" Mar 7 01:31:20.780864 containerd[1718]: time="2026-03-07T01:31:20.780829237Z" level=info msg="StartContainer for \"e3aaa2d75eed2b869491ddb5812e37d7e6ab945062593691532ab672e528011d\"" Mar 7 01:31:20.784008 containerd[1718]: time="2026-03-07T01:31:20.782478435Z" level=info msg="CreateContainer within sandbox \"ecba67de8a1d1d870ffb3b0ece4eb6e30e1c673f814d4da937109e7328e8201a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b0efa8b0d792aa5dcf45718ef62b7d499f3b973fad9faa3a260ff39a0f22632e\"" Mar 7 01:31:20.784008 containerd[1718]: time="2026-03-07T01:31:20.783021514Z" level=info msg="StartContainer for \"b0efa8b0d792aa5dcf45718ef62b7d499f3b973fad9faa3a260ff39a0f22632e\"" Mar 7 01:31:20.790736 containerd[1718]: time="2026-03-07T01:31:20.790697386Z" level=info msg="CreateContainer within sandbox \"43dd5e2b3a30f6f7c6a78285f530292642461a6b68d533097a6a8037205d4a4b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"01cd39ebfc207eecb9793db9170af14abc843ab67f73aea7fcf91d818541d024\"" Mar 7 01:31:20.791689 containerd[1718]: time="2026-03-07T01:31:20.791664345Z" level=info msg="StartContainer for \"01cd39ebfc207eecb9793db9170af14abc843ab67f73aea7fcf91d818541d024\"" Mar 7 01:31:20.800852 kubelet[2673]: E0307 01:31:20.800736 2673 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.32:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.32:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081.3.6-n-3151c5d0e2.189a6b068e506ea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081.3.6-n-3151c5d0e2,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081.3.6-n-3151c5d0e2,},FirstTimestamp:2026-03-07 01:31:19.225323169 +0000 UTC m=+1.841499753,LastTimestamp:2026-03-07 01:31:19.225323169 +0000 UTC m=+1.841499753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081.3.6-n-3151c5d0e2,}" Mar 7 01:31:20.812482 systemd[1]: Started cri-containerd-e3aaa2d75eed2b869491ddb5812e37d7e6ab945062593691532ab672e528011d.scope - libcontainer container e3aaa2d75eed2b869491ddb5812e37d7e6ab945062593691532ab672e528011d. Mar 7 01:31:20.820175 systemd[1]: Started cri-containerd-b0efa8b0d792aa5dcf45718ef62b7d499f3b973fad9faa3a260ff39a0f22632e.scope - libcontainer container b0efa8b0d792aa5dcf45718ef62b7d499f3b973fad9faa3a260ff39a0f22632e. Mar 7 01:31:20.828736 systemd[1]: Started cri-containerd-01cd39ebfc207eecb9793db9170af14abc843ab67f73aea7fcf91d818541d024.scope - libcontainer container 01cd39ebfc207eecb9793db9170af14abc843ab67f73aea7fcf91d818541d024. 
Mar 7 01:31:20.836984 kubelet[2673]: I0307 01:31:20.836953 2673 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:20.837266 kubelet[2673]: E0307 01:31:20.837243 2673 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.200.20.32:6443/api/v1/nodes\": dial tcp 10.200.20.32:6443: connect: connection refused" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:20.880145 containerd[1718]: time="2026-03-07T01:31:20.880038244Z" level=info msg="StartContainer for \"e3aaa2d75eed2b869491ddb5812e37d7e6ab945062593691532ab672e528011d\" returns successfully" Mar 7 01:31:20.880609 containerd[1718]: time="2026-03-07T01:31:20.880216244Z" level=info msg="StartContainer for \"b0efa8b0d792aa5dcf45718ef62b7d499f3b973fad9faa3a260ff39a0f22632e\" returns successfully" Mar 7 01:31:20.889454 containerd[1718]: time="2026-03-07T01:31:20.889337874Z" level=info msg="StartContainer for \"01cd39ebfc207eecb9793db9170af14abc843ab67f73aea7fcf91d818541d024\" returns successfully" Mar 7 01:31:21.269730 kubelet[2673]: E0307 01:31:21.269701 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:21.270727 kubelet[2673]: E0307 01:31:21.270704 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:21.273178 kubelet[2673]: E0307 01:31:21.273156 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:22.252867 kubelet[2673]: E0307 01:31:22.252826 2673 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081.3.6-n-3151c5d0e2\" not found" 
node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:22.276259 kubelet[2673]: E0307 01:31:22.275558 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:22.278824 kubelet[2673]: E0307 01:31:22.278706 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:22.438839 kubelet[2673]: I0307 01:31:22.438812 2673 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:22.452627 kubelet[2673]: I0307 01:31:22.452592 2673 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:22.452627 kubelet[2673]: E0307 01:31:22.452630 2673 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ci-4081.3.6-n-3151c5d0e2\": node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:22.477567 kubelet[2673]: E0307 01:31:22.477050 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:22.577880 kubelet[2673]: E0307 01:31:22.577775 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:22.678408 kubelet[2673]: E0307 01:31:22.678374 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:22.778705 kubelet[2673]: E0307 01:31:22.778667 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:22.879302 kubelet[2673]: E0307 01:31:22.879204 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 
01:31:22.979808 kubelet[2673]: E0307 01:31:22.979770 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.080498 kubelet[2673]: E0307 01:31:23.080443 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.181192 kubelet[2673]: E0307 01:31:23.181163 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.276873 kubelet[2673]: E0307 01:31:23.276663 2673 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:23.281401 kubelet[2673]: E0307 01:31:23.281372 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.382413 kubelet[2673]: E0307 01:31:23.382378 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.483487 kubelet[2673]: E0307 01:31:23.483197 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.584089 kubelet[2673]: E0307 01:31:23.584036 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.684571 kubelet[2673]: E0307 01:31:23.684522 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.785384 kubelet[2673]: E0307 01:31:23.785040 2673 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:23.836561 kubelet[2673]: I0307 01:31:23.836514 2673 
kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:23.847188 kubelet[2673]: I0307 01:31:23.846987 2673 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:23.848228 kubelet[2673]: I0307 01:31:23.847957 2673 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:23.852570 kubelet[2673]: I0307 01:31:23.852198 2673 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:23.852570 kubelet[2673]: I0307 01:31:23.852263 2673 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:23.860659 kubelet[2673]: I0307 01:31:23.860464 2673 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:24.222859 kubelet[2673]: I0307 01:31:24.222832 2673 apiserver.go:52] "Watching apiserver" Mar 7 01:31:24.235116 kubelet[2673]: I0307 01:31:24.235086 2673 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 7 01:31:24.326094 systemd[1]: Reloading requested from client PID 2954 ('systemctl') (unit session-9.scope)... Mar 7 01:31:24.326110 systemd[1]: Reloading... Mar 7 01:31:24.422589 zram_generator::config[2994]: No configuration found. Mar 7 01:31:24.548911 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:31:24.638824 systemd[1]: Reloading finished in 312 ms. 
Mar 7 01:31:24.673076 kubelet[2673]: I0307 01:31:24.673024 2673 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 01:31:24.673701 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:31:24.687519 systemd[1]: kubelet.service: Deactivated successfully. Mar 7 01:31:24.687779 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:31:24.693850 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:31:24.879038 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:31:24.890915 (kubelet)[3058]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 01:31:24.936346 kubelet[3058]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 01:31:24.944571 kubelet[3058]: I0307 01:31:24.943938 3058 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 7 01:31:24.944571 kubelet[3058]: I0307 01:31:24.943988 3058 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 01:31:24.944571 kubelet[3058]: I0307 01:31:24.944013 3058 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 7 01:31:24.944571 kubelet[3058]: I0307 01:31:24.944019 3058 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 7 01:31:24.947141 kubelet[3058]: I0307 01:31:24.947117 3058 server.go:951] "Client rotation is on, will bootstrap in background" Mar 7 01:31:24.948343 kubelet[3058]: I0307 01:31:24.948326 3058 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 7 01:31:24.951137 kubelet[3058]: I0307 01:31:24.951002 3058 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 01:31:24.953671 kubelet[3058]: E0307 01:31:24.953649 3058 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 01:31:24.953793 kubelet[3058]: I0307 01:31:24.953783 3058 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Mar 7 01:31:24.956612 kubelet[3058]: I0307 01:31:24.956587 3058 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 7 01:31:24.956797 kubelet[3058]: I0307 01:31:24.956772 3058 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 01:31:24.956939 kubelet[3058]: I0307 01:31:24.956797 3058 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081.3.6-n-3151c5d0e2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 01:31:24.957013 kubelet[3058]: I0307 01:31:24.956944 3058 topology_manager.go:143] "Creating topology manager with none policy" Mar 7 
01:31:24.957013 kubelet[3058]: I0307 01:31:24.956951 3058 container_manager_linux.go:308] "Creating device plugin manager" Mar 7 01:31:24.957013 kubelet[3058]: I0307 01:31:24.956971 3058 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Mar 7 01:31:24.957157 kubelet[3058]: I0307 01:31:24.957145 3058 state_mem.go:41] "Initialized" logger="CPUManager state memory" Mar 7 01:31:24.957341 kubelet[3058]: I0307 01:31:24.957330 3058 kubelet.go:482] "Attempting to sync node with API server" Mar 7 01:31:24.957371 kubelet[3058]: I0307 01:31:24.957351 3058 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 01:31:24.957371 kubelet[3058]: I0307 01:31:24.957368 3058 kubelet.go:394] "Adding apiserver pod source" Mar 7 01:31:24.959326 kubelet[3058]: I0307 01:31:24.957376 3058 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 01:31:24.959907 kubelet[3058]: I0307 01:31:24.959887 3058 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 01:31:24.960922 kubelet[3058]: I0307 01:31:24.960901 3058 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 01:31:24.960984 kubelet[3058]: I0307 01:31:24.960944 3058 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 7 01:31:24.965028 kubelet[3058]: I0307 01:31:24.965008 3058 server.go:1257] "Started kubelet" Mar 7 01:31:24.978920 kubelet[3058]: I0307 01:31:24.977394 3058 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 01:31:24.978920 kubelet[3058]: I0307 01:31:24.978344 3058 server.go:317] "Adding debug handlers to kubelet server" Mar 7 01:31:24.981103 kubelet[3058]: I0307 01:31:24.980965 3058 ratelimit.go:56] "Setting rate limiting for 
endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 01:31:24.981103 kubelet[3058]: I0307 01:31:24.981036 3058 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 7 01:31:24.991475 kubelet[3058]: I0307 01:31:24.987310 3058 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Mar 7 01:31:24.991475 kubelet[3058]: I0307 01:31:24.987729 3058 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 01:31:24.993267 kubelet[3058]: I0307 01:31:24.992371 3058 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 01:31:24.998551 kubelet[3058]: I0307 01:31:24.997299 3058 volume_manager.go:311] "Starting Kubelet Volume Manager" Mar 7 01:31:24.998860 kubelet[3058]: I0307 01:31:24.998841 3058 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 7 01:31:24.999049 kubelet[3058]: I0307 01:31:24.999036 3058 reconciler.go:29] "Reconciler: start to sync state" Mar 7 01:31:25.004791 kubelet[3058]: E0307 01:31:25.004756 3058 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-3151c5d0e2\" not found" Mar 7 01:31:25.010351 kubelet[3058]: I0307 01:31:25.010308 3058 factory.go:223] Registration of the systemd container factory successfully Mar 7 01:31:25.010464 kubelet[3058]: I0307 01:31:25.010406 3058 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 01:31:25.018838 kubelet[3058]: I0307 01:31:25.018801 3058 factory.go:223] Registration of the containerd container factory successfully Mar 7 01:31:25.028572 kubelet[3058]: I0307 01:31:25.027701 3058 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Mar 7 01:31:25.028572 kubelet[3058]: I0307 01:31:25.028502 3058 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 7 01:31:25.028572 kubelet[3058]: I0307 01:31:25.028518 3058 status_manager.go:249] "Starting to sync pod status with apiserver" Mar 7 01:31:25.028572 kubelet[3058]: I0307 01:31:25.028538 3058 kubelet.go:2501] "Starting kubelet main sync loop" Mar 7 01:31:25.028748 kubelet[3058]: E0307 01:31:25.028630 3058 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.080866 3058 cpu_manager.go:225] "Starting" policy="none" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.080882 3058 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.080905 3058 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.081025 3058 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet="" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.081035 3058 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={} Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.081063 3058 policy_none.go:50] "Start" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.081072 3058 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.081081 3058 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.081174 3058 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 7 01:31:25.081568 kubelet[3058]: I0307 01:31:25.081187 3058 policy_none.go:44] 
"Start" Mar 7 01:31:25.086570 kubelet[3058]: E0307 01:31:25.086304 3058 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 01:31:25.086570 kubelet[3058]: I0307 01:31:25.086454 3058 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 7 01:31:25.086570 kubelet[3058]: I0307 01:31:25.086464 3058 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 01:31:25.087262 kubelet[3058]: I0307 01:31:25.087242 3058 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 7 01:31:25.087781 kubelet[3058]: E0307 01:31:25.087754 3058 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 01:31:25.129213 kubelet[3058]: I0307 01:31:25.129181 3058 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.130420 kubelet[3058]: I0307 01:31:25.129192 3058 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.130420 kubelet[3058]: I0307 01:31:25.130246 3058 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.143736 kubelet[3058]: I0307 01:31:25.143611 3058 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:25.143736 kubelet[3058]: E0307 01:31:25.143669 3058 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081.3.6-n-3151c5d0e2\" already exists" pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.144073 kubelet[3058]: I0307 01:31:25.144055 3058 warnings.go:107] "Warning: metadata.name: this is used in the Pod's 
hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:25.144111 kubelet[3058]: E0307 01:31:25.144093 3058 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" already exists" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.144527 kubelet[3058]: I0307 01:31:25.144502 3058 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:25.144595 kubelet[3058]: E0307 01:31:25.144571 3058 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081.3.6-n-3151c5d0e2\" already exists" pod="kube-system/kube-scheduler-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.188796 kubelet[3058]: I0307 01:31:25.188684 3058 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199722 kubelet[3058]: I0307 01:31:25.199523 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a6bcaa92c9c313b14bbb76d46218cdb0-k8s-certs\") pod \"kube-apiserver-ci-4081.3.6-n-3151c5d0e2\" (UID: \"a6bcaa92c9c313b14bbb76d46218cdb0\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199722 kubelet[3058]: I0307 01:31:25.199569 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-ca-certs\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199722 kubelet[3058]: I0307 01:31:25.199590 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-flexvolume-dir\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199722 kubelet[3058]: I0307 01:31:25.199604 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-kubeconfig\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199722 kubelet[3058]: I0307 01:31:25.199619 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199939 kubelet[3058]: I0307 01:31:25.199634 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/522f6aae96647f1a7eee523613005ced-kubeconfig\") pod \"kube-scheduler-ci-4081.3.6-n-3151c5d0e2\" (UID: \"522f6aae96647f1a7eee523613005ced\") " pod="kube-system/kube-scheduler-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199939 kubelet[3058]: I0307 01:31:25.199649 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a6bcaa92c9c313b14bbb76d46218cdb0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081.3.6-n-3151c5d0e2\" (UID: \"a6bcaa92c9c313b14bbb76d46218cdb0\") " 
pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199939 kubelet[3058]: I0307 01:31:25.199669 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/43737e925deaad2f2c24321e567967ad-k8s-certs\") pod \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" (UID: \"43737e925deaad2f2c24321e567967ad\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.199939 kubelet[3058]: I0307 01:31:25.199690 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a6bcaa92c9c313b14bbb76d46218cdb0-ca-certs\") pod \"kube-apiserver-ci-4081.3.6-n-3151c5d0e2\" (UID: \"a6bcaa92c9c313b14bbb76d46218cdb0\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.204089 kubelet[3058]: I0307 01:31:25.203536 3058 kubelet_node_status.go:123] "Node was previously registered" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.204089 kubelet[3058]: I0307 01:31:25.203627 3058 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:25.958026 kubelet[3058]: I0307 01:31:25.957984 3058 apiserver.go:52] "Watching apiserver" Mar 7 01:31:25.999308 kubelet[3058]: I0307 01:31:25.999267 3058 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 7 01:31:26.057570 kubelet[3058]: I0307 01:31:26.055621 3058 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:26.057570 kubelet[3058]: I0307 01:31:26.055859 3058 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:26.057570 kubelet[3058]: I0307 01:31:26.056054 3058 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:26.074540 kubelet[3058]: I0307 01:31:26.074514 3058 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:26.074769 kubelet[3058]: E0307 01:31:26.074752 3058 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081.3.6-n-3151c5d0e2\" already exists" pod="kube-system/kube-scheduler-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:26.076107 kubelet[3058]: I0307 01:31:26.076007 3058 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:26.078566 kubelet[3058]: I0307 01:31:26.076327 3058 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 01:31:26.078566 kubelet[3058]: E0307 01:31:26.076550 3058 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081.3.6-n-3151c5d0e2\" already exists" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:26.078713 kubelet[3058]: E0307 01:31:26.076441 3058 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081.3.6-n-3151c5d0e2\" already exists" pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" Mar 7 01:31:26.092620 kubelet[3058]: I0307 01:31:26.092566 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081.3.6-n-3151c5d0e2" podStartSLOduration=3.09252655 podStartE2EDuration="3.09252655s" podCreationTimestamp="2026-03-07 01:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:31:26.09233883 +0000 UTC 
m=+1.196500632" watchObservedRunningTime="2026-03-07 01:31:26.09252655 +0000 UTC m=+1.196688352" Mar 7 01:31:26.115771 kubelet[3058]: I0307 01:31:26.115080 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-3151c5d0e2" podStartSLOduration=3.115065329 podStartE2EDuration="3.115065329s" podCreationTimestamp="2026-03-07 01:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:31:26.114846329 +0000 UTC m=+1.219008131" watchObservedRunningTime="2026-03-07 01:31:26.115065329 +0000 UTC m=+1.219227171" Mar 7 01:31:26.115943 kubelet[3058]: I0307 01:31:26.115845 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081.3.6-n-3151c5d0e2" podStartSLOduration=3.115835968 podStartE2EDuration="3.115835968s" podCreationTimestamp="2026-03-07 01:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:31:26.103774579 +0000 UTC m=+1.207936381" watchObservedRunningTime="2026-03-07 01:31:26.115835968 +0000 UTC m=+1.219997810" Mar 7 01:31:30.357420 kubelet[3058]: I0307 01:31:30.357389 3058 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 7 01:31:30.357766 containerd[1718]: time="2026-03-07T01:31:30.357682761Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 7 01:31:30.357997 kubelet[3058]: I0307 01:31:30.357969 3058 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 7 01:31:31.187624 update_engine[1698]: I20260307 01:31:31.187574 1698 update_attempter.cc:509] Updating boot flags... 
Mar 7 01:31:31.243652 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (3115) Mar 7 01:31:31.507486 systemd[1]: Created slice kubepods-besteffort-pod6ae2a3c9_5090_463c_8d8b_040ed4d91efa.slice - libcontainer container kubepods-besteffort-pod6ae2a3c9_5090_463c_8d8b_040ed4d91efa.slice. Mar 7 01:31:31.538783 kubelet[3058]: I0307 01:31:31.538701 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ae2a3c9-5090-463c-8d8b-040ed4d91efa-lib-modules\") pod \"kube-proxy-ht9qz\" (UID: \"6ae2a3c9-5090-463c-8d8b-040ed4d91efa\") " pod="kube-system/kube-proxy-ht9qz" Mar 7 01:31:31.538783 kubelet[3058]: I0307 01:31:31.538743 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6ae2a3c9-5090-463c-8d8b-040ed4d91efa-kube-proxy\") pod \"kube-proxy-ht9qz\" (UID: \"6ae2a3c9-5090-463c-8d8b-040ed4d91efa\") " pod="kube-system/kube-proxy-ht9qz" Mar 7 01:31:31.538783 kubelet[3058]: I0307 01:31:31.538761 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6ae2a3c9-5090-463c-8d8b-040ed4d91efa-xtables-lock\") pod \"kube-proxy-ht9qz\" (UID: \"6ae2a3c9-5090-463c-8d8b-040ed4d91efa\") " pod="kube-system/kube-proxy-ht9qz" Mar 7 01:31:31.539216 kubelet[3058]: I0307 01:31:31.538793 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7blp\" (UniqueName: \"kubernetes.io/projected/6ae2a3c9-5090-463c-8d8b-040ed4d91efa-kube-api-access-w7blp\") pod \"kube-proxy-ht9qz\" (UID: \"6ae2a3c9-5090-463c-8d8b-040ed4d91efa\") " pod="kube-system/kube-proxy-ht9qz" Mar 7 01:31:31.616685 systemd[1]: Created slice kubepods-besteffort-podf0e24467_5f0f_4fda_a77d_9fd52015db11.slice - libcontainer container 
kubepods-besteffort-podf0e24467_5f0f_4fda_a77d_9fd52015db11.slice. Mar 7 01:31:31.640131 kubelet[3058]: I0307 01:31:31.640089 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f0e24467-5f0f-4fda-a77d-9fd52015db11-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-kc4v9\" (UID: \"f0e24467-5f0f-4fda-a77d-9fd52015db11\") " pod="tigera-operator/tigera-operator-6cf4cccc57-kc4v9" Mar 7 01:31:31.640264 kubelet[3058]: I0307 01:31:31.640145 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsr2t\" (UniqueName: \"kubernetes.io/projected/f0e24467-5f0f-4fda-a77d-9fd52015db11-kube-api-access-nsr2t\") pod \"tigera-operator-6cf4cccc57-kc4v9\" (UID: \"f0e24467-5f0f-4fda-a77d-9fd52015db11\") " pod="tigera-operator/tigera-operator-6cf4cccc57-kc4v9" Mar 7 01:31:31.824663 containerd[1718]: time="2026-03-07T01:31:31.824569442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ht9qz,Uid:6ae2a3c9-5090-463c-8d8b-040ed4d91efa,Namespace:kube-system,Attempt:0,}" Mar 7 01:31:31.859784 containerd[1718]: time="2026-03-07T01:31:31.859694843Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:31:31.859784 containerd[1718]: time="2026-03-07T01:31:31.859747203Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:31:31.859784 containerd[1718]: time="2026-03-07T01:31:31.859762323Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:31.860029 containerd[1718]: time="2026-03-07T01:31:31.859831043Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:31.878787 systemd[1]: Started cri-containerd-67338c28387f42f92801f01bb1cb8442954e677909e65bbb394ca4bd2ff0a24d.scope - libcontainer container 67338c28387f42f92801f01bb1cb8442954e677909e65bbb394ca4bd2ff0a24d. Mar 7 01:31:31.896663 containerd[1718]: time="2026-03-07T01:31:31.896489363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ht9qz,Uid:6ae2a3c9-5090-463c-8d8b-040ed4d91efa,Namespace:kube-system,Attempt:0,} returns sandbox id \"67338c28387f42f92801f01bb1cb8442954e677909e65bbb394ca4bd2ff0a24d\"" Mar 7 01:31:31.905681 containerd[1718]: time="2026-03-07T01:31:31.905564753Z" level=info msg="CreateContainer within sandbox \"67338c28387f42f92801f01bb1cb8442954e677909e65bbb394ca4bd2ff0a24d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 7 01:31:31.926347 containerd[1718]: time="2026-03-07T01:31:31.926300570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-kc4v9,Uid:f0e24467-5f0f-4fda-a77d-9fd52015db11,Namespace:tigera-operator,Attempt:0,}" Mar 7 01:31:31.939199 containerd[1718]: time="2026-03-07T01:31:31.939148876Z" level=info msg="CreateContainer within sandbox \"67338c28387f42f92801f01bb1cb8442954e677909e65bbb394ca4bd2ff0a24d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9f592d576251d149d703a616a7a7724f08ad8947e01f2d434bc6ce252b6ee63f\"" Mar 7 01:31:31.940193 containerd[1718]: time="2026-03-07T01:31:31.940116554Z" level=info msg="StartContainer for \"9f592d576251d149d703a616a7a7724f08ad8947e01f2d434bc6ce252b6ee63f\"" Mar 7 01:31:31.965125 systemd[1]: Started cri-containerd-9f592d576251d149d703a616a7a7724f08ad8947e01f2d434bc6ce252b6ee63f.scope - libcontainer container 9f592d576251d149d703a616a7a7724f08ad8947e01f2d434bc6ce252b6ee63f. Mar 7 01:31:31.981913 containerd[1718]: time="2026-03-07T01:31:31.981728029Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:31:31.981913 containerd[1718]: time="2026-03-07T01:31:31.981782069Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:31:31.981913 containerd[1718]: time="2026-03-07T01:31:31.981798548Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:31.982498 containerd[1718]: time="2026-03-07T01:31:31.982239708Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:32.000467 containerd[1718]: time="2026-03-07T01:31:32.000216408Z" level=info msg="StartContainer for \"9f592d576251d149d703a616a7a7724f08ad8947e01f2d434bc6ce252b6ee63f\" returns successfully" Mar 7 01:31:32.008760 systemd[1]: Started cri-containerd-176ae5ac74dcda6f5e32a01f93dad89af01fb6dca87323ecab268146a39ef021.scope - libcontainer container 176ae5ac74dcda6f5e32a01f93dad89af01fb6dca87323ecab268146a39ef021. Mar 7 01:31:32.044194 containerd[1718]: time="2026-03-07T01:31:32.043759040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-kc4v9,Uid:f0e24467-5f0f-4fda-a77d-9fd52015db11,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"176ae5ac74dcda6f5e32a01f93dad89af01fb6dca87323ecab268146a39ef021\"" Mar 7 01:31:32.046673 containerd[1718]: time="2026-03-07T01:31:32.046479717Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 7 01:31:32.843110 kernel: hv_balloon: Max. dynamic memory size: 4096 MB Mar 7 01:31:33.870799 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3223228370.mount: Deactivated successfully. 
Mar 7 01:31:34.304727 containerd[1718]: time="2026-03-07T01:31:34.303934169Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:34.306637 containerd[1718]: time="2026-03-07T01:31:34.306604327Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 7 01:31:34.310147 containerd[1718]: time="2026-03-07T01:31:34.309805284Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:34.314080 containerd[1718]: time="2026-03-07T01:31:34.314040920Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:34.314874 containerd[1718]: time="2026-03-07T01:31:34.314842160Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.268232283s" Mar 7 01:31:34.314874 containerd[1718]: time="2026-03-07T01:31:34.314873880Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 7 01:31:34.322831 containerd[1718]: time="2026-03-07T01:31:34.322796713Z" level=info msg="CreateContainer within sandbox \"176ae5ac74dcda6f5e32a01f93dad89af01fb6dca87323ecab268146a39ef021\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 7 01:31:34.345912 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1019095728.mount: Deactivated successfully. 
Mar 7 01:31:34.352930 containerd[1718]: time="2026-03-07T01:31:34.352884448Z" level=info msg="CreateContainer within sandbox \"176ae5ac74dcda6f5e32a01f93dad89af01fb6dca87323ecab268146a39ef021\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"881cfa75ac3ad57fb79eeb028a5e7c33b48263587ce06402a7a17b18320d4609\"" Mar 7 01:31:34.353717 containerd[1718]: time="2026-03-07T01:31:34.353700008Z" level=info msg="StartContainer for \"881cfa75ac3ad57fb79eeb028a5e7c33b48263587ce06402a7a17b18320d4609\"" Mar 7 01:31:34.377759 systemd[1]: Started cri-containerd-881cfa75ac3ad57fb79eeb028a5e7c33b48263587ce06402a7a17b18320d4609.scope - libcontainer container 881cfa75ac3ad57fb79eeb028a5e7c33b48263587ce06402a7a17b18320d4609. Mar 7 01:31:34.410657 containerd[1718]: time="2026-03-07T01:31:34.410609320Z" level=info msg="StartContainer for \"881cfa75ac3ad57fb79eeb028a5e7c33b48263587ce06402a7a17b18320d4609\" returns successfully" Mar 7 01:31:34.725120 kubelet[3058]: I0307 01:31:34.724780 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-ht9qz" podStartSLOduration=3.7247662999999998 podStartE2EDuration="3.7247663s" podCreationTimestamp="2026-03-07 01:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:31:32.083805356 +0000 UTC m=+7.187967158" watchObservedRunningTime="2026-03-07 01:31:34.7247663 +0000 UTC m=+9.828928062" Mar 7 01:31:35.096737 kubelet[3058]: I0307 01:31:35.096603 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-kc4v9" podStartSLOduration=1.826256712 podStartE2EDuration="4.096591073s" podCreationTimestamp="2026-03-07 01:31:31 +0000 UTC" firstStartedPulling="2026-03-07 01:31:32.045675158 +0000 UTC m=+7.149836960" lastFinishedPulling="2026-03-07 01:31:34.316009519 +0000 UTC m=+9.420171321" observedRunningTime="2026-03-07 
01:31:35.096411553 +0000 UTC m=+10.200573355" watchObservedRunningTime="2026-03-07 01:31:35.096591073 +0000 UTC m=+10.200752875" Mar 7 01:31:40.177734 sudo[2176]: pam_unix(sudo:session): session closed for user root Mar 7 01:31:40.257637 sshd[2173]: pam_unix(sshd:session): session closed for user core Mar 7 01:31:40.262007 systemd[1]: sshd@6-10.200.20.32:22-10.200.16.10:36604.service: Deactivated successfully. Mar 7 01:31:40.266620 systemd[1]: session-9.scope: Deactivated successfully. Mar 7 01:31:40.270703 systemd[1]: session-9.scope: Consumed 4.107s CPU time, 152.1M memory peak, 0B memory swap peak. Mar 7 01:31:40.273085 systemd-logind[1680]: Session 9 logged out. Waiting for processes to exit. Mar 7 01:31:40.274394 systemd-logind[1680]: Removed session 9. Mar 7 01:31:46.208498 systemd[1]: Created slice kubepods-besteffort-pod487e61d8_e58d_4c65_8fd2_5e9593e28acb.slice - libcontainer container kubepods-besteffort-pod487e61d8_e58d_4c65_8fd2_5e9593e28acb.slice. Mar 7 01:31:46.227773 kubelet[3058]: I0307 01:31:46.227639 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/487e61d8-e58d-4c65-8fd2-5e9593e28acb-tigera-ca-bundle\") pod \"calico-typha-6bf5d6d795-sdvbq\" (UID: \"487e61d8-e58d-4c65-8fd2-5e9593e28acb\") " pod="calico-system/calico-typha-6bf5d6d795-sdvbq" Mar 7 01:31:46.227773 kubelet[3058]: I0307 01:31:46.227682 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/487e61d8-e58d-4c65-8fd2-5e9593e28acb-typha-certs\") pod \"calico-typha-6bf5d6d795-sdvbq\" (UID: \"487e61d8-e58d-4c65-8fd2-5e9593e28acb\") " pod="calico-system/calico-typha-6bf5d6d795-sdvbq" Mar 7 01:31:46.227773 kubelet[3058]: I0307 01:31:46.227707 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmdjq\" (UniqueName: 
\"kubernetes.io/projected/487e61d8-e58d-4c65-8fd2-5e9593e28acb-kube-api-access-nmdjq\") pod \"calico-typha-6bf5d6d795-sdvbq\" (UID: \"487e61d8-e58d-4c65-8fd2-5e9593e28acb\") " pod="calico-system/calico-typha-6bf5d6d795-sdvbq" Mar 7 01:31:46.308270 systemd[1]: Created slice kubepods-besteffort-podb1e2175f_a7e7_4709_805f_e001591cecf5.slice - libcontainer container kubepods-besteffort-podb1e2175f_a7e7_4709_805f_e001591cecf5.slice. Mar 7 01:31:46.328673 kubelet[3058]: I0307 01:31:46.328290 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b1e2175f-a7e7-4709-805f-e001591cecf5-node-certs\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328673 kubelet[3058]: I0307 01:31:46.328333 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-sys-fs\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328673 kubelet[3058]: I0307 01:31:46.328360 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-lib-modules\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328673 kubelet[3058]: I0307 01:31:46.328375 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-var-run-calico\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328673 kubelet[3058]: I0307 
01:31:46.328390 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-xtables-lock\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328896 kubelet[3058]: I0307 01:31:46.328407 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-cni-net-dir\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328896 kubelet[3058]: I0307 01:31:46.328429 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-policysync\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328896 kubelet[3058]: I0307 01:31:46.328444 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e2175f-a7e7-4709-805f-e001591cecf5-tigera-ca-bundle\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328896 kubelet[3058]: I0307 01:31:46.328459 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-bpffs\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.328896 kubelet[3058]: I0307 01:31:46.328476 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-flexvol-driver-host\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.329003 kubelet[3058]: I0307 01:31:46.328513 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-var-lib-calico\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.329003 kubelet[3058]: I0307 01:31:46.328540 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-cni-bin-dir\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.329003 kubelet[3058]: I0307 01:31:46.328570 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-cni-log-dir\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.330380 kubelet[3058]: I0307 01:31:46.329516 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/b1e2175f-a7e7-4709-805f-e001591cecf5-nodeproc\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.330380 kubelet[3058]: I0307 01:31:46.329583 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9h7d\" (UniqueName: 
\"kubernetes.io/projected/b1e2175f-a7e7-4709-805f-e001591cecf5-kube-api-access-q9h7d\") pod \"calico-node-j7g6q\" (UID: \"b1e2175f-a7e7-4709-805f-e001591cecf5\") " pod="calico-system/calico-node-j7g6q" Mar 7 01:31:46.416346 kubelet[3058]: E0307 01:31:46.416209 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280" Mar 7 01:31:46.433369 kubelet[3058]: E0307 01:31:46.433285 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.433369 kubelet[3058]: W0307 01:31:46.433308 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.433369 kubelet[3058]: E0307 01:31:46.433328 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.433720 kubelet[3058]: E0307 01:31:46.433623 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.433720 kubelet[3058]: W0307 01:31:46.433643 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.433720 kubelet[3058]: E0307 01:31:46.433654 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.434884 kubelet[3058]: E0307 01:31:46.434863 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.434884 kubelet[3058]: W0307 01:31:46.434878 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.435129 kubelet[3058]: E0307 01:31:46.434897 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.435129 kubelet[3058]: E0307 01:31:46.435096 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.435129 kubelet[3058]: W0307 01:31:46.435104 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.435129 kubelet[3058]: E0307 01:31:46.435112 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.443403 kubelet[3058]: E0307 01:31:46.441766 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.443403 kubelet[3058]: W0307 01:31:46.441788 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.443403 kubelet[3058]: E0307 01:31:46.441809 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.443669 kubelet[3058]: E0307 01:31:46.443634 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.443669 kubelet[3058]: W0307 01:31:46.443652 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.443919 kubelet[3058]: E0307 01:31:46.443670 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.446167 kubelet[3058]: E0307 01:31:46.446132 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.446167 kubelet[3058]: W0307 01:31:46.446153 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.446167 kubelet[3058]: E0307 01:31:46.446170 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.447004 kubelet[3058]: E0307 01:31:46.446902 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.447004 kubelet[3058]: W0307 01:31:46.446917 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.447004 kubelet[3058]: E0307 01:31:46.446932 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.447340 kubelet[3058]: E0307 01:31:46.447321 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.447340 kubelet[3058]: W0307 01:31:46.447336 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.447521 kubelet[3058]: E0307 01:31:46.447347 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.447990 kubelet[3058]: E0307 01:31:46.447883 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.447990 kubelet[3058]: W0307 01:31:46.447988 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.448099 kubelet[3058]: E0307 01:31:46.448004 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.448683 kubelet[3058]: E0307 01:31:46.448657 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.448683 kubelet[3058]: W0307 01:31:46.448679 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.448782 kubelet[3058]: E0307 01:31:46.448691 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.449288 kubelet[3058]: E0307 01:31:46.449268 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.449288 kubelet[3058]: W0307 01:31:46.449283 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.449388 kubelet[3058]: E0307 01:31:46.449295 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.449865 kubelet[3058]: E0307 01:31:46.449844 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.449865 kubelet[3058]: W0307 01:31:46.449858 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.449957 kubelet[3058]: E0307 01:31:46.449870 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.464907 kubelet[3058]: E0307 01:31:46.464762 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.464907 kubelet[3058]: W0307 01:31:46.464783 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.464907 kubelet[3058]: E0307 01:31:46.464803 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.512574 kubelet[3058]: E0307 01:31:46.512408 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.512574 kubelet[3058]: W0307 01:31:46.512431 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.512574 kubelet[3058]: E0307 01:31:46.512481 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.512801 kubelet[3058]: E0307 01:31:46.512672 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.512801 kubelet[3058]: W0307 01:31:46.512681 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.512801 kubelet[3058]: E0307 01:31:46.512694 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.512977 kubelet[3058]: E0307 01:31:46.512820 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.512977 kubelet[3058]: W0307 01:31:46.512828 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.512977 kubelet[3058]: E0307 01:31:46.512835 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.513105 kubelet[3058]: E0307 01:31:46.513093 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.513105 kubelet[3058]: W0307 01:31:46.513103 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.513181 kubelet[3058]: E0307 01:31:46.513111 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.513286 kubelet[3058]: E0307 01:31:46.513275 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.513286 kubelet[3058]: W0307 01:31:46.513285 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.513423 kubelet[3058]: E0307 01:31:46.513294 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.513423 kubelet[3058]: E0307 01:31:46.513415 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.513423 kubelet[3058]: W0307 01:31:46.513421 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.513526 kubelet[3058]: E0307 01:31:46.513428 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.513570 kubelet[3058]: E0307 01:31:46.513540 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.513570 kubelet[3058]: W0307 01:31:46.513561 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.513570 kubelet[3058]: E0307 01:31:46.513569 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.513808 kubelet[3058]: E0307 01:31:46.513750 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.513808 kubelet[3058]: W0307 01:31:46.513759 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.513808 kubelet[3058]: E0307 01:31:46.513768 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.513926 kubelet[3058]: E0307 01:31:46.513915 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.513926 kubelet[3058]: W0307 01:31:46.513925 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.513997 kubelet[3058]: E0307 01:31:46.513934 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.514057 kubelet[3058]: E0307 01:31:46.514048 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.514057 kubelet[3058]: W0307 01:31:46.514056 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.514109 kubelet[3058]: E0307 01:31:46.514063 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.514185 kubelet[3058]: E0307 01:31:46.514175 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.514185 kubelet[3058]: W0307 01:31:46.514183 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.514256 kubelet[3058]: E0307 01:31:46.514191 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.514318 kubelet[3058]: E0307 01:31:46.514307 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.514318 kubelet[3058]: W0307 01:31:46.514316 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.514368 kubelet[3058]: E0307 01:31:46.514324 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.514523 kubelet[3058]: E0307 01:31:46.514511 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.514523 kubelet[3058]: W0307 01:31:46.514521 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.516938 kubelet[3058]: E0307 01:31:46.514530 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.516938 kubelet[3058]: E0307 01:31:46.514718 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.516938 kubelet[3058]: W0307 01:31:46.514725 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.516938 kubelet[3058]: E0307 01:31:46.514732 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.516938 kubelet[3058]: E0307 01:31:46.514854 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.516938 kubelet[3058]: W0307 01:31:46.514862 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.516938 kubelet[3058]: E0307 01:31:46.514869 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.516938 kubelet[3058]: E0307 01:31:46.514993 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.516938 kubelet[3058]: W0307 01:31:46.515000 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.516938 kubelet[3058]: E0307 01:31:46.515007 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.517138 kubelet[3058]: E0307 01:31:46.515145 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.517138 kubelet[3058]: W0307 01:31:46.515153 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.517138 kubelet[3058]: E0307 01:31:46.515160 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.517138 kubelet[3058]: E0307 01:31:46.515281 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.517138 kubelet[3058]: W0307 01:31:46.515290 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.517138 kubelet[3058]: E0307 01:31:46.515297 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.517138 kubelet[3058]: E0307 01:31:46.515422 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.517138 kubelet[3058]: W0307 01:31:46.515429 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.517138 kubelet[3058]: E0307 01:31:46.515436 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.517138 kubelet[3058]: E0307 01:31:46.515603 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.517333 kubelet[3058]: W0307 01:31:46.515613 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.517333 kubelet[3058]: E0307 01:31:46.515622 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.519100 containerd[1718]: time="2026-03-07T01:31:46.518734160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6bf5d6d795-sdvbq,Uid:487e61d8-e58d-4c65-8fd2-5e9593e28acb,Namespace:calico-system,Attempt:0,}" Mar 7 01:31:46.532046 kubelet[3058]: E0307 01:31:46.532019 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.533324 kubelet[3058]: W0307 01:31:46.532128 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.533324 kubelet[3058]: E0307 01:31:46.532181 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.533324 kubelet[3058]: I0307 01:31:46.532218 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/959c57a8-5ea2-429c-aa5b-3c7b113e7280-kubelet-dir\") pod \"csi-node-driver-d5j7z\" (UID: \"959c57a8-5ea2-429c-aa5b-3c7b113e7280\") " pod="calico-system/csi-node-driver-d5j7z" Mar 7 01:31:46.533324 kubelet[3058]: E0307 01:31:46.532380 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.533324 kubelet[3058]: W0307 01:31:46.532389 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.533324 kubelet[3058]: E0307 01:31:46.532397 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.533324 kubelet[3058]: I0307 01:31:46.532417 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvhvt\" (UniqueName: \"kubernetes.io/projected/959c57a8-5ea2-429c-aa5b-3c7b113e7280-kube-api-access-hvhvt\") pod \"csi-node-driver-d5j7z\" (UID: \"959c57a8-5ea2-429c-aa5b-3c7b113e7280\") " pod="calico-system/csi-node-driver-d5j7z" Mar 7 01:31:46.533324 kubelet[3058]: E0307 01:31:46.532585 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.533514 kubelet[3058]: W0307 01:31:46.532595 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.533514 kubelet[3058]: E0307 01:31:46.532606 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.533514 kubelet[3058]: E0307 01:31:46.532754 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.533514 kubelet[3058]: W0307 01:31:46.532762 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.533514 kubelet[3058]: E0307 01:31:46.532771 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.533514 kubelet[3058]: E0307 01:31:46.532924 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.533514 kubelet[3058]: W0307 01:31:46.532931 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.533514 kubelet[3058]: E0307 01:31:46.532939 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.533514 kubelet[3058]: I0307 01:31:46.532958 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/959c57a8-5ea2-429c-aa5b-3c7b113e7280-varrun\") pod \"csi-node-driver-d5j7z\" (UID: \"959c57a8-5ea2-429c-aa5b-3c7b113e7280\") " pod="calico-system/csi-node-driver-d5j7z" Mar 7 01:31:46.533719 kubelet[3058]: E0307 01:31:46.533127 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.533719 kubelet[3058]: W0307 01:31:46.533139 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.533719 kubelet[3058]: E0307 01:31:46.533153 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.533956 kubelet[3058]: E0307 01:31:46.533830 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.533956 kubelet[3058]: W0307 01:31:46.533844 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.533956 kubelet[3058]: E0307 01:31:46.533856 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.534182 kubelet[3058]: E0307 01:31:46.534035 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.534182 kubelet[3058]: W0307 01:31:46.534045 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.534182 kubelet[3058]: E0307 01:31:46.534057 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.534182 kubelet[3058]: I0307 01:31:46.534078 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/959c57a8-5ea2-429c-aa5b-3c7b113e7280-registration-dir\") pod \"csi-node-driver-d5j7z\" (UID: \"959c57a8-5ea2-429c-aa5b-3c7b113e7280\") " pod="calico-system/csi-node-driver-d5j7z" Mar 7 01:31:46.534270 kubelet[3058]: E0307 01:31:46.534245 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.534270 kubelet[3058]: W0307 01:31:46.534254 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.534270 kubelet[3058]: E0307 01:31:46.534264 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.534476 kubelet[3058]: E0307 01:31:46.534431 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.534476 kubelet[3058]: W0307 01:31:46.534443 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.534476 kubelet[3058]: E0307 01:31:46.534454 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.534737 kubelet[3058]: E0307 01:31:46.534721 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.534737 kubelet[3058]: W0307 01:31:46.534735 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.534794 kubelet[3058]: E0307 01:31:46.534747 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.534794 kubelet[3058]: I0307 01:31:46.534782 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/959c57a8-5ea2-429c-aa5b-3c7b113e7280-socket-dir\") pod \"csi-node-driver-d5j7z\" (UID: \"959c57a8-5ea2-429c-aa5b-3c7b113e7280\") " pod="calico-system/csi-node-driver-d5j7z" Mar 7 01:31:46.534995 kubelet[3058]: E0307 01:31:46.534980 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.534995 kubelet[3058]: W0307 01:31:46.534993 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.535062 kubelet[3058]: E0307 01:31:46.535005 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.535176 kubelet[3058]: E0307 01:31:46.535162 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.535176 kubelet[3058]: W0307 01:31:46.535175 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.535230 kubelet[3058]: E0307 01:31:46.535184 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.535374 kubelet[3058]: E0307 01:31:46.535362 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.535374 kubelet[3058]: W0307 01:31:46.535372 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.535439 kubelet[3058]: E0307 01:31:46.535384 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.535540 kubelet[3058]: E0307 01:31:46.535529 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.535586 kubelet[3058]: W0307 01:31:46.535539 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.535586 kubelet[3058]: E0307 01:31:46.535569 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.555783 containerd[1718]: time="2026-03-07T01:31:46.555676603Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:31:46.555783 containerd[1718]: time="2026-03-07T01:31:46.555728123Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:31:46.555783 containerd[1718]: time="2026-03-07T01:31:46.555738723Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:46.556779 containerd[1718]: time="2026-03-07T01:31:46.555815283Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:46.574870 systemd[1]: Started cri-containerd-6da573b22adaed8f9cb7b7e4f9c7e2f73f05d624a1a11bbbb446f25412fa4698.scope - libcontainer container 6da573b22adaed8f9cb7b7e4f9c7e2f73f05d624a1a11bbbb446f25412fa4698. 
Mar 7 01:31:46.607878 containerd[1718]: time="2026-03-07T01:31:46.607747150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6bf5d6d795-sdvbq,Uid:487e61d8-e58d-4c65-8fd2-5e9593e28acb,Namespace:calico-system,Attempt:0,} returns sandbox id \"6da573b22adaed8f9cb7b7e4f9c7e2f73f05d624a1a11bbbb446f25412fa4698\"" Mar 7 01:31:46.609682 containerd[1718]: time="2026-03-07T01:31:46.609655388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 7 01:31:46.617168 containerd[1718]: time="2026-03-07T01:31:46.617128101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j7g6q,Uid:b1e2175f-a7e7-4709-805f-e001591cecf5,Namespace:calico-system,Attempt:0,}" Mar 7 01:31:46.635570 kubelet[3058]: E0307 01:31:46.635507 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.635570 kubelet[3058]: W0307 01:31:46.635540 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.635570 kubelet[3058]: E0307 01:31:46.635577 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.636513 kubelet[3058]: E0307 01:31:46.635789 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.636513 kubelet[3058]: W0307 01:31:46.635798 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.636513 kubelet[3058]: E0307 01:31:46.635807 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.636513 kubelet[3058]: E0307 01:31:46.635978 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.636513 kubelet[3058]: W0307 01:31:46.635985 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.636513 kubelet[3058]: E0307 01:31:46.636006 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.636513 kubelet[3058]: E0307 01:31:46.636183 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.636513 kubelet[3058]: W0307 01:31:46.636191 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.636513 kubelet[3058]: E0307 01:31:46.636199 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.636513 kubelet[3058]: E0307 01:31:46.636364 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.636784 kubelet[3058]: W0307 01:31:46.636372 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.636784 kubelet[3058]: E0307 01:31:46.636394 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.637307 kubelet[3058]: E0307 01:31:46.637029 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.637307 kubelet[3058]: W0307 01:31:46.637141 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.637307 kubelet[3058]: E0307 01:31:46.637160 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.638043 kubelet[3058]: E0307 01:31:46.637971 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.638043 kubelet[3058]: W0307 01:31:46.637984 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.638043 kubelet[3058]: E0307 01:31:46.638002 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.638301 kubelet[3058]: E0307 01:31:46.638287 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.638301 kubelet[3058]: W0307 01:31:46.638300 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.638374 kubelet[3058]: E0307 01:31:46.638310 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.638539 kubelet[3058]: E0307 01:31:46.638507 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.638539 kubelet[3058]: W0307 01:31:46.638525 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.638539 kubelet[3058]: E0307 01:31:46.638534 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.638838 kubelet[3058]: E0307 01:31:46.638820 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.638838 kubelet[3058]: W0307 01:31:46.638833 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.638909 kubelet[3058]: E0307 01:31:46.638843 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.639054 kubelet[3058]: E0307 01:31:46.639041 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.639054 kubelet[3058]: W0307 01:31:46.639052 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.639455 kubelet[3058]: E0307 01:31:46.639060 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.639805 kubelet[3058]: E0307 01:31:46.639771 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.639805 kubelet[3058]: W0307 01:31:46.639794 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.639805 kubelet[3058]: E0307 01:31:46.639807 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.640531 kubelet[3058]: E0307 01:31:46.640507 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.640531 kubelet[3058]: W0307 01:31:46.640524 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.640647 kubelet[3058]: E0307 01:31:46.640537 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.641790 kubelet[3058]: E0307 01:31:46.641771 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.641790 kubelet[3058]: W0307 01:31:46.641786 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.641944 kubelet[3058]: E0307 01:31:46.641796 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.642024 kubelet[3058]: E0307 01:31:46.642011 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.642024 kubelet[3058]: W0307 01:31:46.642021 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.642085 kubelet[3058]: E0307 01:31:46.642031 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.642271 kubelet[3058]: E0307 01:31:46.642255 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.642271 kubelet[3058]: W0307 01:31:46.642269 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.642422 kubelet[3058]: E0307 01:31:46.642278 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.642501 kubelet[3058]: E0307 01:31:46.642486 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.642501 kubelet[3058]: W0307 01:31:46.642499 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.642673 kubelet[3058]: E0307 01:31:46.642507 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.642760 kubelet[3058]: E0307 01:31:46.642745 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.642760 kubelet[3058]: W0307 01:31:46.642757 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.642820 kubelet[3058]: E0307 01:31:46.642766 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.642980 kubelet[3058]: E0307 01:31:46.642968 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.642980 kubelet[3058]: W0307 01:31:46.642978 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.643042 kubelet[3058]: E0307 01:31:46.642987 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.643174 kubelet[3058]: E0307 01:31:46.643161 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.643174 kubelet[3058]: W0307 01:31:46.643173 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.643235 kubelet[3058]: E0307 01:31:46.643182 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.643421 kubelet[3058]: E0307 01:31:46.643406 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.643421 kubelet[3058]: W0307 01:31:46.643419 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.643498 kubelet[3058]: E0307 01:31:46.643429 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.644027 kubelet[3058]: E0307 01:31:46.644010 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.644027 kubelet[3058]: W0307 01:31:46.644024 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.644125 kubelet[3058]: E0307 01:31:46.644035 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.644264 kubelet[3058]: E0307 01:31:46.644251 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.644264 kubelet[3058]: W0307 01:31:46.644262 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.644323 kubelet[3058]: E0307 01:31:46.644271 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.644606 kubelet[3058]: E0307 01:31:46.644590 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.644606 kubelet[3058]: W0307 01:31:46.644603 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.644682 kubelet[3058]: E0307 01:31:46.644613 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.644991 kubelet[3058]: E0307 01:31:46.644976 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.644991 kubelet[3058]: W0307 01:31:46.644989 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.645052 kubelet[3058]: E0307 01:31:46.644999 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:46.659084 kubelet[3058]: E0307 01:31:46.658636 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:46.659084 kubelet[3058]: W0307 01:31:46.658661 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:46.659084 kubelet[3058]: E0307 01:31:46.658681 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:46.660401 containerd[1718]: time="2026-03-07T01:31:46.660191537Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:31:46.660401 containerd[1718]: time="2026-03-07T01:31:46.660249817Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:31:46.660401 containerd[1718]: time="2026-03-07T01:31:46.660260457Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:46.660401 containerd[1718]: time="2026-03-07T01:31:46.660335377Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:31:46.676753 systemd[1]: Started cri-containerd-afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9.scope - libcontainer container afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9. 
Mar 7 01:31:46.699590 containerd[1718]: time="2026-03-07T01:31:46.699535378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j7g6q,Uid:b1e2175f-a7e7-4709-805f-e001591cecf5,Namespace:calico-system,Attempt:0,} returns sandbox id \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\"" Mar 7 01:31:47.903732 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3670604680.mount: Deactivated successfully. Mar 7 01:31:48.029113 kubelet[3058]: E0307 01:31:48.029070 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280" Mar 7 01:31:48.877958 containerd[1718]: time="2026-03-07T01:31:48.877908737Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:48.880739 containerd[1718]: time="2026-03-07T01:31:48.880696574Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Mar 7 01:31:48.884566 containerd[1718]: time="2026-03-07T01:31:48.884021691Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:48.889056 containerd[1718]: time="2026-03-07T01:31:48.889017486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:48.889902 containerd[1718]: time="2026-03-07T01:31:48.889872485Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id 
\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 2.280043937s" Mar 7 01:31:48.889982 containerd[1718]: time="2026-03-07T01:31:48.889903005Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Mar 7 01:31:48.891954 containerd[1718]: time="2026-03-07T01:31:48.891920523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 7 01:31:48.909749 containerd[1718]: time="2026-03-07T01:31:48.909625065Z" level=info msg="CreateContainer within sandbox \"6da573b22adaed8f9cb7b7e4f9c7e2f73f05d624a1a11bbbb446f25412fa4698\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 7 01:31:48.959147 containerd[1718]: time="2026-03-07T01:31:48.958995655Z" level=info msg="CreateContainer within sandbox \"6da573b22adaed8f9cb7b7e4f9c7e2f73f05d624a1a11bbbb446f25412fa4698\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"151ba0a9c768bb93f035fc92cb92926ec39c7886c42dbcc48edaae49b81d87d1\"" Mar 7 01:31:48.959730 containerd[1718]: time="2026-03-07T01:31:48.959674654Z" level=info msg="StartContainer for \"151ba0a9c768bb93f035fc92cb92926ec39c7886c42dbcc48edaae49b81d87d1\"" Mar 7 01:31:48.985743 systemd[1]: Started cri-containerd-151ba0a9c768bb93f035fc92cb92926ec39c7886c42dbcc48edaae49b81d87d1.scope - libcontainer container 151ba0a9c768bb93f035fc92cb92926ec39c7886c42dbcc48edaae49b81d87d1. 
Mar 7 01:31:49.025822 containerd[1718]: time="2026-03-07T01:31:49.025535788Z" level=info msg="StartContainer for \"151ba0a9c768bb93f035fc92cb92926ec39c7886c42dbcc48edaae49b81d87d1\" returns successfully" Mar 7 01:31:49.133124 kubelet[3058]: E0307 01:31:49.133018 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.133124 kubelet[3058]: W0307 01:31:49.133049 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.133124 kubelet[3058]: E0307 01:31:49.133101 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.133494 kubelet[3058]: E0307 01:31:49.133270 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.133494 kubelet[3058]: W0307 01:31:49.133281 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.133494 kubelet[3058]: E0307 01:31:49.133290 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.133587 kubelet[3058]: E0307 01:31:49.133514 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.133587 kubelet[3058]: W0307 01:31:49.133523 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.133587 kubelet[3058]: E0307 01:31:49.133532 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.134599 kubelet[3058]: E0307 01:31:49.134574 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.134599 kubelet[3058]: W0307 01:31:49.134594 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.134702 kubelet[3058]: E0307 01:31:49.134608 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.134820 kubelet[3058]: E0307 01:31:49.134804 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.134820 kubelet[3058]: W0307 01:31:49.134817 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.134882 kubelet[3058]: E0307 01:31:49.134826 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.134994 kubelet[3058]: E0307 01:31:49.134978 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.134994 kubelet[3058]: W0307 01:31:49.134991 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.135049 kubelet[3058]: E0307 01:31:49.135000 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.135161 kubelet[3058]: E0307 01:31:49.135142 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.135161 kubelet[3058]: W0307 01:31:49.135156 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.135224 kubelet[3058]: E0307 01:31:49.135165 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.135400 kubelet[3058]: E0307 01:31:49.135322 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.135400 kubelet[3058]: W0307 01:31:49.135340 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.135400 kubelet[3058]: E0307 01:31:49.135349 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.135614 kubelet[3058]: E0307 01:31:49.135506 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.135614 kubelet[3058]: W0307 01:31:49.135517 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.135614 kubelet[3058]: E0307 01:31:49.135528 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.135980 kubelet[3058]: E0307 01:31:49.135721 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.135980 kubelet[3058]: W0307 01:31:49.135734 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.135980 kubelet[3058]: E0307 01:31:49.135743 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.135980 kubelet[3058]: E0307 01:31:49.135892 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.135980 kubelet[3058]: W0307 01:31:49.135899 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.135980 kubelet[3058]: E0307 01:31:49.135907 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.136444 kubelet[3058]: E0307 01:31:49.136423 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.136444 kubelet[3058]: W0307 01:31:49.136439 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.136444 kubelet[3058]: E0307 01:31:49.136452 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.136828 kubelet[3058]: E0307 01:31:49.136696 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.136828 kubelet[3058]: W0307 01:31:49.136709 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.136828 kubelet[3058]: E0307 01:31:49.136722 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.137067 kubelet[3058]: E0307 01:31:49.136979 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.137067 kubelet[3058]: W0307 01:31:49.136993 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.137067 kubelet[3058]: E0307 01:31:49.137003 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.137405 kubelet[3058]: E0307 01:31:49.137316 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.137405 kubelet[3058]: W0307 01:31:49.137331 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.137405 kubelet[3058]: E0307 01:31:49.137342 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.154991 kubelet[3058]: E0307 01:31:49.154935 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.154991 kubelet[3058]: W0307 01:31:49.154957 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.155158 kubelet[3058]: E0307 01:31:49.155076 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.155788 kubelet[3058]: E0307 01:31:49.155600 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.155788 kubelet[3058]: W0307 01:31:49.155786 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.156612 kubelet[3058]: E0307 01:31:49.155803 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.156612 kubelet[3058]: E0307 01:31:49.156105 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.156612 kubelet[3058]: W0307 01:31:49.156115 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.156612 kubelet[3058]: E0307 01:31:49.156126 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.156612 kubelet[3058]: E0307 01:31:49.156342 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.156612 kubelet[3058]: W0307 01:31:49.156351 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.156612 kubelet[3058]: E0307 01:31:49.156359 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.156795 kubelet[3058]: E0307 01:31:49.156739 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.156795 kubelet[3058]: W0307 01:31:49.156749 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.156795 kubelet[3058]: E0307 01:31:49.156761 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.157563 kubelet[3058]: E0307 01:31:49.157131 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.157563 kubelet[3058]: W0307 01:31:49.157143 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.157563 kubelet[3058]: E0307 01:31:49.157155 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.158664 kubelet[3058]: E0307 01:31:49.158562 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.158664 kubelet[3058]: W0307 01:31:49.158577 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.158664 kubelet[3058]: E0307 01:31:49.158590 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.160385 kubelet[3058]: E0307 01:31:49.160234 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.160385 kubelet[3058]: W0307 01:31:49.160252 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.160385 kubelet[3058]: E0307 01:31:49.160266 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.160579 kubelet[3058]: E0307 01:31:49.160568 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.160633 kubelet[3058]: W0307 01:31:49.160623 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.160700 kubelet[3058]: E0307 01:31:49.160686 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.161005 kubelet[3058]: E0307 01:31:49.160914 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.161005 kubelet[3058]: W0307 01:31:49.160924 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.161005 kubelet[3058]: E0307 01:31:49.160935 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.161227 kubelet[3058]: E0307 01:31:49.161215 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.161290 kubelet[3058]: W0307 01:31:49.161280 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.161522 kubelet[3058]: E0307 01:31:49.161391 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.162634 kubelet[3058]: E0307 01:31:49.161727 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.162634 kubelet[3058]: W0307 01:31:49.161742 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.162634 kubelet[3058]: E0307 01:31:49.161755 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.163032 kubelet[3058]: E0307 01:31:49.163008 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.163032 kubelet[3058]: W0307 01:31:49.163029 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.163112 kubelet[3058]: E0307 01:31:49.163052 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.166665 kubelet[3058]: E0307 01:31:49.166631 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.166665 kubelet[3058]: W0307 01:31:49.166659 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.166780 kubelet[3058]: E0307 01:31:49.166682 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.167003 kubelet[3058]: E0307 01:31:49.166984 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.167003 kubelet[3058]: W0307 01:31:49.166996 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.167084 kubelet[3058]: E0307 01:31:49.167007 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.167237 kubelet[3058]: E0307 01:31:49.167223 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.167277 kubelet[3058]: W0307 01:31:49.167240 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.167277 kubelet[3058]: E0307 01:31:49.167252 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:49.167526 kubelet[3058]: E0307 01:31:49.167511 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.167526 kubelet[3058]: W0307 01:31:49.167523 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.167626 kubelet[3058]: E0307 01:31:49.167533 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:49.168091 kubelet[3058]: E0307 01:31:49.168074 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:49.168091 kubelet[3058]: W0307 01:31:49.168088 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:49.168170 kubelet[3058]: E0307 01:31:49.168100 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:50.029210 kubelet[3058]: E0307 01:31:50.029166 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280" Mar 7 01:31:50.050741 containerd[1718]: time="2026-03-07T01:31:50.050698554Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:50.054401 containerd[1718]: time="2026-03-07T01:31:50.054369473Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 7 01:31:50.059093 containerd[1718]: time="2026-03-07T01:31:50.058001432Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:50.063045 containerd[1718]: time="2026-03-07T01:31:50.063008191Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:31:50.064566 containerd[1718]: time="2026-03-07T01:31:50.063772191Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.171818828s" Mar 7 01:31:50.064566 containerd[1718]: time="2026-03-07T01:31:50.063804751Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 7 01:31:50.074388 containerd[1718]: time="2026-03-07T01:31:50.074355867Z" level=info msg="CreateContainer within sandbox \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 7 01:31:50.108754 containerd[1718]: time="2026-03-07T01:31:50.108645897Z" level=info msg="CreateContainer within sandbox \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5\"" Mar 7 01:31:50.110484 containerd[1718]: time="2026-03-07T01:31:50.110447817Z" level=info msg="StartContainer for \"33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5\"" Mar 7 01:31:50.123591 kubelet[3058]: I0307 01:31:50.122768 3058 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:31:50.142791 systemd[1]: Started cri-containerd-33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5.scope - libcontainer container 
33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5. Mar 7 01:31:50.144887 kubelet[3058]: E0307 01:31:50.144860 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.144887 kubelet[3058]: W0307 01:31:50.144883 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.145200 kubelet[3058]: E0307 01:31:50.144904 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:50.146096 kubelet[3058]: E0307 01:31:50.146072 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.146096 kubelet[3058]: W0307 01:31:50.146093 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.146221 kubelet[3058]: E0307 01:31:50.146112 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:50.146433 kubelet[3058]: E0307 01:31:50.146417 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.146487 kubelet[3058]: W0307 01:31:50.146434 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.146487 kubelet[3058]: E0307 01:31:50.146445 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:50.146755 kubelet[3058]: E0307 01:31:50.146741 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.146755 kubelet[3058]: W0307 01:31:50.146754 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.146938 kubelet[3058]: E0307 01:31:50.146764 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:50.147156 kubelet[3058]: E0307 01:31:50.147130 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.147156 kubelet[3058]: W0307 01:31:50.147143 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.147156 kubelet[3058]: E0307 01:31:50.147153 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:50.147424 kubelet[3058]: E0307 01:31:50.147407 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.147424 kubelet[3058]: W0307 01:31:50.147420 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.147588 kubelet[3058]: E0307 01:31:50.147430 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:50.147796 kubelet[3058]: E0307 01:31:50.147782 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.147796 kubelet[3058]: W0307 01:31:50.147793 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.147860 kubelet[3058]: E0307 01:31:50.147804 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:50.148091 kubelet[3058]: E0307 01:31:50.148078 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.148091 kubelet[3058]: W0307 01:31:50.148089 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.148246 kubelet[3058]: E0307 01:31:50.148099 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:50.148383 kubelet[3058]: E0307 01:31:50.148369 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.148383 kubelet[3058]: W0307 01:31:50.148381 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.148531 kubelet[3058]: E0307 01:31:50.148392 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:50.149001 kubelet[3058]: E0307 01:31:50.148722 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.149001 kubelet[3058]: W0307 01:31:50.148736 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.149001 kubelet[3058]: E0307 01:31:50.148746 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:31:50.149114 kubelet[3058]: E0307 01:31:50.149078 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.149114 kubelet[3058]: W0307 01:31:50.149088 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.149114 kubelet[3058]: E0307 01:31:50.149107 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:31:50.149289 kubelet[3058]: E0307 01:31:50.149277 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:31:50.149289 kubelet[3058]: W0307 01:31:50.149287 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:31:50.149386 kubelet[3058]: E0307 01:31:50.149296 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 01:31:50.149683 kubelet[3058]: E0307 01:31:50.149456 3058 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:31:50.149683 kubelet[3058]: W0307 01:31:50.149467 3058 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:31:50.149683 kubelet[3058]: E0307 01:31:50.149488 3058 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 7 01:31:50.180856 containerd[1718]: time="2026-03-07T01:31:50.180743395Z" level=info msg="StartContainer for \"33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5\" returns successfully"
Mar 7 01:31:50.189981 systemd[1]: cri-containerd-33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5.scope: Deactivated successfully.
Mar 7 01:31:50.719140 containerd[1718]: time="2026-03-07T01:31:50.718936714Z" level=info msg="shim disconnected" id=33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5 namespace=k8s.io
Mar 7 01:31:50.719140 containerd[1718]: time="2026-03-07T01:31:50.718991114Z" level=warning msg="cleaning up after shim disconnected" id=33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5 namespace=k8s.io
Mar 7 01:31:50.719140 containerd[1718]: time="2026-03-07T01:31:50.718999434Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 01:31:50.895511 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-33cb37ae8b7c4e7eddc922b36d94f60c81cc67a06a4b5e823b10e56b9f5401c5-rootfs.mount: Deactivated successfully.
Mar 7 01:31:51.128418 containerd[1718]: time="2026-03-07T01:31:51.128325151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\""
Mar 7 01:31:51.152503 kubelet[3058]: I0307 01:31:51.152440 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-6bf5d6d795-sdvbq" podStartSLOduration=2.8708887279999997 podStartE2EDuration="5.152425263s" podCreationTimestamp="2026-03-07 01:31:46 +0000 UTC" firstStartedPulling="2026-03-07 01:31:46.609290589 +0000 UTC m=+21.713452391" lastFinishedPulling="2026-03-07 01:31:48.890827124 +0000 UTC m=+23.994988926" observedRunningTime="2026-03-07 01:31:49.144199108 +0000 UTC m=+24.248360910" watchObservedRunningTime="2026-03-07 01:31:51.152425263 +0000 UTC m=+26.256587025"
Mar 7 01:31:52.029388 kubelet[3058]: E0307 01:31:52.028998 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280"
Mar 7 01:31:54.029510 kubelet[3058]: E0307 01:31:54.029470 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280"
Mar 7 01:31:55.240073 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2041802653.mount: Deactivated successfully.
Mar 7 01:31:55.375898 containerd[1718]: time="2026-03-07T01:31:55.375848032Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:31:55.378524 containerd[1718]: time="2026-03-07T01:31:55.378497949Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674"
Mar 7 01:31:55.382261 containerd[1718]: time="2026-03-07T01:31:55.382231426Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:31:55.388022 containerd[1718]: time="2026-03-07T01:31:55.387957580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:31:55.388824 containerd[1718]: time="2026-03-07T01:31:55.388688940Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 4.25930515s"
Mar 7 01:31:55.388824 containerd[1718]: time="2026-03-07T01:31:55.388750500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\""
Mar 7 01:31:55.397201 containerd[1718]: time="2026-03-07T01:31:55.397075052Z" level=info msg="CreateContainer within sandbox \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Mar 7 01:31:55.432470 containerd[1718]: time="2026-03-07T01:31:55.432431419Z" level=info msg="CreateContainer within sandbox \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221\""
Mar 7 01:31:55.434010 containerd[1718]: time="2026-03-07T01:31:55.433877977Z" level=info msg="StartContainer for \"9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221\""
Mar 7 01:31:55.463695 systemd[1]: Started cri-containerd-9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221.scope - libcontainer container 9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221.
Mar 7 01:31:55.493318 containerd[1718]: time="2026-03-07T01:31:55.493185722Z" level=info msg="StartContainer for \"9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221\" returns successfully"
Mar 7 01:31:55.532648 systemd[1]: cri-containerd-9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221.scope: Deactivated successfully.
Mar 7 01:31:56.268940 kubelet[3058]: E0307 01:31:56.029232 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280"
Mar 7 01:31:56.240049 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221-rootfs.mount: Deactivated successfully.
Mar 7 01:31:57.086772 containerd[1718]: time="2026-03-07T01:31:57.086560471Z" level=info msg="shim disconnected" id=9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221 namespace=k8s.io
Mar 7 01:31:57.086772 containerd[1718]: time="2026-03-07T01:31:57.086613471Z" level=warning msg="cleaning up after shim disconnected" id=9db24c02fad23333632fab63d6d3302cf47970b8292245019ed3dac7def23221 namespace=k8s.io
Mar 7 01:31:57.086772 containerd[1718]: time="2026-03-07T01:31:57.086621311Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 01:31:57.143155 containerd[1718]: time="2026-03-07T01:31:57.142508099Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 7 01:31:58.028997 kubelet[3058]: E0307 01:31:58.028955 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280"
Mar 7 01:31:59.337929 containerd[1718]: time="2026-03-07T01:31:59.337878423Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:31:59.340451 containerd[1718]: time="2026-03-07T01:31:59.340419344Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216"
Mar 7 01:31:59.343624 containerd[1718]: time="2026-03-07T01:31:59.343576705Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:31:59.349030 containerd[1718]: time="2026-03-07T01:31:59.348790907Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:31:59.349538 containerd[1718]: time="2026-03-07T01:31:59.349509027Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 2.206959528s"
Mar 7 01:31:59.349620 containerd[1718]: time="2026-03-07T01:31:59.349539027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\""
Mar 7 01:31:59.358213 containerd[1718]: time="2026-03-07T01:31:59.358166990Z" level=info msg="CreateContainer within sandbox \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 7 01:31:59.393857 containerd[1718]: time="2026-03-07T01:31:59.393729203Z" level=info msg="CreateContainer within sandbox \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079\""
Mar 7 01:31:59.396287 containerd[1718]: time="2026-03-07T01:31:59.394864923Z" level=info msg="StartContainer for \"1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079\""
Mar 7 01:31:59.424696 systemd[1]: Started cri-containerd-1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079.scope - libcontainer container 1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079.
Mar 7 01:31:59.454214 containerd[1718]: time="2026-03-07T01:31:59.454170065Z" level=info msg="StartContainer for \"1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079\" returns successfully"
Mar 7 01:32:00.029309 kubelet[3058]: E0307 01:32:00.029258 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280"
Mar 7 01:32:00.635117 containerd[1718]: time="2026-03-07T01:32:00.635072363Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 7 01:32:00.637819 systemd[1]: cri-containerd-1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079.scope: Deactivated successfully.
Mar 7 01:32:00.659501 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079-rootfs.mount: Deactivated successfully.
Mar 7 01:32:00.718772 kubelet[3058]: I0307 01:32:00.718737 3058 kubelet_node_status.go:427] "Fast updating node status as it just became ready"
Mar 7 01:32:01.487615 systemd[1]: Created slice kubepods-burstable-podf1de7e3a_82ca_4e43_88c8_75d72af93053.slice - libcontainer container kubepods-burstable-podf1de7e3a_82ca_4e43_88c8_75d72af93053.slice.
Mar 7 01:32:01.500311 containerd[1718]: time="2026-03-07T01:32:01.500191903Z" level=info msg="shim disconnected" id=1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079 namespace=k8s.io
Mar 7 01:32:01.500311 containerd[1718]: time="2026-03-07T01:32:01.500242103Z" level=warning msg="cleaning up after shim disconnected" id=1aeadc3efa1b5bb14c143d5c0efd53bc0ba89f13de71a9f0664f1c92a7d86079 namespace=k8s.io
Mar 7 01:32:01.500311 containerd[1718]: time="2026-03-07T01:32:01.500250223Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 01:32:01.512977 systemd[1]: Created slice kubepods-besteffort-pod66b3805e_b65e_4139_b6a9_34acf12547d3.slice - libcontainer container kubepods-besteffort-pod66b3805e_b65e_4139_b6a9_34acf12547d3.slice.
Mar 7 01:32:01.525962 systemd[1]: Created slice kubepods-besteffort-pod959c57a8_5ea2_429c_aa5b_3c7b113e7280.slice - libcontainer container kubepods-besteffort-pod959c57a8_5ea2_429c_aa5b_3c7b113e7280.slice.
Mar 7 01:32:01.535629 systemd[1]: Created slice kubepods-burstable-pod4054fdd0_0f08_4bdd_ad33_94fb06421936.slice - libcontainer container kubepods-burstable-pod4054fdd0_0f08_4bdd_ad33_94fb06421936.slice.
Mar 7 01:32:01.540578 containerd[1718]: time="2026-03-07T01:32:01.539980271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-d5j7z,Uid:959c57a8-5ea2-429c-aa5b-3c7b113e7280,Namespace:calico-system,Attempt:0,}"
Mar 7 01:32:01.541501 systemd[1]: Created slice kubepods-besteffort-podaa61cab2_0011_4c46_b66c_4d054b7336c6.slice - libcontainer container kubepods-besteffort-podaa61cab2_0011_4c46_b66c_4d054b7336c6.slice.
Mar 7 01:32:01.553353 systemd[1]: Created slice kubepods-besteffort-pod4baa109f_be88_432a_8b8a_fa833dcc95d3.slice - libcontainer container kubepods-besteffort-pod4baa109f_be88_432a_8b8a_fa833dcc95d3.slice.
Mar 7 01:32:01.560976 systemd[1]: Created slice kubepods-besteffort-pod491bd089_80af_49d7_8a6a_aa3fd4f9da71.slice - libcontainer container kubepods-besteffort-pod491bd089_80af_49d7_8a6a_aa3fd4f9da71.slice.
Mar 7 01:32:01.567021 systemd[1]: Created slice kubepods-besteffort-pod94207163_c370_4487_84eb_4023d7e5f07c.slice - libcontainer container kubepods-besteffort-pod94207163_c370_4487_84eb_4023d7e5f07c.slice.
Mar 7 01:32:01.604040 kubelet[3058]: I0307 01:32:01.603443 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1de7e3a-82ca-4e43-88c8-75d72af93053-config-volume\") pod \"coredns-7d764666f9-8ttj2\" (UID: \"f1de7e3a-82ca-4e43-88c8-75d72af93053\") " pod="kube-system/coredns-7d764666f9-8ttj2"
Mar 7 01:32:01.604040 kubelet[3058]: I0307 01:32:01.603488 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-backend-key-pair\") pod \"whisker-6b744bddfc-zvqxz\" (UID: \"66b3805e-b65e-4139-b6a9-34acf12547d3\") " pod="calico-system/whisker-6b744bddfc-zvqxz"
Mar 7 01:32:01.604040 kubelet[3058]: I0307 01:32:01.603512 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-ca-bundle\") pod \"whisker-6b744bddfc-zvqxz\" (UID: \"66b3805e-b65e-4139-b6a9-34acf12547d3\") " pod="calico-system/whisker-6b744bddfc-zvqxz"
Mar 7 01:32:01.604040 kubelet[3058]: I0307 01:32:01.603529 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp9cv\" (UniqueName: \"kubernetes.io/projected/66b3805e-b65e-4139-b6a9-34acf12547d3-kube-api-access-pp9cv\") pod \"whisker-6b744bddfc-zvqxz\" (UID: \"66b3805e-b65e-4139-b6a9-34acf12547d3\") " pod="calico-system/whisker-6b744bddfc-zvqxz"
Mar 7 01:32:01.604534 kubelet[3058]: I0307 01:32:01.603555 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t8fl\" (UniqueName: \"kubernetes.io/projected/f1de7e3a-82ca-4e43-88c8-75d72af93053-kube-api-access-2t8fl\") pod \"coredns-7d764666f9-8ttj2\" (UID: \"f1de7e3a-82ca-4e43-88c8-75d72af93053\") " pod="kube-system/coredns-7d764666f9-8ttj2"
Mar 7 01:32:01.604534 kubelet[3058]: I0307 01:32:01.603574 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-nginx-config\") pod \"whisker-6b744bddfc-zvqxz\" (UID: \"66b3805e-b65e-4139-b6a9-34acf12547d3\") " pod="calico-system/whisker-6b744bddfc-zvqxz"
Mar 7 01:32:01.636507 containerd[1718]: time="2026-03-07T01:32:01.636448153Z" level=error msg="Failed to destroy network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 7 01:32:01.638396 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e-shm.mount: Deactivated successfully.
Mar 7 01:32:01.641418 containerd[1718]: time="2026-03-07T01:32:01.639579390Z" level=error msg="encountered an error cleaning up failed sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 7 01:32:01.641418 containerd[1718]: time="2026-03-07T01:32:01.639658510Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-d5j7z,Uid:959c57a8-5ea2-429c-aa5b-3c7b113e7280,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 7 01:32:01.641534 kubelet[3058]: E0307 01:32:01.640333 3058 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 7 01:32:01.641534 kubelet[3058]: E0307 01:32:01.640395 3058 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-d5j7z"
Mar 7 01:32:01.641534 kubelet[3058]: E0307 01:32:01.640413 3058 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-d5j7z"
Mar 7 01:32:01.641683 kubelet[3058]: E0307 01:32:01.640459 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-d5j7z_calico-system(959c57a8-5ea2-429c-aa5b-3c7b113e7280)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-d5j7z_calico-system(959c57a8-5ea2-429c-aa5b-3c7b113e7280)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280"
Mar 7 01:32:01.704485 kubelet[3058]: I0307 01:32:01.704436 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa61cab2-0011-4c46-b66c-4d054b7336c6-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-nnkbr\" (UID: \"aa61cab2-0011-4c46-b66c-4d054b7336c6\") " pod="calico-system/goldmane-9f7667bb8-nnkbr"
Mar 7 01:32:01.704485 kubelet[3058]: I0307 01:32:01.704483 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m648\" (UniqueName: \"kubernetes.io/projected/aa61cab2-0011-4c46-b66c-4d054b7336c6-kube-api-access-5m648\") pod \"goldmane-9f7667bb8-nnkbr\" (UID: \"aa61cab2-0011-4c46-b66c-4d054b7336c6\") " pod="calico-system/goldmane-9f7667bb8-nnkbr"
Mar 7 01:32:01.704485 kubelet[3058]: I0307 01:32:01.704500 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nf7c\" (UniqueName: \"kubernetes.io/projected/94207163-c370-4487-84eb-4023d7e5f07c-kube-api-access-5nf7c\") pod \"calico-apiserver-fb8dfc889-88kxk\" (UID: \"94207163-c370-4487-84eb-4023d7e5f07c\") " pod="calico-system/calico-apiserver-fb8dfc889-88kxk"
Mar 7 01:32:01.704701 kubelet[3058]: I0307 01:32:01.704517 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4baa109f-be88-432a-8b8a-fa833dcc95d3-calico-apiserver-certs\") pod \"calico-apiserver-fb8dfc889-wk8q8\" (UID: \"4baa109f-be88-432a-8b8a-fa833dcc95d3\") " pod="calico-system/calico-apiserver-fb8dfc889-wk8q8"
Mar 7 01:32:01.704701 kubelet[3058]: I0307 01:32:01.704564 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvfmx\" (UniqueName: \"kubernetes.io/projected/491bd089-80af-49d7-8a6a-aa3fd4f9da71-kube-api-access-zvfmx\") pod \"calico-kube-controllers-8499d985fb-scddx\" (UID: \"491bd089-80af-49d7-8a6a-aa3fd4f9da71\") " pod="calico-system/calico-kube-controllers-8499d985fb-scddx"
Mar 7 01:32:01.704701 kubelet[3058]: I0307 01:32:01.704582 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/94207163-c370-4487-84eb-4023d7e5f07c-calico-apiserver-certs\") pod \"calico-apiserver-fb8dfc889-88kxk\" (UID: \"94207163-c370-4487-84eb-4023d7e5f07c\") " pod="calico-system/calico-apiserver-fb8dfc889-88kxk"
Mar 7 01:32:01.704701 kubelet[3058]: I0307 01:32:01.704647 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2vb6\" (UniqueName: \"kubernetes.io/projected/4054fdd0-0f08-4bdd-ad33-94fb06421936-kube-api-access-d2vb6\") pod \"coredns-7d764666f9-vbqm5\" (UID: \"4054fdd0-0f08-4bdd-ad33-94fb06421936\") " pod="kube-system/coredns-7d764666f9-vbqm5"
Mar 7 01:32:01.704701 kubelet[3058]: I0307 01:32:01.704666 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa61cab2-0011-4c46-b66c-4d054b7336c6-config\") pod \"goldmane-9f7667bb8-nnkbr\" (UID: \"aa61cab2-0011-4c46-b66c-4d054b7336c6\") " pod="calico-system/goldmane-9f7667bb8-nnkbr"
Mar 7 01:32:01.704816 kubelet[3058]: I0307 01:32:01.704691 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491bd089-80af-49d7-8a6a-aa3fd4f9da71-tigera-ca-bundle\") pod \"calico-kube-controllers-8499d985fb-scddx\" (UID: \"491bd089-80af-49d7-8a6a-aa3fd4f9da71\") " pod="calico-system/calico-kube-controllers-8499d985fb-scddx"
Mar 7 01:32:01.704816 kubelet[3058]: I0307 01:32:01.704707 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzn6h\" (UniqueName: \"kubernetes.io/projected/4baa109f-be88-432a-8b8a-fa833dcc95d3-kube-api-access-lzn6h\") pod \"calico-apiserver-fb8dfc889-wk8q8\" (UID: \"4baa109f-be88-432a-8b8a-fa833dcc95d3\") " pod="calico-system/calico-apiserver-fb8dfc889-wk8q8"
Mar 7 01:32:01.704816 kubelet[3058]: I0307 01:32:01.704726 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4054fdd0-0f08-4bdd-ad33-94fb06421936-config-volume\") pod \"coredns-7d764666f9-vbqm5\" (UID: \"4054fdd0-0f08-4bdd-ad33-94fb06421936\") " pod="kube-system/coredns-7d764666f9-vbqm5"
Mar 7 01:32:01.704816 kubelet[3058]: I0307 01:32:01.704739 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/aa61cab2-0011-4c46-b66c-4d054b7336c6-goldmane-key-pair\") pod \"goldmane-9f7667bb8-nnkbr\" (UID: \"aa61cab2-0011-4c46-b66c-4d054b7336c6\") " pod="calico-system/goldmane-9f7667bb8-nnkbr" Mar 7 01:32:01.798921 containerd[1718]: time="2026-03-07T01:32:01.798228062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-8ttj2,Uid:f1de7e3a-82ca-4e43-88c8-75d72af93053,Namespace:kube-system,Attempt:0,}" Mar 7 01:32:01.841607 containerd[1718]: time="2026-03-07T01:32:01.840070908Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b744bddfc-zvqxz,Uid:66b3805e-b65e-4139-b6a9-34acf12547d3,Namespace:calico-system,Attempt:0,}" Mar 7 01:32:01.851981 containerd[1718]: time="2026-03-07T01:32:01.851745698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-vbqm5,Uid:4054fdd0-0f08-4bdd-ad33-94fb06421936,Namespace:kube-system,Attempt:0,}" Mar 7 01:32:01.858896 containerd[1718]: time="2026-03-07T01:32:01.858667453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-nnkbr,Uid:aa61cab2-0011-4c46-b66c-4d054b7336c6,Namespace:calico-system,Attempt:0,}" Mar 7 01:32:01.864234 containerd[1718]: time="2026-03-07T01:32:01.863695569Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb8dfc889-wk8q8,Uid:4baa109f-be88-432a-8b8a-fa833dcc95d3,Namespace:calico-system,Attempt:0,}" Mar 7 01:32:01.871086 containerd[1718]: time="2026-03-07T01:32:01.870853963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8499d985fb-scddx,Uid:491bd089-80af-49d7-8a6a-aa3fd4f9da71,Namespace:calico-system,Attempt:0,}" Mar 7 01:32:01.891743 containerd[1718]: time="2026-03-07T01:32:01.891458226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb8dfc889-88kxk,Uid:94207163-c370-4487-84eb-4023d7e5f07c,Namespace:calico-system,Attempt:0,}" Mar 7 01:32:01.902713 
containerd[1718]: time="2026-03-07T01:32:01.902671857Z" level=error msg="Failed to destroy network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:01.903102 containerd[1718]: time="2026-03-07T01:32:01.903079137Z" level=error msg="encountered an error cleaning up failed sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:01.904557 containerd[1718]: time="2026-03-07T01:32:01.903190057Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-8ttj2,Uid:f1de7e3a-82ca-4e43-88c8-75d72af93053,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:01.904651 kubelet[3058]: E0307 01:32:01.903376 3058 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:01.904651 kubelet[3058]: E0307 01:32:01.903433 3058 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-8ttj2" Mar 7 01:32:01.904651 kubelet[3058]: E0307 01:32:01.903452 3058 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-8ttj2" Mar 7 01:32:01.904739 kubelet[3058]: E0307 01:32:01.903498 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-8ttj2_kube-system(f1de7e3a-82ca-4e43-88c8-75d72af93053)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-8ttj2_kube-system(f1de7e3a-82ca-4e43-88c8-75d72af93053)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-8ttj2" podUID="f1de7e3a-82ca-4e43-88c8-75d72af93053" Mar 7 01:32:02.044962 containerd[1718]: time="2026-03-07T01:32:02.044913942Z" level=error msg="Failed to destroy network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.045525 
containerd[1718]: time="2026-03-07T01:32:02.045436982Z" level=error msg="encountered an error cleaning up failed sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.045525 containerd[1718]: time="2026-03-07T01:32:02.045489262Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b744bddfc-zvqxz,Uid:66b3805e-b65e-4139-b6a9-34acf12547d3,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.046261 kubelet[3058]: E0307 01:32:02.045856 3058 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.046261 kubelet[3058]: E0307 01:32:02.045907 3058 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b744bddfc-zvqxz" Mar 7 01:32:02.046261 kubelet[3058]: E0307 01:32:02.045933 3058 kuberuntime_manager.go:1558] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b744bddfc-zvqxz" Mar 7 01:32:02.046400 kubelet[3058]: E0307 01:32:02.045982 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6b744bddfc-zvqxz_calico-system(66b3805e-b65e-4139-b6a9-34acf12547d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6b744bddfc-zvqxz_calico-system(66b3805e-b65e-4139-b6a9-34acf12547d3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6b744bddfc-zvqxz" podUID="66b3805e-b65e-4139-b6a9-34acf12547d3" Mar 7 01:32:02.123567 containerd[1718]: time="2026-03-07T01:32:02.123446959Z" level=error msg="Failed to destroy network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.124957 containerd[1718]: time="2026-03-07T01:32:02.124305678Z" level=error msg="encountered an error cleaning up failed sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Mar 7 01:32:02.124957 containerd[1718]: time="2026-03-07T01:32:02.124366758Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-vbqm5,Uid:4054fdd0-0f08-4bdd-ad33-94fb06421936,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.125100 kubelet[3058]: E0307 01:32:02.124579 3058 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.125100 kubelet[3058]: E0307 01:32:02.124633 3058 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-vbqm5" Mar 7 01:32:02.125100 kubelet[3058]: E0307 01:32:02.124651 3058 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-vbqm5" 
Mar 7 01:32:02.125182 kubelet[3058]: E0307 01:32:02.124699 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-vbqm5_kube-system(4054fdd0-0f08-4bdd-ad33-94fb06421936)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-vbqm5_kube-system(4054fdd0-0f08-4bdd-ad33-94fb06421936)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-vbqm5" podUID="4054fdd0-0f08-4bdd-ad33-94fb06421936" Mar 7 01:32:02.128317 containerd[1718]: time="2026-03-07T01:32:02.128186035Z" level=error msg="Failed to destroy network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.128725 containerd[1718]: time="2026-03-07T01:32:02.128695754Z" level=error msg="encountered an error cleaning up failed sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.128833 containerd[1718]: time="2026-03-07T01:32:02.128810114Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-nnkbr,Uid:aa61cab2-0011-4c46-b66c-4d054b7336c6,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.129288 kubelet[3058]: E0307 01:32:02.129103 3058 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.129288 kubelet[3058]: E0307 01:32:02.129173 3058 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-nnkbr" Mar 7 01:32:02.129288 kubelet[3058]: E0307 01:32:02.129194 3058 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-nnkbr" Mar 7 01:32:02.129418 kubelet[3058]: E0307 01:32:02.129245 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-nnkbr_calico-system(aa61cab2-0011-4c46-b66c-4d054b7336c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-nnkbr_calico-system(aa61cab2-0011-4c46-b66c-4d054b7336c6)\\\": rpc error: 
code = Unknown desc = failed to setup network for sandbox \\\"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-nnkbr" podUID="aa61cab2-0011-4c46-b66c-4d054b7336c6" Mar 7 01:32:02.137387 containerd[1718]: time="2026-03-07T01:32:02.137335307Z" level=error msg="Failed to destroy network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.137925 containerd[1718]: time="2026-03-07T01:32:02.137877067Z" level=error msg="encountered an error cleaning up failed sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.138003 containerd[1718]: time="2026-03-07T01:32:02.137955387Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8499d985fb-scddx,Uid:491bd089-80af-49d7-8a6a-aa3fd4f9da71,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.139153 kubelet[3058]: E0307 01:32:02.138130 3058 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.139153 kubelet[3058]: E0307 01:32:02.138176 3058 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8499d985fb-scddx" Mar 7 01:32:02.139153 kubelet[3058]: E0307 01:32:02.138192 3058 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8499d985fb-scddx" Mar 7 01:32:02.139280 kubelet[3058]: E0307 01:32:02.138234 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8499d985fb-scddx_calico-system(491bd089-80af-49d7-8a6a-aa3fd4f9da71)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8499d985fb-scddx_calico-system(491bd089-80af-49d7-8a6a-aa3fd4f9da71)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8499d985fb-scddx" podUID="491bd089-80af-49d7-8a6a-aa3fd4f9da71" Mar 7 01:32:02.145918 containerd[1718]: time="2026-03-07T01:32:02.145881340Z" level=error msg="Failed to destroy network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.146331 containerd[1718]: time="2026-03-07T01:32:02.146302660Z" level=error msg="encountered an error cleaning up failed sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.146463 containerd[1718]: time="2026-03-07T01:32:02.146420180Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb8dfc889-wk8q8,Uid:4baa109f-be88-432a-8b8a-fa833dcc95d3,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.146862 kubelet[3058]: E0307 01:32:02.146749 3058 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.146862 kubelet[3058]: E0307 
01:32:02.146804 3058 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-fb8dfc889-wk8q8" Mar 7 01:32:02.146862 kubelet[3058]: E0307 01:32:02.146821 3058 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-fb8dfc889-wk8q8" Mar 7 01:32:02.147259 kubelet[3058]: E0307 01:32:02.147022 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-fb8dfc889-wk8q8_calico-system(4baa109f-be88-432a-8b8a-fa833dcc95d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-fb8dfc889-wk8q8_calico-system(4baa109f-be88-432a-8b8a-fa833dcc95d3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-fb8dfc889-wk8q8" podUID="4baa109f-be88-432a-8b8a-fa833dcc95d3" Mar 7 01:32:02.152405 containerd[1718]: time="2026-03-07T01:32:02.152368695Z" level=error msg="Failed to destroy network for sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.154142 containerd[1718]: time="2026-03-07T01:32:02.154066334Z" level=error msg="encountered an error cleaning up failed sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.154243 containerd[1718]: time="2026-03-07T01:32:02.154140294Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb8dfc889-88kxk,Uid:94207163-c370-4487-84eb-4023d7e5f07c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.154771 kubelet[3058]: E0307 01:32:02.154421 3058 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.154771 kubelet[3058]: E0307 01:32:02.154473 3058 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-fb8dfc889-88kxk" Mar 7 01:32:02.154771 kubelet[3058]: E0307 01:32:02.154496 3058 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-fb8dfc889-88kxk" Mar 7 01:32:02.154907 kubelet[3058]: E0307 01:32:02.154571 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-fb8dfc889-88kxk_calico-system(94207163-c370-4487-84eb-4023d7e5f07c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-fb8dfc889-88kxk_calico-system(94207163-c370-4487-84eb-4023d7e5f07c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-fb8dfc889-88kxk" podUID="94207163-c370-4487-84eb-4023d7e5f07c" Mar 7 01:32:02.157221 kubelet[3058]: I0307 01:32:02.157146 3058 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:02.158500 containerd[1718]: time="2026-03-07T01:32:02.158396730Z" level=info msg="StopPodSandbox for \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\"" Mar 7 01:32:02.161167 containerd[1718]: time="2026-03-07T01:32:02.160257009Z" level=info msg="Ensure that sandbox 
1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc in task-service has been cleanup successfully" Mar 7 01:32:02.162717 kubelet[3058]: I0307 01:32:02.162694 3058 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:02.163596 containerd[1718]: time="2026-03-07T01:32:02.163459086Z" level=info msg="StopPodSandbox for \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\"" Mar 7 01:32:02.164559 containerd[1718]: time="2026-03-07T01:32:02.164507085Z" level=info msg="Ensure that sandbox 223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13 in task-service has been cleanup successfully" Mar 7 01:32:02.166420 kubelet[3058]: I0307 01:32:02.166401 3058 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:02.169187 containerd[1718]: time="2026-03-07T01:32:02.167371323Z" level=info msg="StopPodSandbox for \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\"" Mar 7 01:32:02.170881 containerd[1718]: time="2026-03-07T01:32:02.170852000Z" level=info msg="Ensure that sandbox 794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5 in task-service has been cleanup successfully" Mar 7 01:32:02.174174 kubelet[3058]: I0307 01:32:02.173226 3058 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:02.179426 containerd[1718]: time="2026-03-07T01:32:02.179389753Z" level=info msg="StopPodSandbox for \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\"" Mar 7 01:32:02.179888 containerd[1718]: time="2026-03-07T01:32:02.179868153Z" level=info msg="Ensure that sandbox fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00 in task-service has been cleanup successfully" Mar 7 
01:32:02.186945 kubelet[3058]: I0307 01:32:02.186917 3058 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:02.193223 containerd[1718]: time="2026-03-07T01:32:02.192447503Z" level=info msg="StopPodSandbox for \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\"" Mar 7 01:32:02.193223 containerd[1718]: time="2026-03-07T01:32:02.192643143Z" level=info msg="Ensure that sandbox 58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e in task-service has been cleanup successfully" Mar 7 01:32:02.196694 containerd[1718]: time="2026-03-07T01:32:02.196656699Z" level=info msg="CreateContainer within sandbox \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 7 01:32:02.198533 kubelet[3058]: I0307 01:32:02.198157 3058 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:02.202184 containerd[1718]: time="2026-03-07T01:32:02.202145415Z" level=info msg="StopPodSandbox for \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\"" Mar 7 01:32:02.202326 containerd[1718]: time="2026-03-07T01:32:02.202306935Z" level=info msg="Ensure that sandbox c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e in task-service has been cleanup successfully" Mar 7 01:32:02.204834 kubelet[3058]: I0307 01:32:02.204806 3058 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:02.207924 containerd[1718]: time="2026-03-07T01:32:02.207888410Z" level=info msg="StopPodSandbox for \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\"" Mar 7 01:32:02.208076 containerd[1718]: time="2026-03-07T01:32:02.208055810Z" level=info msg="Ensure that 
sandbox 06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e in task-service has been cleanup successfully" Mar 7 01:32:02.264753 containerd[1718]: time="2026-03-07T01:32:02.264695164Z" level=error msg="StopPodSandbox for \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\" failed" error="failed to destroy network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.265268 kubelet[3058]: E0307 01:32:02.264938 3058 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:02.265268 kubelet[3058]: E0307 01:32:02.264998 3058 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e"} Mar 7 01:32:02.265268 kubelet[3058]: E0307 01:32:02.265049 3058 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"4baa109f-be88-432a-8b8a-fa833dcc95d3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 01:32:02.265268 kubelet[3058]: E0307 01:32:02.265081 3058 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"4baa109f-be88-432a-8b8a-fa833dcc95d3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-fb8dfc889-wk8q8" podUID="4baa109f-be88-432a-8b8a-fa833dcc95d3" Mar 7 01:32:02.273353 containerd[1718]: time="2026-03-07T01:32:02.272913958Z" level=error msg="StopPodSandbox for \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\" failed" error="failed to destroy network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.274099 kubelet[3058]: E0307 01:32:02.273163 3058 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:02.274099 kubelet[3058]: E0307 01:32:02.273222 3058 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00"} Mar 7 01:32:02.274099 kubelet[3058]: E0307 01:32:02.273259 3058 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"f1de7e3a-82ca-4e43-88c8-75d72af93053\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 01:32:02.274099 kubelet[3058]: E0307 01:32:02.273301 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"f1de7e3a-82ca-4e43-88c8-75d72af93053\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-8ttj2" podUID="f1de7e3a-82ca-4e43-88c8-75d72af93053" Mar 7 01:32:02.281842 containerd[1718]: time="2026-03-07T01:32:02.281794110Z" level=error msg="StopPodSandbox for \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\" failed" error="failed to destroy network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.282220 kubelet[3058]: E0307 01:32:02.282188 3058 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:02.282357 kubelet[3058]: E0307 01:32:02.282339 3058 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5"} Mar 7 01:32:02.282458 kubelet[3058]: E0307 01:32:02.282445 3058 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"4054fdd0-0f08-4bdd-ad33-94fb06421936\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 01:32:02.282649 kubelet[3058]: E0307 01:32:02.282628 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"4054fdd0-0f08-4bdd-ad33-94fb06421936\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-vbqm5" podUID="4054fdd0-0f08-4bdd-ad33-94fb06421936" Mar 7 01:32:02.285309 containerd[1718]: time="2026-03-07T01:32:02.285266148Z" level=error msg="StopPodSandbox for \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\" failed" error="failed to destroy network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 
7 01:32:02.285714 kubelet[3058]: E0307 01:32:02.285574 3058 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:02.285714 kubelet[3058]: E0307 01:32:02.285613 3058 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc"} Mar 7 01:32:02.285714 kubelet[3058]: E0307 01:32:02.285652 3058 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"491bd089-80af-49d7-8a6a-aa3fd4f9da71\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 01:32:02.285714 kubelet[3058]: E0307 01:32:02.285675 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"491bd089-80af-49d7-8a6a-aa3fd4f9da71\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8499d985fb-scddx" podUID="491bd089-80af-49d7-8a6a-aa3fd4f9da71" Mar 7 01:32:02.296418 
containerd[1718]: time="2026-03-07T01:32:02.296287539Z" level=error msg="StopPodSandbox for \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\" failed" error="failed to destroy network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.296824 kubelet[3058]: E0307 01:32:02.296664 3058 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:02.296824 kubelet[3058]: E0307 01:32:02.296715 3058 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13"} Mar 7 01:32:02.296824 kubelet[3058]: E0307 01:32:02.296743 3058 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"aa61cab2-0011-4c46-b66c-4d054b7336c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 01:32:02.296824 kubelet[3058]: E0307 01:32:02.296769 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"aa61cab2-0011-4c46-b66c-4d054b7336c6\" with KillPodSandboxError: 
\"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-nnkbr" podUID="aa61cab2-0011-4c46-b66c-4d054b7336c6" Mar 7 01:32:02.298169 containerd[1718]: time="2026-03-07T01:32:02.297634218Z" level=error msg="StopPodSandbox for \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\" failed" error="failed to destroy network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.298432 kubelet[3058]: E0307 01:32:02.298319 3058 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:02.298432 kubelet[3058]: E0307 01:32:02.298360 3058 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e"} Mar 7 01:32:02.298432 kubelet[3058]: E0307 01:32:02.298381 3058 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"959c57a8-5ea2-429c-aa5b-3c7b113e7280\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 01:32:02.298432 kubelet[3058]: E0307 01:32:02.298405 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"959c57a8-5ea2-429c-aa5b-3c7b113e7280\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-d5j7z" podUID="959c57a8-5ea2-429c-aa5b-3c7b113e7280" Mar 7 01:32:02.298713 containerd[1718]: time="2026-03-07T01:32:02.298686857Z" level=error msg="StopPodSandbox for \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\" failed" error="failed to destroy network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:32:02.299035 kubelet[3058]: E0307 01:32:02.298985 3058 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:02.299089 kubelet[3058]: E0307 01:32:02.299038 3058 
kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e"} Mar 7 01:32:02.299089 kubelet[3058]: E0307 01:32:02.299065 3058 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"66b3805e-b65e-4139-b6a9-34acf12547d3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 01:32:02.299158 kubelet[3058]: E0307 01:32:02.299099 3058 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"66b3805e-b65e-4139-b6a9-34acf12547d3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6b744bddfc-zvqxz" podUID="66b3805e-b65e-4139-b6a9-34acf12547d3" Mar 7 01:32:02.303061 containerd[1718]: time="2026-03-07T01:32:02.303027013Z" level=info msg="CreateContainer within sandbox \"afcda6d4e63ded693c654273e91d4c6404d6a9d7c7f20e181effad11e2f299d9\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a39cd0747caf4cdb7cc083dfdc5bb4e9b31ce961edca8ff0a32ef550a2a37499\"" Mar 7 01:32:02.304944 containerd[1718]: time="2026-03-07T01:32:02.304917572Z" level=info msg="StartContainer for \"a39cd0747caf4cdb7cc083dfdc5bb4e9b31ce961edca8ff0a32ef550a2a37499\"" Mar 7 01:32:02.341864 systemd[1]: Started 
cri-containerd-a39cd0747caf4cdb7cc083dfdc5bb4e9b31ce961edca8ff0a32ef550a2a37499.scope - libcontainer container a39cd0747caf4cdb7cc083dfdc5bb4e9b31ce961edca8ff0a32ef550a2a37499. Mar 7 01:32:02.375829 containerd[1718]: time="2026-03-07T01:32:02.375719634Z" level=info msg="StartContainer for \"a39cd0747caf4cdb7cc083dfdc5bb4e9b31ce961edca8ff0a32ef550a2a37499\" returns successfully" Mar 7 01:32:03.207971 kubelet[3058]: I0307 01:32:03.207930 3058 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:03.209305 containerd[1718]: time="2026-03-07T01:32:03.208875600Z" level=info msg="StopPodSandbox for \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\"" Mar 7 01:32:03.209305 containerd[1718]: time="2026-03-07T01:32:03.209124720Z" level=info msg="Ensure that sandbox 0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0 in task-service has been cleanup successfully" Mar 7 01:32:03.214024 containerd[1718]: time="2026-03-07T01:32:03.212981557Z" level=info msg="StopPodSandbox for \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\"" Mar 7 01:32:03.241173 kubelet[3058]: I0307 01:32:03.241104 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-j7g6q" podStartSLOduration=1.782224501 podStartE2EDuration="17.241006694s" podCreationTimestamp="2026-03-07 01:31:46 +0000 UTC" firstStartedPulling="2026-03-07 01:31:46.701328216 +0000 UTC m=+21.805490018" lastFinishedPulling="2026-03-07 01:32:02.160110409 +0000 UTC m=+37.264272211" observedRunningTime="2026-03-07 01:32:03.240851614 +0000 UTC m=+38.345013416" watchObservedRunningTime="2026-03-07 01:32:03.241006694 +0000 UTC m=+38.345168496" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.299 [INFO][4382] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 
01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.300 [INFO][4382] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" iface="eth0" netns="/var/run/netns/cni-06c1d7d7-6867-00af-2b50-6006879ce390" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.300 [INFO][4382] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" iface="eth0" netns="/var/run/netns/cni-06c1d7d7-6867-00af-2b50-6006879ce390" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.300 [INFO][4382] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" iface="eth0" netns="/var/run/netns/cni-06c1d7d7-6867-00af-2b50-6006879ce390" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.300 [INFO][4382] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.300 [INFO][4382] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.325 [INFO][4420] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.325 [INFO][4420] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.325 [INFO][4420] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.341 [WARNING][4420] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.341 [INFO][4420] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.343 [INFO][4420] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:03.350157 containerd[1718]: 2026-03-07 01:32:03.347 [INFO][4382] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:03.350743 containerd[1718]: time="2026-03-07T01:32:03.350713885Z" level=info msg="TearDown network for sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\" successfully" Mar 7 01:32:03.350820 containerd[1718]: time="2026-03-07T01:32:03.350803325Z" level=info msg="StopPodSandbox for \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\" returns successfully" Mar 7 01:32:03.354040 systemd[1]: run-netns-cni\x2d06c1d7d7\x2d6867\x2d00af\x2d2b50\x2d6006879ce390.mount: Deactivated successfully. 
Mar 7 01:32:03.361056 containerd[1718]: time="2026-03-07T01:32:03.360992837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb8dfc889-88kxk,Uid:94207163-c370-4487-84eb-4023d7e5f07c,Namespace:calico-system,Attempt:1,}" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.305 [INFO][4396] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.306 [INFO][4396] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" iface="eth0" netns="/var/run/netns/cni-3fff71d5-6bed-040f-a08a-3457c579ddd1" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.306 [INFO][4396] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" iface="eth0" netns="/var/run/netns/cni-3fff71d5-6bed-040f-a08a-3457c579ddd1" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.306 [INFO][4396] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" iface="eth0" netns="/var/run/netns/cni-3fff71d5-6bed-040f-a08a-3457c579ddd1" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.306 [INFO][4396] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.306 [INFO][4396] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.333 [INFO][4425] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.333 [INFO][4425] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.343 [INFO][4425] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.357 [WARNING][4425] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.358 [INFO][4425] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.361 [INFO][4425] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:03.365904 containerd[1718]: 2026-03-07 01:32:03.363 [INFO][4396] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:03.369584 containerd[1718]: time="2026-03-07T01:32:03.367131032Z" level=info msg="TearDown network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\" successfully" Mar 7 01:32:03.369584 containerd[1718]: time="2026-03-07T01:32:03.367644791Z" level=info msg="StopPodSandbox for \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\" returns successfully" Mar 7 01:32:03.369406 systemd[1]: run-netns-cni\x2d3fff71d5\x2d6bed\x2d040f\x2da08a\x2d3457c579ddd1.mount: Deactivated successfully. 
Mar 7 01:32:03.419560 kubelet[3058]: I0307 01:32:03.419162 3058 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/66b3805e-b65e-4139-b6a9-34acf12547d3-kube-api-access-pp9cv\" (UniqueName: \"kubernetes.io/projected/66b3805e-b65e-4139-b6a9-34acf12547d3-kube-api-access-pp9cv\") pod \"66b3805e-b65e-4139-b6a9-34acf12547d3\" (UID: \"66b3805e-b65e-4139-b6a9-34acf12547d3\") " Mar 7 01:32:03.419560 kubelet[3058]: I0307 01:32:03.419209 3058 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-nginx-config\" (UniqueName: \"kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-nginx-config\") pod \"66b3805e-b65e-4139-b6a9-34acf12547d3\" (UID: \"66b3805e-b65e-4139-b6a9-34acf12547d3\") " Mar 7 01:32:03.419560 kubelet[3058]: I0307 01:32:03.419237 3058 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-ca-bundle\") pod \"66b3805e-b65e-4139-b6a9-34acf12547d3\" (UID: \"66b3805e-b65e-4139-b6a9-34acf12547d3\") " Mar 7 01:32:03.419560 kubelet[3058]: I0307 01:32:03.419263 3058 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-backend-key-pair\") pod \"66b3805e-b65e-4139-b6a9-34acf12547d3\" (UID: \"66b3805e-b65e-4139-b6a9-34acf12547d3\") " Mar 7 01:32:03.421185 kubelet[3058]: I0307 01:32:03.421158 3058 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-nginx-config" pod "66b3805e-b65e-4139-b6a9-34acf12547d3" (UID: "66b3805e-b65e-4139-b6a9-34acf12547d3"). 
InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 01:32:03.421708 kubelet[3058]: I0307 01:32:03.421681 3058 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-ca-bundle" pod "66b3805e-b65e-4139-b6a9-34acf12547d3" (UID: "66b3805e-b65e-4139-b6a9-34acf12547d3"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 01:32:03.425033 kubelet[3058]: I0307 01:32:03.424973 3058 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b3805e-b65e-4139-b6a9-34acf12547d3-kube-api-access-pp9cv" pod "66b3805e-b65e-4139-b6a9-34acf12547d3" (UID: "66b3805e-b65e-4139-b6a9-34acf12547d3"). InnerVolumeSpecName "kube-api-access-pp9cv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 7 01:32:03.425319 kubelet[3058]: I0307 01:32:03.425303 3058 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-backend-key-pair" pod "66b3805e-b65e-4139-b6a9-34acf12547d3" (UID: "66b3805e-b65e-4139-b6a9-34acf12547d3"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 7 01:32:03.500900 systemd-networkd[1617]: caliecc9d9aafe6: Link UP Mar 7 01:32:03.503036 systemd-networkd[1617]: caliecc9d9aafe6: Gained carrier Mar 7 01:32:03.519837 kubelet[3058]: I0307 01:32:03.519519 3058 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pp9cv\" (UniqueName: \"kubernetes.io/projected/66b3805e-b65e-4139-b6a9-34acf12547d3-kube-api-access-pp9cv\") on node \"ci-4081.3.6-n-3151c5d0e2\" DevicePath \"\"" Mar 7 01:32:03.519837 kubelet[3058]: I0307 01:32:03.519564 3058 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-nginx-config\") on node \"ci-4081.3.6-n-3151c5d0e2\" DevicePath \"\"" Mar 7 01:32:03.519837 kubelet[3058]: I0307 01:32:03.519574 3058 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-ca-bundle\") on node \"ci-4081.3.6-n-3151c5d0e2\" DevicePath \"\"" Mar 7 01:32:03.519837 kubelet[3058]: I0307 01:32:03.519582 3058 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/66b3805e-b65e-4139-b6a9-34acf12547d3-whisker-backend-key-pair\") on node \"ci-4081.3.6-n-3151c5d0e2\" DevicePath \"\"" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.412 [ERROR][4436] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.428 [INFO][4436] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0 calico-apiserver-fb8dfc889- calico-system 94207163-c370-4487-84eb-4023d7e5f07c 874 0 2026-03-07 01:31:44 
+0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:fb8dfc889 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081.3.6-n-3151c5d0e2 calico-apiserver-fb8dfc889-88kxk eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] caliecc9d9aafe6 [] [] }} ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-88kxk" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.428 [INFO][4436] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-88kxk" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.452 [INFO][4450] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" HandleID="k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.461 [INFO][4450] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" HandleID="k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000273dc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-3151c5d0e2", 
"pod":"calico-apiserver-fb8dfc889-88kxk", "timestamp":"2026-03-07 01:32:03.452202683 +0000 UTC"}, Hostname:"ci-4081.3.6-n-3151c5d0e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003a7080)} Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.461 [INFO][4450] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.461 [INFO][4450] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.461 [INFO][4450] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-3151c5d0e2' Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.463 [INFO][4450] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.468 [INFO][4450] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.474 [INFO][4450] ipam/ipam.go 526: Trying affinity for 192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.475 [INFO][4450] ipam/ipam.go 160: Attempting to load block cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.477 [INFO][4450] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.477 [INFO][4450] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.15.192/26 
handle="k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.478 [INFO][4450] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.483 [INFO][4450] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.15.192/26 handle="k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.491 [INFO][4450] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.15.193/26] block=192.168.15.192/26 handle="k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.491 [INFO][4450] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.15.193/26] handle="k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.491 [INFO][4450] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 7 01:32:03.521319 containerd[1718]: 2026-03-07 01:32:03.491 [INFO][4450] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.15.193/26] IPv6=[] ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" HandleID="k8s-pod-network.c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.521846 containerd[1718]: 2026-03-07 01:32:03.493 [INFO][4436] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-88kxk" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0", GenerateName:"calico-apiserver-fb8dfc889-", Namespace:"calico-system", SelfLink:"", UID:"94207163-c370-4487-84eb-4023d7e5f07c", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb8dfc889", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"", Pod:"calico-apiserver-fb8dfc889-88kxk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.15.193/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliecc9d9aafe6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:03.521846 containerd[1718]: 2026-03-07 01:32:03.493 [INFO][4436] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.15.193/32] ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-88kxk" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.521846 containerd[1718]: 2026-03-07 01:32:03.493 [INFO][4436] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliecc9d9aafe6 ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-88kxk" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.521846 containerd[1718]: 2026-03-07 01:32:03.502 [INFO][4436] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-88kxk" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.521846 containerd[1718]: 2026-03-07 01:32:03.502 [INFO][4436] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-88kxk" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0", GenerateName:"calico-apiserver-fb8dfc889-", Namespace:"calico-system", SelfLink:"", UID:"94207163-c370-4487-84eb-4023d7e5f07c", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb8dfc889", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f", Pod:"calico-apiserver-fb8dfc889-88kxk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.15.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliecc9d9aafe6", MAC:"3a:fb:ed:8c:be:76", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:03.521846 containerd[1718]: 2026-03-07 01:32:03.515 [INFO][4436] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-88kxk" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:03.538233 containerd[1718]: time="2026-03-07T01:32:03.538094094Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:32:03.538233 containerd[1718]: time="2026-03-07T01:32:03.538175533Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:32:03.538233 containerd[1718]: time="2026-03-07T01:32:03.538198173Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:03.539106 containerd[1718]: time="2026-03-07T01:32:03.539038493Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:03.554698 systemd[1]: Started cri-containerd-c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f.scope - libcontainer container c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f. Mar 7 01:32:03.584503 containerd[1718]: time="2026-03-07T01:32:03.584463216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb8dfc889-88kxk,Uid:94207163-c370-4487-84eb-4023d7e5f07c,Namespace:calico-system,Attempt:1,} returns sandbox id \"c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f\"" Mar 7 01:32:03.587491 containerd[1718]: time="2026-03-07T01:32:03.586506294Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 01:32:03.716215 systemd[1]: var-lib-kubelet-pods-66b3805e\x2db65e\x2d4139\x2db6a9\x2d34acf12547d3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dpp9cv.mount: Deactivated successfully. Mar 7 01:32:03.716313 systemd[1]: var-lib-kubelet-pods-66b3805e\x2db65e\x2d4139\x2db6a9\x2d34acf12547d3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 7 01:32:04.233395 systemd[1]: Removed slice kubepods-besteffort-pod66b3805e_b65e_4139_b6a9_34acf12547d3.slice - libcontainer container kubepods-besteffort-pod66b3805e_b65e_4139_b6a9_34acf12547d3.slice. 
Mar 7 01:32:04.329576 kubelet[3058]: I0307 01:32:04.326283 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/22a59094-fbd9-4f1d-9c9a-04a6a36c96fd-whisker-backend-key-pair\") pod \"whisker-c646d54b-d6d47\" (UID: \"22a59094-fbd9-4f1d-9c9a-04a6a36c96fd\") " pod="calico-system/whisker-c646d54b-d6d47" Mar 7 01:32:04.329576 kubelet[3058]: I0307 01:32:04.326331 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/22a59094-fbd9-4f1d-9c9a-04a6a36c96fd-nginx-config\") pod \"whisker-c646d54b-d6d47\" (UID: \"22a59094-fbd9-4f1d-9c9a-04a6a36c96fd\") " pod="calico-system/whisker-c646d54b-d6d47" Mar 7 01:32:04.329576 kubelet[3058]: I0307 01:32:04.326351 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22a59094-fbd9-4f1d-9c9a-04a6a36c96fd-whisker-ca-bundle\") pod \"whisker-c646d54b-d6d47\" (UID: \"22a59094-fbd9-4f1d-9c9a-04a6a36c96fd\") " pod="calico-system/whisker-c646d54b-d6d47" Mar 7 01:32:04.329576 kubelet[3058]: I0307 01:32:04.326382 3058 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm8dk\" (UniqueName: \"kubernetes.io/projected/22a59094-fbd9-4f1d-9c9a-04a6a36c96fd-kube-api-access-rm8dk\") pod \"whisker-c646d54b-d6d47\" (UID: \"22a59094-fbd9-4f1d-9c9a-04a6a36c96fd\") " pod="calico-system/whisker-c646d54b-d6d47" Mar 7 01:32:04.330867 systemd[1]: Created slice kubepods-besteffort-pod22a59094_fbd9_4f1d_9c9a_04a6a36c96fd.slice - libcontainer container kubepods-besteffort-pod22a59094_fbd9_4f1d_9c9a_04a6a36c96fd.slice. 
Mar 7 01:32:04.645647 containerd[1718]: time="2026-03-07T01:32:04.645535277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c646d54b-d6d47,Uid:22a59094-fbd9-4f1d-9c9a-04a6a36c96fd,Namespace:calico-system,Attempt:0,}" Mar 7 01:32:04.785531 systemd-networkd[1617]: calic2862153166: Link UP Mar 7 01:32:04.785728 systemd-networkd[1617]: calic2862153166: Gained carrier Mar 7 01:32:04.796663 systemd-networkd[1617]: caliecc9d9aafe6: Gained IPv6LL Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.702 [ERROR][4627] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.721 [INFO][4627] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0 whisker-c646d54b- calico-system 22a59094-fbd9-4f1d-9c9a-04a6a36c96fd 900 0 2026-03-07 01:32:04 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:c646d54b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081.3.6-n-3151c5d0e2 whisker-c646d54b-d6d47 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calic2862153166 [] [] }} ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Namespace="calico-system" Pod="whisker-c646d54b-d6d47" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.721 [INFO][4627] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Namespace="calico-system" Pod="whisker-c646d54b-d6d47" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" Mar 7 
01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.743 [INFO][4641] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" HandleID="k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.751 [INFO][4641] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" HandleID="k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002737c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-3151c5d0e2", "pod":"whisker-c646d54b-d6d47", "timestamp":"2026-03-07 01:32:04.743029358 +0000 UTC"}, Hostname:"ci-4081.3.6-n-3151c5d0e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030f080)} Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.751 [INFO][4641] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.751 [INFO][4641] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.751 [INFO][4641] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-3151c5d0e2' Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.753 [INFO][4641] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.756 [INFO][4641] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.760 [INFO][4641] ipam/ipam.go 526: Trying affinity for 192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.761 [INFO][4641] ipam/ipam.go 160: Attempting to load block cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.763 [INFO][4641] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.763 [INFO][4641] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.15.192/26 handle="k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.764 [INFO][4641] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935 Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.773 [INFO][4641] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.15.192/26 handle="k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.779 [INFO][4641] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.15.194/26] block=192.168.15.192/26 handle="k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.779 [INFO][4641] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.15.194/26] handle="k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.779 [INFO][4641] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:04.805465 containerd[1718]: 2026-03-07 01:32:04.779 [INFO][4641] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.15.194/26] IPv6=[] ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" HandleID="k8s-pod-network.891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" Mar 7 01:32:04.806051 containerd[1718]: 2026-03-07 01:32:04.781 [INFO][4627] cni-plugin/k8s.go 418: Populated endpoint ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Namespace="calico-system" Pod="whisker-c646d54b-d6d47" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0", GenerateName:"whisker-c646d54b-", Namespace:"calico-system", SelfLink:"", UID:"22a59094-fbd9-4f1d-9c9a-04a6a36c96fd", ResourceVersion:"900", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 32, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"c646d54b", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"", Pod:"whisker-c646d54b-d6d47", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.15.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic2862153166", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:04.806051 containerd[1718]: 2026-03-07 01:32:04.782 [INFO][4627] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.15.194/32] ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Namespace="calico-system" Pod="whisker-c646d54b-d6d47" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" Mar 7 01:32:04.806051 containerd[1718]: 2026-03-07 01:32:04.782 [INFO][4627] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic2862153166 ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Namespace="calico-system" Pod="whisker-c646d54b-d6d47" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" Mar 7 01:32:04.806051 containerd[1718]: 2026-03-07 01:32:04.784 [INFO][4627] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Namespace="calico-system" Pod="whisker-c646d54b-d6d47" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" Mar 7 01:32:04.806051 containerd[1718]: 2026-03-07 01:32:04.788 [INFO][4627] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Namespace="calico-system" Pod="whisker-c646d54b-d6d47" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0", GenerateName:"whisker-c646d54b-", Namespace:"calico-system", SelfLink:"", UID:"22a59094-fbd9-4f1d-9c9a-04a6a36c96fd", ResourceVersion:"900", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 32, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"c646d54b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935", Pod:"whisker-c646d54b-d6d47", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.15.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic2862153166", MAC:"6a:e6:97:8e:0c:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:04.806051 containerd[1718]: 2026-03-07 01:32:04.801 [INFO][4627] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935" Namespace="calico-system" Pod="whisker-c646d54b-d6d47" 
WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--c646d54b--d6d47-eth0" Mar 7 01:32:04.823830 containerd[1718]: time="2026-03-07T01:32:04.823059373Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:32:04.823830 containerd[1718]: time="2026-03-07T01:32:04.823722093Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:32:04.823830 containerd[1718]: time="2026-03-07T01:32:04.823753653Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:04.824143 containerd[1718]: time="2026-03-07T01:32:04.823896013Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:04.851999 systemd[1]: Started cri-containerd-891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935.scope - libcontainer container 891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935. 
Mar 7 01:32:04.916564 containerd[1718]: time="2026-03-07T01:32:04.916453418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c646d54b-d6d47,Uid:22a59094-fbd9-4f1d-9c9a-04a6a36c96fd,Namespace:calico-system,Attempt:0,} returns sandbox id \"891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935\"" Mar 7 01:32:05.042913 kubelet[3058]: I0307 01:32:05.042717 3058 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="66b3805e-b65e-4139-b6a9-34acf12547d3" path="/var/lib/kubelet/pods/66b3805e-b65e-4139-b6a9-34acf12547d3/volumes" Mar 7 01:32:06.589690 systemd-networkd[1617]: calic2862153166: Gained IPv6LL Mar 7 01:32:07.131997 containerd[1718]: time="2026-03-07T01:32:07.130844822Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:07.133994 containerd[1718]: time="2026-03-07T01:32:07.133963300Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 7 01:32:07.137568 containerd[1718]: time="2026-03-07T01:32:07.137300297Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:07.145936 containerd[1718]: time="2026-03-07T01:32:07.145898091Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:07.148260 containerd[1718]: time="2026-03-07T01:32:07.148098489Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 3.560798195s" Mar 7 01:32:07.148260 containerd[1718]: time="2026-03-07T01:32:07.148214809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 01:32:07.152001 containerd[1718]: time="2026-03-07T01:32:07.150216408Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 7 01:32:07.158431 containerd[1718]: time="2026-03-07T01:32:07.158398801Z" level=info msg="CreateContainer within sandbox \"c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 01:32:07.473358 containerd[1718]: time="2026-03-07T01:32:07.473239488Z" level=info msg="CreateContainer within sandbox \"c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e2e97ad89be05b333a295c1d145a5d13d625c4d40481145da037a2fd5c3998e8\"" Mar 7 01:32:07.474964 containerd[1718]: time="2026-03-07T01:32:07.473836487Z" level=info msg="StartContainer for \"e2e97ad89be05b333a295c1d145a5d13d625c4d40481145da037a2fd5c3998e8\"" Mar 7 01:32:07.505775 systemd[1]: Started cri-containerd-e2e97ad89be05b333a295c1d145a5d13d625c4d40481145da037a2fd5c3998e8.scope - libcontainer container e2e97ad89be05b333a295c1d145a5d13d625c4d40481145da037a2fd5c3998e8. 
Mar 7 01:32:07.540371 containerd[1718]: time="2026-03-07T01:32:07.540333918Z" level=info msg="StartContainer for \"e2e97ad89be05b333a295c1d145a5d13d625c4d40481145da037a2fd5c3998e8\" returns successfully" Mar 7 01:32:08.658009 containerd[1718]: time="2026-03-07T01:32:08.657956168Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:08.661171 containerd[1718]: time="2026-03-07T01:32:08.661139606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 7 01:32:08.664425 containerd[1718]: time="2026-03-07T01:32:08.664397843Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:08.669150 containerd[1718]: time="2026-03-07T01:32:08.669114520Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:08.670346 containerd[1718]: time="2026-03-07T01:32:08.670136879Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.519892351s" Mar 7 01:32:08.670346 containerd[1718]: time="2026-03-07T01:32:08.670167959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 7 01:32:08.678161 containerd[1718]: time="2026-03-07T01:32:08.678131593Z" level=info msg="CreateContainer within sandbox 
\"891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 7 01:32:08.714512 containerd[1718]: time="2026-03-07T01:32:08.714469086Z" level=info msg="CreateContainer within sandbox \"891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0d0acb96b11d52f76fee58a22da5317cfb43e20f1203e6e7066f62fbb5b57e21\"" Mar 7 01:32:08.715619 containerd[1718]: time="2026-03-07T01:32:08.715539565Z" level=info msg="StartContainer for \"0d0acb96b11d52f76fee58a22da5317cfb43e20f1203e6e7066f62fbb5b57e21\"" Mar 7 01:32:08.749101 systemd[1]: run-containerd-runc-k8s.io-0d0acb96b11d52f76fee58a22da5317cfb43e20f1203e6e7066f62fbb5b57e21-runc.KAIKFn.mount: Deactivated successfully. Mar 7 01:32:08.758985 systemd[1]: Started cri-containerd-0d0acb96b11d52f76fee58a22da5317cfb43e20f1203e6e7066f62fbb5b57e21.scope - libcontainer container 0d0acb96b11d52f76fee58a22da5317cfb43e20f1203e6e7066f62fbb5b57e21. 
Mar 7 01:32:08.830472 containerd[1718]: time="2026-03-07T01:32:08.830400160Z" level=info msg="StartContainer for \"0d0acb96b11d52f76fee58a22da5317cfb43e20f1203e6e7066f62fbb5b57e21\" returns successfully" Mar 7 01:32:08.833336 containerd[1718]: time="2026-03-07T01:32:08.833302998Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 7 01:32:09.242507 kubelet[3058]: I0307 01:32:09.242473 3058 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:32:10.201715 kubelet[3058]: I0307 01:32:10.201597 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-fb8dfc889-88kxk" podStartSLOduration=22.638080709 podStartE2EDuration="26.201580982s" podCreationTimestamp="2026-03-07 01:31:44 +0000 UTC" firstStartedPulling="2026-03-07 01:32:03.586126175 +0000 UTC m=+38.690287977" lastFinishedPulling="2026-03-07 01:32:07.149626448 +0000 UTC m=+42.253788250" observedRunningTime="2026-03-07 01:32:08.26527898 +0000 UTC m=+43.369440782" watchObservedRunningTime="2026-03-07 01:32:10.201580982 +0000 UTC m=+45.305742744" Mar 7 01:32:10.761180 kubelet[3058]: I0307 01:32:10.761147 3058 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:32:10.876707 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1045756717.mount: Deactivated successfully. 
Mar 7 01:32:10.935116 containerd[1718]: time="2026-03-07T01:32:10.935050837Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:10.937971 containerd[1718]: time="2026-03-07T01:32:10.937812155Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 7 01:32:10.940779 containerd[1718]: time="2026-03-07T01:32:10.940730113Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:10.944993 containerd[1718]: time="2026-03-07T01:32:10.944944790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:10.945819 containerd[1718]: time="2026-03-07T01:32:10.945665749Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 2.112315151s" Mar 7 01:32:10.945819 containerd[1718]: time="2026-03-07T01:32:10.945698829Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 7 01:32:10.954526 containerd[1718]: time="2026-03-07T01:32:10.954402743Z" level=info msg="CreateContainer within sandbox \"891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 7 01:32:10.999434 
containerd[1718]: time="2026-03-07T01:32:10.999290789Z" level=info msg="CreateContainer within sandbox \"891080fee6b70f45ac1b515fde0a7d8df28bbe56d8d4ef434b5d377fa6273935\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"386b89fa076a74dda89fb7b03b630da3114d7776940ba959e9ad142c7cd21f44\"" Mar 7 01:32:11.000371 containerd[1718]: time="2026-03-07T01:32:11.000258189Z" level=info msg="StartContainer for \"386b89fa076a74dda89fb7b03b630da3114d7776940ba959e9ad142c7cd21f44\"" Mar 7 01:32:11.041982 systemd[1]: Started cri-containerd-386b89fa076a74dda89fb7b03b630da3114d7776940ba959e9ad142c7cd21f44.scope - libcontainer container 386b89fa076a74dda89fb7b03b630da3114d7776940ba959e9ad142c7cd21f44. Mar 7 01:32:11.087185 containerd[1718]: time="2026-03-07T01:32:11.086894564Z" level=info msg="StartContainer for \"386b89fa076a74dda89fb7b03b630da3114d7776940ba959e9ad142c7cd21f44\" returns successfully" Mar 7 01:32:11.185617 kernel: calico-node[4938]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 7 01:32:11.270658 kubelet[3058]: I0307 01:32:11.270057 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-c646d54b-d6d47" podStartSLOduration=1.243857335 podStartE2EDuration="7.270041348s" podCreationTimestamp="2026-03-07 01:32:04 +0000 UTC" firstStartedPulling="2026-03-07 01:32:04.920417015 +0000 UTC m=+40.024578817" lastFinishedPulling="2026-03-07 01:32:10.946601028 +0000 UTC m=+46.050762830" observedRunningTime="2026-03-07 01:32:11.268951429 +0000 UTC m=+46.373113231" watchObservedRunningTime="2026-03-07 01:32:11.270041348 +0000 UTC m=+46.374203150" Mar 7 01:32:11.565286 systemd-networkd[1617]: vxlan.calico: Link UP Mar 7 01:32:11.565906 systemd-networkd[1617]: vxlan.calico: Gained carrier Mar 7 01:32:13.372692 systemd-networkd[1617]: vxlan.calico: Gained IPv6LL Mar 7 01:32:14.030271 containerd[1718]: time="2026-03-07T01:32:14.030035245Z" level=info msg="StopPodSandbox for 
\"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\"" Mar 7 01:32:14.030834 containerd[1718]: time="2026-03-07T01:32:14.030675164Z" level=info msg="StopPodSandbox for \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\"" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.091 [INFO][5123] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.092 [INFO][5123] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" iface="eth0" netns="/var/run/netns/cni-0e8274fd-77fc-3875-470e-9d6115b3901e" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.092 [INFO][5123] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" iface="eth0" netns="/var/run/netns/cni-0e8274fd-77fc-3875-470e-9d6115b3901e" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.093 [INFO][5123] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" iface="eth0" netns="/var/run/netns/cni-0e8274fd-77fc-3875-470e-9d6115b3901e" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.093 [INFO][5123] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.093 [INFO][5123] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.131 [INFO][5136] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.132 [INFO][5136] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.132 [INFO][5136] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.141 [WARNING][5136] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.141 [INFO][5136] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.143 [INFO][5136] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:14.148556 containerd[1718]: 2026-03-07 01:32:14.144 [INFO][5123] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:14.149612 containerd[1718]: time="2026-03-07T01:32:14.149581311Z" level=info msg="TearDown network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\" successfully" Mar 7 01:32:14.149691 containerd[1718]: time="2026-03-07T01:32:14.149678511Z" level=info msg="StopPodSandbox for \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\" returns successfully" Mar 7 01:32:14.151221 systemd[1]: run-netns-cni\x2d0e8274fd\x2d77fc\x2d3875\x2d470e\x2d9d6115b3901e.mount: Deactivated successfully. 
Mar 7 01:32:14.158973 containerd[1718]: time="2026-03-07T01:32:14.158921027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-8ttj2,Uid:f1de7e3a-82ca-4e43-88c8-75d72af93053,Namespace:kube-system,Attempt:1,}" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.098 [INFO][5122] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.099 [INFO][5122] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" iface="eth0" netns="/var/run/netns/cni-b7c1fbda-c006-f2a0-6020-cd7ffd248ac9" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.099 [INFO][5122] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" iface="eth0" netns="/var/run/netns/cni-b7c1fbda-c006-f2a0-6020-cd7ffd248ac9" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.099 [INFO][5122] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" iface="eth0" netns="/var/run/netns/cni-b7c1fbda-c006-f2a0-6020-cd7ffd248ac9" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.099 [INFO][5122] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.099 [INFO][5122] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.134 [INFO][5141] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.135 [INFO][5141] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.143 [INFO][5141] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.158 [WARNING][5141] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.158 [INFO][5141] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.159 [INFO][5141] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:14.163173 containerd[1718]: 2026-03-07 01:32:14.161 [INFO][5122] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:14.163618 containerd[1718]: time="2026-03-07T01:32:14.163363905Z" level=info msg="TearDown network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\" successfully" Mar 7 01:32:14.163618 containerd[1718]: time="2026-03-07T01:32:14.163397105Z" level=info msg="StopPodSandbox for \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\" returns successfully" Mar 7 01:32:14.166370 systemd[1]: run-netns-cni\x2db7c1fbda\x2dc006\x2df2a0\x2d6020\x2dcd7ffd248ac9.mount: Deactivated successfully. 
Mar 7 01:32:14.170927 containerd[1718]: time="2026-03-07T01:32:14.170891742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8499d985fb-scddx,Uid:491bd089-80af-49d7-8a6a-aa3fd4f9da71,Namespace:calico-system,Attempt:1,}" Mar 7 01:32:14.340481 systemd-networkd[1617]: calife7a02b6638: Link UP Mar 7 01:32:14.342661 systemd-networkd[1617]: calife7a02b6638: Gained carrier Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.237 [INFO][5151] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0 coredns-7d764666f9- kube-system f1de7e3a-82ca-4e43-88c8-75d72af93053 964 0 2026-03-07 01:31:31 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081.3.6-n-3151c5d0e2 coredns-7d764666f9-8ttj2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calife7a02b6638 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Namespace="kube-system" Pod="coredns-7d764666f9-8ttj2" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.237 [INFO][5151] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Namespace="kube-system" Pod="coredns-7d764666f9-8ttj2" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.279 [INFO][5174] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" 
HandleID="k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.292 [INFO][5174] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" HandleID="k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbe80), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081.3.6-n-3151c5d0e2", "pod":"coredns-7d764666f9-8ttj2", "timestamp":"2026-03-07 01:32:14.279407093 +0000 UTC"}, Hostname:"ci-4081.3.6-n-3151c5d0e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001862c0)} Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.292 [INFO][5174] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.292 [INFO][5174] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.293 [INFO][5174] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-3151c5d0e2' Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.295 [INFO][5174] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.300 [INFO][5174] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.307 [INFO][5174] ipam/ipam.go 526: Trying affinity for 192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.309 [INFO][5174] ipam/ipam.go 160: Attempting to load block cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.312 [INFO][5174] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.313 [INFO][5174] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.15.192/26 handle="k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.314 [INFO][5174] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880 Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.321 [INFO][5174] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.15.192/26 handle="k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.331 [INFO][5174] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.15.195/26] block=192.168.15.192/26 handle="k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.331 [INFO][5174] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.15.195/26] handle="k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.331 [INFO][5174] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:14.367398 containerd[1718]: 2026-03-07 01:32:14.332 [INFO][5174] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.15.195/26] IPv6=[] ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" HandleID="k8s-pod-network.6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.368860 containerd[1718]: 2026-03-07 01:32:14.334 [INFO][5151] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Namespace="kube-system" Pod="coredns-7d764666f9-8ttj2" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"f1de7e3a-82ca-4e43-88c8-75d72af93053", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"", Pod:"coredns-7d764666f9-8ttj2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.15.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calife7a02b6638", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:14.368860 containerd[1718]: 2026-03-07 01:32:14.336 [INFO][5151] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.15.195/32] ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Namespace="kube-system" Pod="coredns-7d764666f9-8ttj2" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.368860 containerd[1718]: 2026-03-07 01:32:14.336 [INFO][5151] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calife7a02b6638 
ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Namespace="kube-system" Pod="coredns-7d764666f9-8ttj2" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.368860 containerd[1718]: 2026-03-07 01:32:14.342 [INFO][5151] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Namespace="kube-system" Pod="coredns-7d764666f9-8ttj2" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.368860 containerd[1718]: 2026-03-07 01:32:14.344 [INFO][5151] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Namespace="kube-system" Pod="coredns-7d764666f9-8ttj2" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"f1de7e3a-82ca-4e43-88c8-75d72af93053", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880", 
Pod:"coredns-7d764666f9-8ttj2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.15.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calife7a02b6638", MAC:"be:cb:39:fc:7e:48", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:14.369231 containerd[1718]: 2026-03-07 01:32:14.363 [INFO][5151] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880" Namespace="kube-system" Pod="coredns-7d764666f9-8ttj2" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:14.390498 containerd[1718]: time="2026-03-07T01:32:14.390211843Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:32:14.390498 containerd[1718]: time="2026-03-07T01:32:14.390288443Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:32:14.390498 containerd[1718]: time="2026-03-07T01:32:14.390304123Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:14.390498 containerd[1718]: time="2026-03-07T01:32:14.390393523Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:14.418720 systemd[1]: Started cri-containerd-6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880.scope - libcontainer container 6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880. Mar 7 01:32:14.463978 systemd-networkd[1617]: cali8fe9ae8149d: Link UP Mar 7 01:32:14.466364 systemd-networkd[1617]: cali8fe9ae8149d: Gained carrier Mar 7 01:32:14.481820 containerd[1718]: time="2026-03-07T01:32:14.481785642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-8ttj2,Uid:f1de7e3a-82ca-4e43-88c8-75d72af93053,Namespace:kube-system,Attempt:1,} returns sandbox id \"6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880\"" Mar 7 01:32:14.494741 containerd[1718]: time="2026-03-07T01:32:14.494471517Z" level=info msg="CreateContainer within sandbox \"6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.267 [INFO][5162] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0 calico-kube-controllers-8499d985fb- calico-system 491bd089-80af-49d7-8a6a-aa3fd4f9da71 965 0 2026-03-07 01:31:46 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8499d985fb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081.3.6-n-3151c5d0e2 calico-kube-controllers-8499d985fb-scddx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8fe9ae8149d [] [] }} ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Namespace="calico-system" Pod="calico-kube-controllers-8499d985fb-scddx" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.267 [INFO][5162] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Namespace="calico-system" Pod="calico-kube-controllers-8499d985fb-scddx" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.308 [INFO][5182] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" HandleID="k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.321 [INFO][5182] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" HandleID="k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f3aa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-3151c5d0e2", "pod":"calico-kube-controllers-8499d985fb-scddx", "timestamp":"2026-03-07 01:32:14.30837556 +0000 UTC"}, 
Hostname:"ci-4081.3.6-n-3151c5d0e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400026cdc0)} Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.321 [INFO][5182] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.331 [INFO][5182] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.331 [INFO][5182] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-3151c5d0e2' Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.396 [INFO][5182] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.411 [INFO][5182] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.422 [INFO][5182] ipam/ipam.go 526: Trying affinity for 192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.424 [INFO][5182] ipam/ipam.go 160: Attempting to load block cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.428 [INFO][5182] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.428 [INFO][5182] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.15.192/26 handle="k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 
containerd[1718]: 2026-03-07 01:32:14.429 [INFO][5182] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.440 [INFO][5182] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.15.192/26 handle="k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.447 [INFO][5182] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.15.196/26] block=192.168.15.192/26 handle="k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.447 [INFO][5182] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.15.196/26] handle="k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.447 [INFO][5182] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 7 01:32:14.496775 containerd[1718]: 2026-03-07 01:32:14.447 [INFO][5182] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.15.196/26] IPv6=[] ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" HandleID="k8s-pod-network.54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.498101 containerd[1718]: 2026-03-07 01:32:14.455 [INFO][5162] cni-plugin/k8s.go 418: Populated endpoint ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Namespace="calico-system" Pod="calico-kube-controllers-8499d985fb-scddx" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0", GenerateName:"calico-kube-controllers-8499d985fb-", Namespace:"calico-system", SelfLink:"", UID:"491bd089-80af-49d7-8a6a-aa3fd4f9da71", ResourceVersion:"965", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8499d985fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"", Pod:"calico-kube-controllers-8499d985fb-scddx", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.15.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8fe9ae8149d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:14.498101 containerd[1718]: 2026-03-07 01:32:14.455 [INFO][5162] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.15.196/32] ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Namespace="calico-system" Pod="calico-kube-controllers-8499d985fb-scddx" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.498101 containerd[1718]: 2026-03-07 01:32:14.455 [INFO][5162] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8fe9ae8149d ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Namespace="calico-system" Pod="calico-kube-controllers-8499d985fb-scddx" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.498101 containerd[1718]: 2026-03-07 01:32:14.466 [INFO][5162] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Namespace="calico-system" Pod="calico-kube-controllers-8499d985fb-scddx" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.498101 containerd[1718]: 2026-03-07 01:32:14.468 [INFO][5162] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Namespace="calico-system" Pod="calico-kube-controllers-8499d985fb-scddx" 
WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0", GenerateName:"calico-kube-controllers-8499d985fb-", Namespace:"calico-system", SelfLink:"", UID:"491bd089-80af-49d7-8a6a-aa3fd4f9da71", ResourceVersion:"965", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8499d985fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba", Pod:"calico-kube-controllers-8499d985fb-scddx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.15.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8fe9ae8149d", MAC:"62:e3:61:36:f7:74", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:14.498101 containerd[1718]: 2026-03-07 01:32:14.490 [INFO][5162] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba" Namespace="calico-system" 
Pod="calico-kube-controllers-8499d985fb-scddx" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:14.530916 containerd[1718]: time="2026-03-07T01:32:14.530835380Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:32:14.531099 containerd[1718]: time="2026-03-07T01:32:14.531001500Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:32:14.531331 containerd[1718]: time="2026-03-07T01:32:14.531159100Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:14.531331 containerd[1718]: time="2026-03-07T01:32:14.531265900Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:14.536657 containerd[1718]: time="2026-03-07T01:32:14.536614258Z" level=info msg="CreateContainer within sandbox \"6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e73de8121335f9dc52b5854d8a791a166282cbd98276fb0b40a1798ec8e4ccd2\"" Mar 7 01:32:14.537919 containerd[1718]: time="2026-03-07T01:32:14.537710857Z" level=info msg="StartContainer for \"e73de8121335f9dc52b5854d8a791a166282cbd98276fb0b40a1798ec8e4ccd2\"" Mar 7 01:32:14.552872 systemd[1]: Started cri-containerd-54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba.scope - libcontainer container 54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba. Mar 7 01:32:14.569722 systemd[1]: Started cri-containerd-e73de8121335f9dc52b5854d8a791a166282cbd98276fb0b40a1798ec8e4ccd2.scope - libcontainer container e73de8121335f9dc52b5854d8a791a166282cbd98276fb0b40a1798ec8e4ccd2. 
Mar 7 01:32:14.602587 containerd[1718]: time="2026-03-07T01:32:14.602387548Z" level=info msg="StartContainer for \"e73de8121335f9dc52b5854d8a791a166282cbd98276fb0b40a1798ec8e4ccd2\" returns successfully" Mar 7 01:32:14.617123 containerd[1718]: time="2026-03-07T01:32:14.617084702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8499d985fb-scddx,Uid:491bd089-80af-49d7-8a6a-aa3fd4f9da71,Namespace:calico-system,Attempt:1,} returns sandbox id \"54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba\"" Mar 7 01:32:14.619805 containerd[1718]: time="2026-03-07T01:32:14.619740860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 7 01:32:15.293105 kubelet[3058]: I0307 01:32:15.293037 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-8ttj2" podStartSLOduration=44.293019359 podStartE2EDuration="44.293019359s" podCreationTimestamp="2026-03-07 01:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:32:15.292115039 +0000 UTC m=+50.396276841" watchObservedRunningTime="2026-03-07 01:32:15.293019359 +0000 UTC m=+50.397181161" Mar 7 01:32:16.030194 containerd[1718]: time="2026-03-07T01:32:16.029855189Z" level=info msg="StopPodSandbox for \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\"" Mar 7 01:32:16.030194 containerd[1718]: time="2026-03-07T01:32:16.029855189Z" level=info msg="StopPodSandbox for \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\"" Mar 7 01:32:16.125579 systemd-networkd[1617]: calife7a02b6638: Gained IPv6LL Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.110 [INFO][5381] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.110 [INFO][5381] 
cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" iface="eth0" netns="/var/run/netns/cni-6c3f2529-1712-d7fc-710f-64c65259d425" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.110 [INFO][5381] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" iface="eth0" netns="/var/run/netns/cni-6c3f2529-1712-d7fc-710f-64c65259d425" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.111 [INFO][5381] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" iface="eth0" netns="/var/run/netns/cni-6c3f2529-1712-d7fc-710f-64c65259d425" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.111 [INFO][5381] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.111 [INFO][5381] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.147 [INFO][5396] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.147 [INFO][5396] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.147 [INFO][5396] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.165 [WARNING][5396] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.165 [INFO][5396] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.167 [INFO][5396] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:16.171523 containerd[1718]: 2026-03-07 01:32:16.170 [INFO][5381] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:16.172044 containerd[1718]: time="2026-03-07T01:32:16.171737845Z" level=info msg="TearDown network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\" successfully" Mar 7 01:32:16.172044 containerd[1718]: time="2026-03-07T01:32:16.171765485Z" level=info msg="StopPodSandbox for \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\" returns successfully" Mar 7 01:32:16.175353 systemd[1]: run-netns-cni\x2d6c3f2529\x2d1712\x2dd7fc\x2d710f\x2d64c65259d425.mount: Deactivated successfully. 
Mar 7 01:32:16.181640 containerd[1718]: time="2026-03-07T01:32:16.181150801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb8dfc889-wk8q8,Uid:4baa109f-be88-432a-8b8a-fa833dcc95d3,Namespace:calico-system,Attempt:1,}" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.104 [INFO][5373] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.104 [INFO][5373] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" iface="eth0" netns="/var/run/netns/cni-53d63b54-db82-6b9d-0309-7adf86d0e6c7" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.105 [INFO][5373] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" iface="eth0" netns="/var/run/netns/cni-53d63b54-db82-6b9d-0309-7adf86d0e6c7" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.105 [INFO][5373] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" iface="eth0" netns="/var/run/netns/cni-53d63b54-db82-6b9d-0309-7adf86d0e6c7" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.105 [INFO][5373] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.105 [INFO][5373] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.153 [INFO][5391] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.153 [INFO][5391] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.167 [INFO][5391] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.183 [WARNING][5391] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.185 [INFO][5391] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.187 [INFO][5391] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:16.191670 containerd[1718]: 2026-03-07 01:32:16.189 [INFO][5373] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:16.193393 containerd[1718]: time="2026-03-07T01:32:16.192211436Z" level=info msg="TearDown network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\" successfully" Mar 7 01:32:16.193393 containerd[1718]: time="2026-03-07T01:32:16.192241156Z" level=info msg="StopPodSandbox for \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\" returns successfully" Mar 7 01:32:16.195058 systemd[1]: run-netns-cni\x2d53d63b54\x2ddb82\x2d6b9d\x2d0309\x2d7adf86d0e6c7.mount: Deactivated successfully. 
Mar 7 01:32:16.201936 containerd[1718]: time="2026-03-07T01:32:16.201624232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-nnkbr,Uid:aa61cab2-0011-4c46-b66c-4d054b7336c6,Namespace:calico-system,Attempt:1,}" Mar 7 01:32:16.254668 systemd-networkd[1617]: cali8fe9ae8149d: Gained IPv6LL Mar 7 01:32:16.439670 systemd-networkd[1617]: calic7da0d099da: Link UP Mar 7 01:32:16.440844 systemd-networkd[1617]: calic7da0d099da: Gained carrier Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.316 [INFO][5410] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0 calico-apiserver-fb8dfc889- calico-system 4baa109f-be88-432a-8b8a-fa833dcc95d3 994 0 2026-03-07 01:31:44 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:fb8dfc889 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081.3.6-n-3151c5d0e2 calico-apiserver-fb8dfc889-wk8q8 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calic7da0d099da [] [] }} ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-wk8q8" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.317 [INFO][5410] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-wk8q8" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.365 [INFO][5436] ipam/ipam_plugin.go 235: Calico CNI IPAM request count 
IPv4=1 IPv6=0 ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" HandleID="k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.380 [INFO][5436] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" HandleID="k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbaa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-3151c5d0e2", "pod":"calico-apiserver-fb8dfc889-wk8q8", "timestamp":"2026-03-07 01:32:16.365399878 +0000 UTC"}, Hostname:"ci-4081.3.6-n-3151c5d0e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000266dc0)} Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.380 [INFO][5436] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.380 [INFO][5436] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.380 [INFO][5436] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-3151c5d0e2' Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.390 [INFO][5436] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.395 [INFO][5436] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.402 [INFO][5436] ipam/ipam.go 526: Trying affinity for 192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.404 [INFO][5436] ipam/ipam.go 160: Attempting to load block cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.407 [INFO][5436] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.408 [INFO][5436] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.15.192/26 handle="k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.410 [INFO][5436] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0 Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.415 [INFO][5436] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.15.192/26 handle="k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.425 [INFO][5436] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.15.197/26] block=192.168.15.192/26 handle="k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.425 [INFO][5436] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.15.197/26] handle="k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.425 [INFO][5436] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:16.465133 containerd[1718]: 2026-03-07 01:32:16.425 [INFO][5436] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.15.197/26] IPv6=[] ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" HandleID="k8s-pod-network.a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.466462 containerd[1718]: 2026-03-07 01:32:16.430 [INFO][5410] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-wk8q8" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0", GenerateName:"calico-apiserver-fb8dfc889-", Namespace:"calico-system", SelfLink:"", UID:"4baa109f-be88-432a-8b8a-fa833dcc95d3", ResourceVersion:"994", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"fb8dfc889", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"", Pod:"calico-apiserver-fb8dfc889-wk8q8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.15.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calic7da0d099da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:16.466462 containerd[1718]: 2026-03-07 01:32:16.430 [INFO][5410] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.15.197/32] ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-wk8q8" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.466462 containerd[1718]: 2026-03-07 01:32:16.430 [INFO][5410] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic7da0d099da ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-wk8q8" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.466462 containerd[1718]: 2026-03-07 01:32:16.441 [INFO][5410] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-wk8q8" 
WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.466462 containerd[1718]: 2026-03-07 01:32:16.442 [INFO][5410] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-wk8q8" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0", GenerateName:"calico-apiserver-fb8dfc889-", Namespace:"calico-system", SelfLink:"", UID:"4baa109f-be88-432a-8b8a-fa833dcc95d3", ResourceVersion:"994", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb8dfc889", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0", Pod:"calico-apiserver-fb8dfc889-wk8q8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.15.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calic7da0d099da", MAC:"e2:ea:63:65:9f:5d", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:16.466462 containerd[1718]: 2026-03-07 01:32:16.460 [INFO][5410] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0" Namespace="calico-system" Pod="calico-apiserver-fb8dfc889-wk8q8" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:16.505759 containerd[1718]: time="2026-03-07T01:32:16.504621296Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:32:16.505759 containerd[1718]: time="2026-03-07T01:32:16.504694336Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:32:16.505759 containerd[1718]: time="2026-03-07T01:32:16.504705856Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:16.505759 containerd[1718]: time="2026-03-07T01:32:16.504791616Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:16.534733 systemd[1]: Started cri-containerd-a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0.scope - libcontainer container a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0. 
Mar 7 01:32:16.547012 systemd-networkd[1617]: calia0a30d35641: Link UP Mar 7 01:32:16.548885 systemd-networkd[1617]: calia0a30d35641: Gained carrier Mar 7 01:32:16.598182 containerd[1718]: time="2026-03-07T01:32:16.598139894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb8dfc889-wk8q8,Uid:4baa109f-be88-432a-8b8a-fa833dcc95d3,Namespace:calico-system,Attempt:1,} returns sandbox id \"a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0\"" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.340 [INFO][5420] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0 goldmane-9f7667bb8- calico-system aa61cab2-0011-4c46-b66c-4d054b7336c6 993 0 2026-03-07 01:31:44 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081.3.6-n-3151c5d0e2 goldmane-9f7667bb8-nnkbr eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia0a30d35641 [] [] }} ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Namespace="calico-system" Pod="goldmane-9f7667bb8-nnkbr" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.340 [INFO][5420] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Namespace="calico-system" Pod="goldmane-9f7667bb8-nnkbr" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.396 [INFO][5443] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" 
HandleID="k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.409 [INFO][5443] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" HandleID="k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbc70), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-3151c5d0e2", "pod":"goldmane-9f7667bb8-nnkbr", "timestamp":"2026-03-07 01:32:16.396527864 +0000 UTC"}, Hostname:"ci-4081.3.6-n-3151c5d0e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002db1e0)} Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.409 [INFO][5443] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.428 [INFO][5443] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.428 [INFO][5443] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-3151c5d0e2' Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.489 [INFO][5443] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.499 [INFO][5443] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.508 [INFO][5443] ipam/ipam.go 526: Trying affinity for 192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.511 [INFO][5443] ipam/ipam.go 160: Attempting to load block cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.513 [INFO][5443] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.514 [INFO][5443] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.15.192/26 handle="k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.515 [INFO][5443] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8 Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.521 [INFO][5443] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.15.192/26 handle="k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.534 [INFO][5443] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.15.198/26] block=192.168.15.192/26 handle="k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.534 [INFO][5443] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.15.198/26] handle="k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.534 [INFO][5443] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:16.608111 containerd[1718]: 2026-03-07 01:32:16.535 [INFO][5443] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.15.198/26] IPv6=[] ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" HandleID="k8s-pod-network.c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.610022 containerd[1718]: 2026-03-07 01:32:16.539 [INFO][5420] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Namespace="calico-system" Pod="goldmane-9f7667bb8-nnkbr" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"aa61cab2-0011-4c46-b66c-4d054b7336c6", ResourceVersion:"993", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"", Pod:"goldmane-9f7667bb8-nnkbr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.15.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia0a30d35641", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:16.610022 containerd[1718]: 2026-03-07 01:32:16.539 [INFO][5420] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.15.198/32] ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Namespace="calico-system" Pod="goldmane-9f7667bb8-nnkbr" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.610022 containerd[1718]: 2026-03-07 01:32:16.539 [INFO][5420] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia0a30d35641 ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Namespace="calico-system" Pod="goldmane-9f7667bb8-nnkbr" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.610022 containerd[1718]: 2026-03-07 01:32:16.554 [INFO][5420] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Namespace="calico-system" Pod="goldmane-9f7667bb8-nnkbr" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.610022 containerd[1718]: 2026-03-07 01:32:16.554 [INFO][5420] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" Namespace="calico-system" Pod="goldmane-9f7667bb8-nnkbr" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"aa61cab2-0011-4c46-b66c-4d054b7336c6", ResourceVersion:"993", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8", Pod:"goldmane-9f7667bb8-nnkbr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.15.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia0a30d35641", MAC:"32:ef:a6:1d:d9:83", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:16.610022 containerd[1718]: 2026-03-07 01:32:16.601 [INFO][5420] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8" 
Namespace="calico-system" Pod="goldmane-9f7667bb8-nnkbr" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:16.614145 containerd[1718]: time="2026-03-07T01:32:16.614045527Z" level=info msg="CreateContainer within sandbox \"a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 01:32:16.648367 containerd[1718]: time="2026-03-07T01:32:16.648272471Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:32:16.648670 containerd[1718]: time="2026-03-07T01:32:16.648575831Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:32:16.648670 containerd[1718]: time="2026-03-07T01:32:16.648598191Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:16.648860 containerd[1718]: time="2026-03-07T01:32:16.648810431Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:16.673906 containerd[1718]: time="2026-03-07T01:32:16.673317260Z" level=info msg="CreateContainer within sandbox \"a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"cd10e0dc66097c34cd373bdcee8b7130b486dd77b14c32f890bfd33812006d5c\"" Mar 7 01:32:16.677618 containerd[1718]: time="2026-03-07T01:32:16.676737899Z" level=info msg="StartContainer for \"cd10e0dc66097c34cd373bdcee8b7130b486dd77b14c32f890bfd33812006d5c\"" Mar 7 01:32:16.682295 systemd[1]: Started cri-containerd-c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8.scope - libcontainer container c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8. 
Mar 7 01:32:16.715806 systemd[1]: Started cri-containerd-cd10e0dc66097c34cd373bdcee8b7130b486dd77b14c32f890bfd33812006d5c.scope - libcontainer container cd10e0dc66097c34cd373bdcee8b7130b486dd77b14c32f890bfd33812006d5c. Mar 7 01:32:16.745704 containerd[1718]: time="2026-03-07T01:32:16.745667948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-nnkbr,Uid:aa61cab2-0011-4c46-b66c-4d054b7336c6,Namespace:calico-system,Attempt:1,} returns sandbox id \"c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8\"" Mar 7 01:32:16.778322 containerd[1718]: time="2026-03-07T01:32:16.778284013Z" level=info msg="StartContainer for \"cd10e0dc66097c34cd373bdcee8b7130b486dd77b14c32f890bfd33812006d5c\" returns successfully" Mar 7 01:32:17.076686 containerd[1718]: time="2026-03-07T01:32:17.035361778Z" level=info msg="StopPodSandbox for \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\"" Mar 7 01:32:17.076686 containerd[1718]: time="2026-03-07T01:32:17.036696417Z" level=info msg="StopPodSandbox for \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\"" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.124 [INFO][5638] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.124 [INFO][5638] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" iface="eth0" netns="/var/run/netns/cni-dcfb0581-aa56-fdaf-0b5b-23bf65e7fcca" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.124 [INFO][5638] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" iface="eth0" netns="/var/run/netns/cni-dcfb0581-aa56-fdaf-0b5b-23bf65e7fcca" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.125 [INFO][5638] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" iface="eth0" netns="/var/run/netns/cni-dcfb0581-aa56-fdaf-0b5b-23bf65e7fcca" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.125 [INFO][5638] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.125 [INFO][5638] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.190 [INFO][5655] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.190 [INFO][5655] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.190 [INFO][5655] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.206 [WARNING][5655] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.206 [INFO][5655] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.208 [INFO][5655] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:17.213232 containerd[1718]: 2026-03-07 01:32:17.210 [INFO][5638] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:17.214876 containerd[1718]: time="2026-03-07T01:32:17.214842457Z" level=info msg="TearDown network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\" successfully" Mar 7 01:32:17.214945 containerd[1718]: time="2026-03-07T01:32:17.214932777Z" level=info msg="StopPodSandbox for \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\" returns successfully" Mar 7 01:32:17.220887 systemd[1]: run-netns-cni\x2ddcfb0581\x2daa56\x2dfdaf\x2d0b5b\x2d23bf65e7fcca.mount: Deactivated successfully. 
Mar 7 01:32:17.221793 containerd[1718]: time="2026-03-07T01:32:17.221766254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-d5j7z,Uid:959c57a8-5ea2-429c-aa5b-3c7b113e7280,Namespace:calico-system,Attempt:1,}" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.157 [INFO][5639] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.159 [INFO][5639] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" iface="eth0" netns="/var/run/netns/cni-ac2b9abc-2c92-e1cc-d311-3839085d680e" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.160 [INFO][5639] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" iface="eth0" netns="/var/run/netns/cni-ac2b9abc-2c92-e1cc-d311-3839085d680e" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.160 [INFO][5639] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" iface="eth0" netns="/var/run/netns/cni-ac2b9abc-2c92-e1cc-d311-3839085d680e" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.160 [INFO][5639] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.160 [INFO][5639] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.238 [INFO][5661] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.238 [INFO][5661] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.238 [INFO][5661] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.250 [WARNING][5661] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.251 [INFO][5661] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.253 [INFO][5661] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:17.259089 containerd[1718]: 2026-03-07 01:32:17.257 [INFO][5639] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:17.259630 containerd[1718]: time="2026-03-07T01:32:17.259592157Z" level=info msg="TearDown network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\" successfully" Mar 7 01:32:17.259667 containerd[1718]: time="2026-03-07T01:32:17.259629877Z" level=info msg="StopPodSandbox for \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\" returns successfully" Mar 7 01:32:17.264577 containerd[1718]: time="2026-03-07T01:32:17.264295235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-vbqm5,Uid:4054fdd0-0f08-4bdd-ad33-94fb06421936,Namespace:kube-system,Attempt:1,}" Mar 7 01:32:17.265338 systemd[1]: run-netns-cni\x2dac2b9abc\x2d2c92\x2de1cc\x2dd311\x2d3839085d680e.mount: Deactivated successfully. 
Mar 7 01:32:17.330771 kubelet[3058]: I0307 01:32:17.327676 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-fb8dfc889-wk8q8" podStartSLOduration=33.327662127 podStartE2EDuration="33.327662127s" podCreationTimestamp="2026-03-07 01:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:32:17.326887287 +0000 UTC m=+52.431049089" watchObservedRunningTime="2026-03-07 01:32:17.327662127 +0000 UTC m=+52.431823889" Mar 7 01:32:17.534793 systemd-networkd[1617]: cali077028cab56: Link UP Mar 7 01:32:17.534967 systemd-networkd[1617]: cali077028cab56: Gained carrier Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.347 [INFO][5668] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0 csi-node-driver- calico-system 959c57a8-5ea2-429c-aa5b-3c7b113e7280 1009 0 2026-03-07 01:31:46 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081.3.6-n-3151c5d0e2 csi-node-driver-d5j7z eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali077028cab56 [] [] }} ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Namespace="calico-system" Pod="csi-node-driver-d5j7z" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.348 [INFO][5668] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Namespace="calico-system" Pod="csi-node-driver-d5j7z" 
WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.434 [INFO][5689] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" HandleID="k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.446 [INFO][5689] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" HandleID="k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000380140), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-3151c5d0e2", "pod":"csi-node-driver-d5j7z", "timestamp":"2026-03-07 01:32:17.434878919 +0000 UTC"}, Hostname:"ci-4081.3.6-n-3151c5d0e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000244000)} Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.446 [INFO][5689] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.446 [INFO][5689] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.446 [INFO][5689] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-3151c5d0e2' Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.450 [INFO][5689] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.456 [INFO][5689] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.462 [INFO][5689] ipam/ipam.go 526: Trying affinity for 192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.465 [INFO][5689] ipam/ipam.go 160: Attempting to load block cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.473 [INFO][5689] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.473 [INFO][5689] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.15.192/26 handle="k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.478 [INFO][5689] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.486 [INFO][5689] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.15.192/26 handle="k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.524 [INFO][5689] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.15.199/26] block=192.168.15.192/26 handle="k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.524 [INFO][5689] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.15.199/26] handle="k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.524 [INFO][5689] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:17.590069 containerd[1718]: 2026-03-07 01:32:17.524 [INFO][5689] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.15.199/26] IPv6=[] ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" HandleID="k8s-pod-network.b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.590894 containerd[1718]: 2026-03-07 01:32:17.530 [INFO][5668] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Namespace="calico-system" Pod="csi-node-driver-d5j7z" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"959c57a8-5ea2-429c-aa5b-3c7b113e7280", ResourceVersion:"1009", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"", Pod:"csi-node-driver-d5j7z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.15.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali077028cab56", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:17.590894 containerd[1718]: 2026-03-07 01:32:17.530 [INFO][5668] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.15.199/32] ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Namespace="calico-system" Pod="csi-node-driver-d5j7z" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.590894 containerd[1718]: 2026-03-07 01:32:17.530 [INFO][5668] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali077028cab56 ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Namespace="calico-system" Pod="csi-node-driver-d5j7z" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.590894 containerd[1718]: 2026-03-07 01:32:17.534 [INFO][5668] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Namespace="calico-system" Pod="csi-node-driver-d5j7z" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.590894 containerd[1718]: 2026-03-07 01:32:17.539 
[INFO][5668] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Namespace="calico-system" Pod="csi-node-driver-d5j7z" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"959c57a8-5ea2-429c-aa5b-3c7b113e7280", ResourceVersion:"1009", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd", Pod:"csi-node-driver-d5j7z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.15.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali077028cab56", MAC:"de:af:ed:e1:09:87", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:17.590894 containerd[1718]: 2026-03-07 01:32:17.584 [INFO][5668] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd" Namespace="calico-system" Pod="csi-node-driver-d5j7z" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:17.698952 systemd-networkd[1617]: cali5ee085cfb42: Link UP Mar 7 01:32:17.699595 systemd-networkd[1617]: cali5ee085cfb42: Gained carrier Mar 7 01:32:17.724774 systemd-networkd[1617]: calia0a30d35641: Gained IPv6LL Mar 7 01:32:17.745259 containerd[1718]: time="2026-03-07T01:32:17.742587901Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:32:17.745259 containerd[1718]: time="2026-03-07T01:32:17.742655101Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:32:17.745259 containerd[1718]: time="2026-03-07T01:32:17.742675461Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:17.745259 containerd[1718]: time="2026-03-07T01:32:17.742787101Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.431 [INFO][5679] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0 coredns-7d764666f9- kube-system 4054fdd0-0f08-4bdd-ad33-94fb06421936 1010 0 2026-03-07 01:31:31 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081.3.6-n-3151c5d0e2 coredns-7d764666f9-vbqm5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5ee085cfb42 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Namespace="kube-system" Pod="coredns-7d764666f9-vbqm5" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.431 [INFO][5679] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Namespace="kube-system" Pod="coredns-7d764666f9-vbqm5" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.481 [INFO][5700] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" HandleID="k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.497 [INFO][5700] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" HandleID="k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbf40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081.3.6-n-3151c5d0e2", "pod":"coredns-7d764666f9-vbqm5", "timestamp":"2026-03-07 01:32:17.481375738 +0000 UTC"}, Hostname:"ci-4081.3.6-n-3151c5d0e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003c71e0)} Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.497 [INFO][5700] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.524 [INFO][5700] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.524 [INFO][5700] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-3151c5d0e2' Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.549 [INFO][5700] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.576 [INFO][5700] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.602 [INFO][5700] ipam/ipam.go 526: Trying affinity for 192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.621 [INFO][5700] ipam/ipam.go 160: Attempting to load block cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.643 [INFO][5700] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.15.192/26 host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.645 [INFO][5700] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.15.192/26 handle="k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.656 [INFO][5700] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.669 [INFO][5700] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.15.192/26 handle="k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.683 [INFO][5700] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.15.200/26] block=192.168.15.192/26 handle="k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.684 [INFO][5700] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.15.200/26] handle="k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" host="ci-4081.3.6-n-3151c5d0e2" Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.684 [INFO][5700] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:17.746412 containerd[1718]: 2026-03-07 01:32:17.684 [INFO][5700] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.15.200/26] IPv6=[] ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" HandleID="k8s-pod-network.e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.746957 containerd[1718]: 2026-03-07 01:32:17.687 [INFO][5679] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Namespace="kube-system" Pod="coredns-7d764666f9-vbqm5" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"4054fdd0-0f08-4bdd-ad33-94fb06421936", ResourceVersion:"1010", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"", Pod:"coredns-7d764666f9-vbqm5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.15.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5ee085cfb42", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:17.746957 containerd[1718]: 2026-03-07 01:32:17.688 [INFO][5679] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.15.200/32] ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Namespace="kube-system" Pod="coredns-7d764666f9-vbqm5" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.746957 containerd[1718]: 2026-03-07 01:32:17.688 [INFO][5679] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ee085cfb42 
ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Namespace="kube-system" Pod="coredns-7d764666f9-vbqm5" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.746957 containerd[1718]: 2026-03-07 01:32:17.701 [INFO][5679] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Namespace="kube-system" Pod="coredns-7d764666f9-vbqm5" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.746957 containerd[1718]: 2026-03-07 01:32:17.702 [INFO][5679] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Namespace="kube-system" Pod="coredns-7d764666f9-vbqm5" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"4054fdd0-0f08-4bdd-ad33-94fb06421936", ResourceVersion:"1010", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de", 
Pod:"coredns-7d764666f9-vbqm5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.15.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5ee085cfb42", MAC:"4a:26:1c:6a:dc:2f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:17.747124 containerd[1718]: 2026-03-07 01:32:17.736 [INFO][5679] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de" Namespace="kube-system" Pod="coredns-7d764666f9-vbqm5" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:17.796712 systemd[1]: Started cri-containerd-b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd.scope - libcontainer container b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd. Mar 7 01:32:17.808303 containerd[1718]: time="2026-03-07T01:32:17.807418832Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:32:17.808303 containerd[1718]: time="2026-03-07T01:32:17.807482872Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:32:17.808303 containerd[1718]: time="2026-03-07T01:32:17.807498592Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:17.808303 containerd[1718]: time="2026-03-07T01:32:17.807620552Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:32:17.845731 systemd[1]: Started cri-containerd-e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de.scope - libcontainer container e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de. Mar 7 01:32:17.889562 containerd[1718]: time="2026-03-07T01:32:17.889121795Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-d5j7z,Uid:959c57a8-5ea2-429c-aa5b-3c7b113e7280,Namespace:calico-system,Attempt:1,} returns sandbox id \"b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd\"" Mar 7 01:32:17.917880 containerd[1718]: time="2026-03-07T01:32:17.917842942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-vbqm5,Uid:4054fdd0-0f08-4bdd-ad33-94fb06421936,Namespace:kube-system,Attempt:1,} returns sandbox id \"e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de\"" Mar 7 01:32:17.928394 containerd[1718]: time="2026-03-07T01:32:17.928359778Z" level=info msg="CreateContainer within sandbox \"e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 01:32:17.966231 containerd[1718]: time="2026-03-07T01:32:17.965958961Z" level=info msg="CreateContainer within sandbox \"e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de\" for 
&ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"885e4c79a108e3696a93401634bc83bc92bd8bd87e3e248273f520d723d03fd9\"" Mar 7 01:32:17.967165 containerd[1718]: time="2026-03-07T01:32:17.967139600Z" level=info msg="StartContainer for \"885e4c79a108e3696a93401634bc83bc92bd8bd87e3e248273f520d723d03fd9\"" Mar 7 01:32:18.011716 systemd[1]: Started cri-containerd-885e4c79a108e3696a93401634bc83bc92bd8bd87e3e248273f520d723d03fd9.scope - libcontainer container 885e4c79a108e3696a93401634bc83bc92bd8bd87e3e248273f520d723d03fd9. Mar 7 01:32:18.093344 containerd[1718]: time="2026-03-07T01:32:18.093298064Z" level=info msg="StartContainer for \"885e4c79a108e3696a93401634bc83bc92bd8bd87e3e248273f520d723d03fd9\" returns successfully" Mar 7 01:32:18.109008 systemd-networkd[1617]: calic7da0d099da: Gained IPv6LL Mar 7 01:32:18.344873 kubelet[3058]: I0307 01:32:18.343965 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-vbqm5" podStartSLOduration=47.343949766 podStartE2EDuration="47.343949766s" podCreationTimestamp="2026-03-07 01:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:32:18.343787127 +0000 UTC m=+53.447948929" watchObservedRunningTime="2026-03-07 01:32:18.343949766 +0000 UTC m=+53.448111528" Mar 7 01:32:18.568619 containerd[1718]: time="2026-03-07T01:32:18.568571439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:18.573136 containerd[1718]: time="2026-03-07T01:32:18.573077635Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 7 01:32:18.576183 containerd[1718]: time="2026-03-07T01:32:18.576118073Z" level=info msg="ImageCreate event 
name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:18.581315 containerd[1718]: time="2026-03-07T01:32:18.581113629Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:18.582470 containerd[1718]: time="2026-03-07T01:32:18.581818749Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 3.962039449s" Mar 7 01:32:18.582470 containerd[1718]: time="2026-03-07T01:32:18.581853789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 7 01:32:18.583904 containerd[1718]: time="2026-03-07T01:32:18.583878027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 7 01:32:18.609867 containerd[1718]: time="2026-03-07T01:32:18.609743568Z" level=info msg="CreateContainer within sandbox \"54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 7 01:32:18.646921 containerd[1718]: time="2026-03-07T01:32:18.646873820Z" level=info msg="CreateContainer within sandbox \"54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"deade8009415fa5e9f8d27c496f345e6e77985844543f2ee5245af5fcb2dc995\"" Mar 7 01:32:18.648611 containerd[1718]: 
time="2026-03-07T01:32:18.648528899Z" level=info msg="StartContainer for \"deade8009415fa5e9f8d27c496f345e6e77985844543f2ee5245af5fcb2dc995\"" Mar 7 01:32:18.681739 systemd[1]: Started cri-containerd-deade8009415fa5e9f8d27c496f345e6e77985844543f2ee5245af5fcb2dc995.scope - libcontainer container deade8009415fa5e9f8d27c496f345e6e77985844543f2ee5245af5fcb2dc995. Mar 7 01:32:18.725849 containerd[1718]: time="2026-03-07T01:32:18.724943442Z" level=info msg="StartContainer for \"deade8009415fa5e9f8d27c496f345e6e77985844543f2ee5245af5fcb2dc995\" returns successfully" Mar 7 01:32:18.940712 systemd-networkd[1617]: cali077028cab56: Gained IPv6LL Mar 7 01:32:19.350629 kubelet[3058]: I0307 01:32:19.350388 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8499d985fb-scddx" podStartSLOduration=29.386082968 podStartE2EDuration="33.350370854s" podCreationTimestamp="2026-03-07 01:31:46 +0000 UTC" firstStartedPulling="2026-03-07 01:32:14.619463181 +0000 UTC m=+49.723624983" lastFinishedPulling="2026-03-07 01:32:18.583751067 +0000 UTC m=+53.687912869" observedRunningTime="2026-03-07 01:32:19.346852537 +0000 UTC m=+54.451014339" watchObservedRunningTime="2026-03-07 01:32:19.350370854 +0000 UTC m=+54.454532656" Mar 7 01:32:19.516745 systemd-networkd[1617]: cali5ee085cfb42: Gained IPv6LL Mar 7 01:32:20.608758 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3575973498.mount: Deactivated successfully. 
Mar 7 01:32:20.952014 containerd[1718]: time="2026-03-07T01:32:20.951187018Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:20.953780 containerd[1718]: time="2026-03-07T01:32:20.953743177Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 7 01:32:20.956901 containerd[1718]: time="2026-03-07T01:32:20.956855694Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:20.961676 containerd[1718]: time="2026-03-07T01:32:20.961624651Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:20.962817 containerd[1718]: time="2026-03-07T01:32:20.962644210Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 2.378724183s" Mar 7 01:32:20.962817 containerd[1718]: time="2026-03-07T01:32:20.962675970Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 7 01:32:20.965008 containerd[1718]: time="2026-03-07T01:32:20.964917328Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 7 01:32:20.972239 containerd[1718]: time="2026-03-07T01:32:20.972201723Z" level=info msg="CreateContainer within sandbox \"c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8\" for 
container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 7 01:32:21.005412 containerd[1718]: time="2026-03-07T01:32:21.005372018Z" level=info msg="CreateContainer within sandbox \"c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"c7e732542d8d679dea8529f9dfdbb9943538a7553723968ce8f7b6597c315475\"" Mar 7 01:32:21.008593 containerd[1718]: time="2026-03-07T01:32:21.006524977Z" level=info msg="StartContainer for \"c7e732542d8d679dea8529f9dfdbb9943538a7553723968ce8f7b6597c315475\"" Mar 7 01:32:21.050762 systemd[1]: Started cri-containerd-c7e732542d8d679dea8529f9dfdbb9943538a7553723968ce8f7b6597c315475.scope - libcontainer container c7e732542d8d679dea8529f9dfdbb9943538a7553723968ce8f7b6597c315475. Mar 7 01:32:21.091741 containerd[1718]: time="2026-03-07T01:32:21.091655434Z" level=info msg="StartContainer for \"c7e732542d8d679dea8529f9dfdbb9943538a7553723968ce8f7b6597c315475\" returns successfully" Mar 7 01:32:22.128078 containerd[1718]: time="2026-03-07T01:32:22.128030349Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:22.131926 containerd[1718]: time="2026-03-07T01:32:22.131900427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 7 01:32:22.135196 containerd[1718]: time="2026-03-07T01:32:22.134901226Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:22.140314 containerd[1718]: time="2026-03-07T01:32:22.140272424Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:22.141278 containerd[1718]: 
time="2026-03-07T01:32:22.141248023Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.176301055s" Mar 7 01:32:22.141278 containerd[1718]: time="2026-03-07T01:32:22.141276383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 7 01:32:22.158117 containerd[1718]: time="2026-03-07T01:32:22.157979897Z" level=info msg="CreateContainer within sandbox \"b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 7 01:32:22.187296 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3661903177.mount: Deactivated successfully. Mar 7 01:32:22.202993 containerd[1718]: time="2026-03-07T01:32:22.202953399Z" level=info msg="CreateContainer within sandbox \"b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"f2d0dc568d2ea07d175080fe58b6d10cd54e371fc3b30b29b55750db10824376\"" Mar 7 01:32:22.203495 containerd[1718]: time="2026-03-07T01:32:22.203415759Z" level=info msg="StartContainer for \"f2d0dc568d2ea07d175080fe58b6d10cd54e371fc3b30b29b55750db10824376\"" Mar 7 01:32:22.237707 systemd[1]: Started cri-containerd-f2d0dc568d2ea07d175080fe58b6d10cd54e371fc3b30b29b55750db10824376.scope - libcontainer container f2d0dc568d2ea07d175080fe58b6d10cd54e371fc3b30b29b55750db10824376. 
Mar 7 01:32:22.269209 containerd[1718]: time="2026-03-07T01:32:22.269168292Z" level=info msg="StartContainer for \"f2d0dc568d2ea07d175080fe58b6d10cd54e371fc3b30b29b55750db10824376\" returns successfully" Mar 7 01:32:22.270893 containerd[1718]: time="2026-03-07T01:32:22.270831092Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 7 01:32:23.884484 containerd[1718]: time="2026-03-07T01:32:23.883719169Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:23.886554 containerd[1718]: time="2026-03-07T01:32:23.886518368Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 7 01:32:23.890966 containerd[1718]: time="2026-03-07T01:32:23.890676207Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:23.895500 containerd[1718]: time="2026-03-07T01:32:23.895464165Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:32:23.896344 containerd[1718]: time="2026-03-07T01:32:23.896285124Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.625423992s" Mar 7 01:32:23.896511 containerd[1718]: time="2026-03-07T01:32:23.896423724Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 7 01:32:23.905227 containerd[1718]: time="2026-03-07T01:32:23.905055721Z" level=info msg="CreateContainer within sandbox \"b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 7 01:32:23.933298 containerd[1718]: time="2026-03-07T01:32:23.933248430Z" level=info msg="CreateContainer within sandbox \"b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"b61cb96f96281509e725450caac6cb7ed6bc19dba2c0c4ad9f29aa716a84f092\"" Mar 7 01:32:23.934843 containerd[1718]: time="2026-03-07T01:32:23.933938629Z" level=info msg="StartContainer for \"b61cb96f96281509e725450caac6cb7ed6bc19dba2c0c4ad9f29aa716a84f092\"" Mar 7 01:32:23.969697 systemd[1]: Started cri-containerd-b61cb96f96281509e725450caac6cb7ed6bc19dba2c0c4ad9f29aa716a84f092.scope - libcontainer container b61cb96f96281509e725450caac6cb7ed6bc19dba2c0c4ad9f29aa716a84f092. 
Mar 7 01:32:24.032646 containerd[1718]: time="2026-03-07T01:32:24.032589270Z" level=info msg="StartContainer for \"b61cb96f96281509e725450caac6cb7ed6bc19dba2c0c4ad9f29aa716a84f092\" returns successfully" Mar 7 01:32:24.139875 kubelet[3058]: I0307 01:32:24.139468 3058 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 7 01:32:24.139875 kubelet[3058]: I0307 01:32:24.139498 3058 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 7 01:32:24.364079 kubelet[3058]: I0307 01:32:24.364025 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-nnkbr" podStartSLOduration=36.148308156 podStartE2EDuration="40.364004258s" podCreationTimestamp="2026-03-07 01:31:44 +0000 UTC" firstStartedPulling="2026-03-07 01:32:16.748074667 +0000 UTC m=+51.852236429" lastFinishedPulling="2026-03-07 01:32:20.963770729 +0000 UTC m=+56.067932531" observedRunningTime="2026-03-07 01:32:21.359003154 +0000 UTC m=+56.463164916" watchObservedRunningTime="2026-03-07 01:32:24.364004258 +0000 UTC m=+59.468166020" Mar 7 01:32:24.365658 kubelet[3058]: I0307 01:32:24.365202 3058 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-d5j7z" podStartSLOduration=32.361101127 podStartE2EDuration="38.365193098s" podCreationTimestamp="2026-03-07 01:31:46 +0000 UTC" firstStartedPulling="2026-03-07 01:32:17.893076153 +0000 UTC m=+52.997237955" lastFinishedPulling="2026-03-07 01:32:23.897168124 +0000 UTC m=+59.001329926" observedRunningTime="2026-03-07 01:32:24.363099379 +0000 UTC m=+59.467261181" watchObservedRunningTime="2026-03-07 01:32:24.365193098 +0000 UTC m=+59.469354860" Mar 7 01:32:25.014105 containerd[1718]: time="2026-03-07T01:32:25.013823240Z" level=info msg="StopPodSandbox for 
\"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\"" Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.048 [WARNING][6161] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"4054fdd0-0f08-4bdd-ad33-94fb06421936", ResourceVersion:"1029", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de", Pod:"coredns-7d764666f9-vbqm5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.15.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5ee085cfb42", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.048 [INFO][6161] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.048 [INFO][6161] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" iface="eth0" netns="" Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.048 [INFO][6161] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.048 [INFO][6161] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.072 [INFO][6170] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.073 [INFO][6170] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.073 [INFO][6170] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.082 [WARNING][6170] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.082 [INFO][6170] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.084 [INFO][6170] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.090990 containerd[1718]: 2026-03-07 01:32:25.087 [INFO][6161] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:25.092098 containerd[1718]: time="2026-03-07T01:32:25.091436169Z" level=info msg="TearDown network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\" successfully" Mar 7 01:32:25.092098 containerd[1718]: time="2026-03-07T01:32:25.091465729Z" level=info msg="StopPodSandbox for \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\" returns successfully" Mar 7 01:32:25.094647 containerd[1718]: time="2026-03-07T01:32:25.094507647Z" level=info msg="RemovePodSandbox for \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\"" Mar 7 01:32:25.100193 containerd[1718]: time="2026-03-07T01:32:25.100149085Z" level=info msg="Forcibly stopping sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\"" Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.135 [WARNING][6184] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"4054fdd0-0f08-4bdd-ad33-94fb06421936", ResourceVersion:"1029", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"e7a042de4dae7b2c52013b7a67d5b259b74fa14b94329f4620b89c5c4d5c34de", Pod:"coredns-7d764666f9-vbqm5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.15.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5ee085cfb42", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.135 [INFO][6184] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.136 [INFO][6184] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" iface="eth0" netns="" Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.136 [INFO][6184] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.137 [INFO][6184] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.154 [INFO][6191] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.154 [INFO][6191] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.154 [INFO][6191] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.164 [WARNING][6191] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.164 [INFO][6191] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" HandleID="k8s-pod-network.794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--vbqm5-eth0" Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.166 [INFO][6191] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.169083 containerd[1718]: 2026-03-07 01:32:25.167 [INFO][6184] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5" Mar 7 01:32:25.169479 containerd[1718]: time="2026-03-07T01:32:25.169124418Z" level=info msg="TearDown network for sandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\" successfully" Mar 7 01:32:25.180989 containerd[1718]: time="2026-03-07T01:32:25.180945293Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:32:25.181102 containerd[1718]: time="2026-03-07T01:32:25.181019933Z" level=info msg="RemovePodSandbox \"794ae2362b313582c7ba68452602b6530edd6573fb7b974080ee6256a4d7cdc5\" returns successfully" Mar 7 01:32:25.182015 containerd[1718]: time="2026-03-07T01:32:25.181758573Z" level=info msg="StopPodSandbox for \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\"" Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.221 [WARNING][6206] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"f1de7e3a-82ca-4e43-88c8-75d72af93053", ResourceVersion:"984", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880", Pod:"coredns-7d764666f9-8ttj2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.15.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calife7a02b6638", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.221 [INFO][6206] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.221 [INFO][6206] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" iface="eth0" netns="" Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.221 [INFO][6206] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.221 [INFO][6206] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.243 [INFO][6215] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.243 [INFO][6215] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.243 [INFO][6215] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.252 [WARNING][6215] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.252 [INFO][6215] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.253 [INFO][6215] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.256168 containerd[1718]: 2026-03-07 01:32:25.254 [INFO][6206] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:25.256594 containerd[1718]: time="2026-03-07T01:32:25.256202943Z" level=info msg="TearDown network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\" successfully" Mar 7 01:32:25.256594 containerd[1718]: time="2026-03-07T01:32:25.256228583Z" level=info msg="StopPodSandbox for \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\" returns successfully" Mar 7 01:32:25.256977 containerd[1718]: time="2026-03-07T01:32:25.256954303Z" level=info msg="RemovePodSandbox for \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\"" Mar 7 01:32:25.257018 containerd[1718]: time="2026-03-07T01:32:25.257000663Z" level=info msg="Forcibly stopping sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\"" Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.291 [WARNING][6232] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"f1de7e3a-82ca-4e43-88c8-75d72af93053", ResourceVersion:"984", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"6e19f5f300f87712fc745886c81c39a6c6e58596b06a30cec01daabf54b6b880", Pod:"coredns-7d764666f9-8ttj2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.15.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calife7a02b6638", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.291 [INFO][6232] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.291 [INFO][6232] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" iface="eth0" netns="" Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.291 [INFO][6232] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.291 [INFO][6232] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.310 [INFO][6240] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.311 [INFO][6240] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.311 [INFO][6240] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.320 [WARNING][6240] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.320 [INFO][6240] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" HandleID="k8s-pod-network.fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-coredns--7d764666f9--8ttj2-eth0" Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.322 [INFO][6240] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.325107 containerd[1718]: 2026-03-07 01:32:25.323 [INFO][6232] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00" Mar 7 01:32:25.325107 containerd[1718]: time="2026-03-07T01:32:25.325056116Z" level=info msg="TearDown network for sandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\" successfully" Mar 7 01:32:25.341427 containerd[1718]: time="2026-03-07T01:32:25.341250429Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:32:25.341427 containerd[1718]: time="2026-03-07T01:32:25.341329989Z" level=info msg="RemovePodSandbox \"fe91ff865bd3d7964d974fef8081a54196e75efab8d7dd2aca8f4e0fda53bf00\" returns successfully" Mar 7 01:32:25.342104 containerd[1718]: time="2026-03-07T01:32:25.342004189Z" level=info msg="StopPodSandbox for \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\"" Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.374 [WARNING][6254] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"aa61cab2-0011-4c46-b66c-4d054b7336c6", ResourceVersion:"1056", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8", Pod:"goldmane-9f7667bb8-nnkbr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.15.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"calia0a30d35641", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.374 [INFO][6254] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.374 [INFO][6254] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" iface="eth0" netns="" Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.374 [INFO][6254] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.374 [INFO][6254] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.391 [INFO][6262] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.391 [INFO][6262] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.391 [INFO][6262] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.400 [WARNING][6262] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.400 [INFO][6262] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.401 [INFO][6262] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.404927 containerd[1718]: 2026-03-07 01:32:25.403 [INFO][6254] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:25.405354 containerd[1718]: time="2026-03-07T01:32:25.404969364Z" level=info msg="TearDown network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\" successfully" Mar 7 01:32:25.405354 containerd[1718]: time="2026-03-07T01:32:25.404994164Z" level=info msg="StopPodSandbox for \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\" returns successfully" Mar 7 01:32:25.405439 containerd[1718]: time="2026-03-07T01:32:25.405409644Z" level=info msg="RemovePodSandbox for \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\"" Mar 7 01:32:25.405475 containerd[1718]: time="2026-03-07T01:32:25.405441884Z" level=info msg="Forcibly stopping sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\"" Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.436 [WARNING][6276] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"aa61cab2-0011-4c46-b66c-4d054b7336c6", ResourceVersion:"1056", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"c307bfbe1520f3833d7ebf2b556c8cbb4720cc7c4ca4fc142fba5213495bb2b8", Pod:"goldmane-9f7667bb8-nnkbr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.15.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia0a30d35641", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.436 [INFO][6276] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.436 [INFO][6276] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" iface="eth0" netns="" Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.436 [INFO][6276] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.436 [INFO][6276] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.455 [INFO][6283] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.455 [INFO][6283] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.455 [INFO][6283] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.463 [WARNING][6283] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.463 [INFO][6283] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" HandleID="k8s-pod-network.223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-goldmane--9f7667bb8--nnkbr-eth0" Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.464 [INFO][6283] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.467856 containerd[1718]: 2026-03-07 01:32:25.466 [INFO][6276] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13" Mar 7 01:32:25.468955 containerd[1718]: time="2026-03-07T01:32:25.468319459Z" level=info msg="TearDown network for sandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\" successfully" Mar 7 01:32:25.475689 containerd[1718]: time="2026-03-07T01:32:25.475635696Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:32:25.475871 containerd[1718]: time="2026-03-07T01:32:25.475710336Z" level=info msg="RemovePodSandbox \"223e05aa10ae510016050e5dd07a75610d24c125425ec4c4053162fd7bd33c13\" returns successfully" Mar 7 01:32:25.476398 containerd[1718]: time="2026-03-07T01:32:25.476140935Z" level=info msg="StopPodSandbox for \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\"" Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.507 [WARNING][6297] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"959c57a8-5ea2-429c-aa5b-3c7b113e7280", ResourceVersion:"1085", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd", Pod:"csi-node-driver-d5j7z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.15.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali077028cab56", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.508 [INFO][6297] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.508 [INFO][6297] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" iface="eth0" netns="" Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.508 [INFO][6297] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.508 [INFO][6297] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.524 [INFO][6304] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.525 [INFO][6304] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.525 [INFO][6304] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.533 [WARNING][6304] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.533 [INFO][6304] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.534 [INFO][6304] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.537344 containerd[1718]: 2026-03-07 01:32:25.535 [INFO][6297] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:25.538038 containerd[1718]: time="2026-03-07T01:32:25.537826751Z" level=info msg="TearDown network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\" successfully" Mar 7 01:32:25.538038 containerd[1718]: time="2026-03-07T01:32:25.537853871Z" level=info msg="StopPodSandbox for \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\" returns successfully" Mar 7 01:32:25.538537 containerd[1718]: time="2026-03-07T01:32:25.538259391Z" level=info msg="RemovePodSandbox for \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\"" Mar 7 01:32:25.538537 containerd[1718]: time="2026-03-07T01:32:25.538286951Z" level=info msg="Forcibly stopping sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\"" Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.567 [WARNING][6318] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"959c57a8-5ea2-429c-aa5b-3c7b113e7280", ResourceVersion:"1085", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"b31b5754d1425e21b1944cff26e86d9f7f75de18f516c12d4344a7224ee50cdd", Pod:"csi-node-driver-d5j7z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.15.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali077028cab56", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.568 [INFO][6318] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.568 [INFO][6318] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" iface="eth0" netns="" Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.568 [INFO][6318] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.568 [INFO][6318] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.585 [INFO][6325] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.585 [INFO][6325] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.585 [INFO][6325] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.593 [WARNING][6325] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.593 [INFO][6325] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" HandleID="k8s-pod-network.06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-csi--node--driver--d5j7z-eth0" Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.594 [INFO][6325] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.597965 containerd[1718]: 2026-03-07 01:32:25.596 [INFO][6318] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e" Mar 7 01:32:25.599128 containerd[1718]: time="2026-03-07T01:32:25.598440047Z" level=info msg="TearDown network for sandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\" successfully" Mar 7 01:32:25.606260 containerd[1718]: time="2026-03-07T01:32:25.606218204Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:32:25.606417 containerd[1718]: time="2026-03-07T01:32:25.606401924Z" level=info msg="RemovePodSandbox \"06457aeb756f7754867d6caad3ae860349774734ee3a70c1f6388eecf1df3a7e\" returns successfully" Mar 7 01:32:25.606987 containerd[1718]: time="2026-03-07T01:32:25.606960283Z" level=info msg="StopPodSandbox for \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\"" Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.641 [WARNING][6339] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0", GenerateName:"calico-apiserver-fb8dfc889-", Namespace:"calico-system", SelfLink:"", UID:"94207163-c370-4487-84eb-4023d7e5f07c", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb8dfc889", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f", Pod:"calico-apiserver-fb8dfc889-88kxk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.15.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliecc9d9aafe6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.641 [INFO][6339] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.641 [INFO][6339] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" iface="eth0" netns="" Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.641 [INFO][6339] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.641 [INFO][6339] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.660 [INFO][6347] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.660 [INFO][6347] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.660 [INFO][6347] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.669 [WARNING][6347] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.669 [INFO][6347] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.672 [INFO][6347] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.675283 containerd[1718]: 2026-03-07 01:32:25.673 [INFO][6339] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:25.675283 containerd[1718]: time="2026-03-07T01:32:25.675172096Z" level=info msg="TearDown network for sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\" successfully" Mar 7 01:32:25.675283 containerd[1718]: time="2026-03-07T01:32:25.675197536Z" level=info msg="StopPodSandbox for \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\" returns successfully" Mar 7 01:32:25.676011 containerd[1718]: time="2026-03-07T01:32:25.675983896Z" level=info msg="RemovePodSandbox for \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\"" Mar 7 01:32:25.676041 containerd[1718]: time="2026-03-07T01:32:25.676018736Z" level=info msg="Forcibly stopping sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\"" Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.707 [WARNING][6361] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0", GenerateName:"calico-apiserver-fb8dfc889-", Namespace:"calico-system", SelfLink:"", UID:"94207163-c370-4487-84eb-4023d7e5f07c", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb8dfc889", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"c57a64220e0adb6d566cd03c56c22d292c33341692888184e88e72fad401a72f", Pod:"calico-apiserver-fb8dfc889-88kxk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.15.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliecc9d9aafe6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.707 [INFO][6361] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.708 [INFO][6361] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no 
netns name, ignoring. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" iface="eth0" netns="" Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.708 [INFO][6361] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.708 [INFO][6361] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.726 [INFO][6369] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.727 [INFO][6369] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.727 [INFO][6369] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.735 [WARNING][6369] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.735 [INFO][6369] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" HandleID="k8s-pod-network.0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--88kxk-eth0" Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.736 [INFO][6369] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.739502 containerd[1718]: 2026-03-07 01:32:25.738 [INFO][6361] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0" Mar 7 01:32:25.739912 containerd[1718]: time="2026-03-07T01:32:25.739562351Z" level=info msg="TearDown network for sandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\" successfully" Mar 7 01:32:25.747109 containerd[1718]: time="2026-03-07T01:32:25.747065548Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:32:25.747199 containerd[1718]: time="2026-03-07T01:32:25.747139628Z" level=info msg="RemovePodSandbox \"0df0e624cf14921a504bfb2780052adaa0290c8a9fdc2d756972a70174547aa0\" returns successfully" Mar 7 01:32:25.747600 containerd[1718]: time="2026-03-07T01:32:25.747577027Z" level=info msg="StopPodSandbox for \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\"" Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.778 [WARNING][6383] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0", GenerateName:"calico-kube-controllers-8499d985fb-", Namespace:"calico-system", SelfLink:"", UID:"491bd089-80af-49d7-8a6a-aa3fd4f9da71", ResourceVersion:"1047", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8499d985fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba", Pod:"calico-kube-controllers-8499d985fb-scddx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.15.196/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8fe9ae8149d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.778 [INFO][6383] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.778 [INFO][6383] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" iface="eth0" netns="" Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.778 [INFO][6383] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.778 [INFO][6383] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.795 [INFO][6390] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.795 [INFO][6390] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.795 [INFO][6390] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.804 [WARNING][6390] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.804 [INFO][6390] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.805 [INFO][6390] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.810530 containerd[1718]: 2026-03-07 01:32:25.806 [INFO][6383] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:25.811083 containerd[1718]: time="2026-03-07T01:32:25.810650442Z" level=info msg="TearDown network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\" successfully" Mar 7 01:32:25.811083 containerd[1718]: time="2026-03-07T01:32:25.810677162Z" level=info msg="StopPodSandbox for \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\" returns successfully" Mar 7 01:32:25.811850 containerd[1718]: time="2026-03-07T01:32:25.811563042Z" level=info msg="RemovePodSandbox for \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\"" Mar 7 01:32:25.811850 containerd[1718]: time="2026-03-07T01:32:25.811592162Z" level=info msg="Forcibly stopping sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\"" Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.843 [WARNING][6404] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0", GenerateName:"calico-kube-controllers-8499d985fb-", Namespace:"calico-system", SelfLink:"", UID:"491bd089-80af-49d7-8a6a-aa3fd4f9da71", ResourceVersion:"1047", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8499d985fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"54661b2d85efe30a5cd9da9da9a7969982e164c0ff971d1e17d36bbb6c8698ba", Pod:"calico-kube-controllers-8499d985fb-scddx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.15.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8fe9ae8149d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.844 [INFO][6404] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.844 [INFO][6404] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" iface="eth0" netns="" Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.844 [INFO][6404] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.844 [INFO][6404] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.864 [INFO][6411] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.864 [INFO][6411] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.864 [INFO][6411] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.874 [WARNING][6411] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.874 [INFO][6411] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" HandleID="k8s-pod-network.1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--kube--controllers--8499d985fb--scddx-eth0" Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.875 [INFO][6411] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.879243 containerd[1718]: 2026-03-07 01:32:25.877 [INFO][6404] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc" Mar 7 01:32:25.879243 containerd[1718]: time="2026-03-07T01:32:25.878907135Z" level=info msg="TearDown network for sandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\" successfully" Mar 7 01:32:25.886241 containerd[1718]: time="2026-03-07T01:32:25.886188732Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:32:25.886361 containerd[1718]: time="2026-03-07T01:32:25.886297652Z" level=info msg="RemovePodSandbox \"1819d10e1018354536ea9e73115e31af87732d8ee6ecc0b10b76504661b68bcc\" returns successfully" Mar 7 01:32:25.886820 containerd[1718]: time="2026-03-07T01:32:25.886796252Z" level=info msg="StopPodSandbox for \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\"" Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.921 [WARNING][6425] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0", GenerateName:"calico-apiserver-fb8dfc889-", Namespace:"calico-system", SelfLink:"", UID:"4baa109f-be88-432a-8b8a-fa833dcc95d3", ResourceVersion:"1040", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb8dfc889", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0", Pod:"calico-apiserver-fb8dfc889-wk8q8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.15.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calic7da0d099da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.921 [INFO][6425] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.921 [INFO][6425] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" iface="eth0" netns="" Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.921 [INFO][6425] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.921 [INFO][6425] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.939 [INFO][6432] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.939 [INFO][6432] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.939 [INFO][6432] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.947 [WARNING][6432] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.948 [INFO][6432] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.949 [INFO][6432] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:25.952301 containerd[1718]: 2026-03-07 01:32:25.950 [INFO][6425] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:25.952815 containerd[1718]: time="2026-03-07T01:32:25.952346026Z" level=info msg="TearDown network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\" successfully" Mar 7 01:32:25.952815 containerd[1718]: time="2026-03-07T01:32:25.952371506Z" level=info msg="StopPodSandbox for \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\" returns successfully" Mar 7 01:32:25.952815 containerd[1718]: time="2026-03-07T01:32:25.952792066Z" level=info msg="RemovePodSandbox for \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\"" Mar 7 01:32:25.952901 containerd[1718]: time="2026-03-07T01:32:25.952816946Z" level=info msg="Forcibly stopping sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\"" Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:25.984 [WARNING][6446] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0", GenerateName:"calico-apiserver-fb8dfc889-", Namespace:"calico-system", SelfLink:"", UID:"4baa109f-be88-432a-8b8a-fa833dcc95d3", ResourceVersion:"1040", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 31, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb8dfc889", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-3151c5d0e2", ContainerID:"a6af6c1bb134f3e77b5abc19db58fdc8fa48f356d9f5a93449eae5664c48a8d0", Pod:"calico-apiserver-fb8dfc889-wk8q8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.15.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calic7da0d099da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:25.984 [INFO][6446] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:25.984 [INFO][6446] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no 
netns name, ignoring. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" iface="eth0" netns="" Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:25.984 [INFO][6446] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:25.984 [INFO][6446] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:26.003 [INFO][6453] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:26.003 [INFO][6453] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:26.003 [INFO][6453] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:26.012 [WARNING][6453] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:26.012 [INFO][6453] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" HandleID="k8s-pod-network.58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-calico--apiserver--fb8dfc889--wk8q8-eth0" Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:26.013 [INFO][6453] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:26.016822 containerd[1718]: 2026-03-07 01:32:26.015 [INFO][6446] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e" Mar 7 01:32:26.017610 containerd[1718]: time="2026-03-07T01:32:26.016862760Z" level=info msg="TearDown network for sandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\" successfully" Mar 7 01:32:26.024926 containerd[1718]: time="2026-03-07T01:32:26.024876957Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:32:26.025002 containerd[1718]: time="2026-03-07T01:32:26.024983597Z" level=info msg="RemovePodSandbox \"58cc0262016c89d55041039f892a21b4b72ddc9560553b296dbe74f1a7f6821e\" returns successfully" Mar 7 01:32:26.025664 containerd[1718]: time="2026-03-07T01:32:26.025395277Z" level=info msg="StopPodSandbox for \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\"" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.057 [WARNING][6467] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.057 [INFO][6467] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.057 [INFO][6467] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" iface="eth0" netns="" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.057 [INFO][6467] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.057 [INFO][6467] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.078 [INFO][6474] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.078 [INFO][6474] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.078 [INFO][6474] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.091 [WARNING][6474] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.091 [INFO][6474] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.092 [INFO][6474] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:26.095828 containerd[1718]: 2026-03-07 01:32:26.094 [INFO][6467] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:26.096273 containerd[1718]: time="2026-03-07T01:32:26.095867769Z" level=info msg="TearDown network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\" successfully" Mar 7 01:32:26.096273 containerd[1718]: time="2026-03-07T01:32:26.095893329Z" level=info msg="StopPodSandbox for \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\" returns successfully" Mar 7 01:32:26.096920 containerd[1718]: time="2026-03-07T01:32:26.096651968Z" level=info msg="RemovePodSandbox for \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\"" Mar 7 01:32:26.096920 containerd[1718]: time="2026-03-07T01:32:26.096722128Z" level=info msg="Forcibly stopping sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\"" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.128 [WARNING][6488] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" WorkloadEndpoint="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.128 [INFO][6488] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.128 [INFO][6488] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" iface="eth0" netns="" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.128 [INFO][6488] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.128 [INFO][6488] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.146 [INFO][6495] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.146 [INFO][6495] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.146 [INFO][6495] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.154 [WARNING][6495] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.154 [INFO][6495] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" HandleID="k8s-pod-network.c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Workload="ci--4081.3.6--n--3151c5d0e2-k8s-whisker--6b744bddfc--zvqxz-eth0" Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.156 [INFO][6495] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:32:26.161149 containerd[1718]: 2026-03-07 01:32:26.158 [INFO][6488] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e" Mar 7 01:32:26.161149 containerd[1718]: time="2026-03-07T01:32:26.161021503Z" level=info msg="TearDown network for sandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\" successfully" Mar 7 01:32:26.168792 containerd[1718]: time="2026-03-07T01:32:26.168740260Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 7 01:32:26.168912 containerd[1718]: time="2026-03-07T01:32:26.168846260Z" level=info msg="RemovePodSandbox \"c039eecedc5fd16fbd779b77f0f8d1311467d6d84dfe2ad4bd7c7e05ebbc3d4e\" returns successfully" Mar 7 01:32:34.239767 systemd[1]: run-containerd-runc-k8s.io-a39cd0747caf4cdb7cc083dfdc5bb4e9b31ce961edca8ff0a32ef550a2a37499-runc.oZTDUX.mount: Deactivated successfully. 
Mar 7 01:32:39.092823 systemd[1]: Started sshd@7-10.200.20.32:22-10.200.16.10:53230.service - OpenSSH per-connection server daemon (10.200.16.10:53230). Mar 7 01:32:39.581570 sshd[6564]: Accepted publickey for core from 10.200.16.10 port 53230 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:32:39.583591 sshd[6564]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:32:39.587589 systemd-logind[1680]: New session 10 of user core. Mar 7 01:32:39.594699 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 7 01:32:40.003316 sshd[6564]: pam_unix(sshd:session): session closed for user core Mar 7 01:32:40.006870 systemd[1]: sshd@7-10.200.20.32:22-10.200.16.10:53230.service: Deactivated successfully. Mar 7 01:32:40.008883 systemd[1]: session-10.scope: Deactivated successfully. Mar 7 01:32:40.009630 systemd-logind[1680]: Session 10 logged out. Waiting for processes to exit. Mar 7 01:32:40.010415 systemd-logind[1680]: Removed session 10. Mar 7 01:32:45.092790 systemd[1]: Started sshd@8-10.200.20.32:22-10.200.16.10:39778.service - OpenSSH per-connection server daemon (10.200.16.10:39778). Mar 7 01:32:45.586126 sshd[6578]: Accepted publickey for core from 10.200.16.10 port 39778 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:32:45.586970 sshd[6578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:32:45.590535 systemd-logind[1680]: New session 11 of user core. Mar 7 01:32:45.593682 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 7 01:32:45.998648 sshd[6578]: pam_unix(sshd:session): session closed for user core Mar 7 01:32:46.003336 systemd[1]: sshd@8-10.200.20.32:22-10.200.16.10:39778.service: Deactivated successfully. Mar 7 01:32:46.005988 systemd[1]: session-11.scope: Deactivated successfully. Mar 7 01:32:46.006948 systemd-logind[1680]: Session 11 logged out. Waiting for processes to exit. 
Mar 7 01:32:46.007822 systemd-logind[1680]: Removed session 11. Mar 7 01:32:49.350321 systemd[1]: run-containerd-runc-k8s.io-deade8009415fa5e9f8d27c496f345e6e77985844543f2ee5245af5fcb2dc995-runc.xUUvO8.mount: Deactivated successfully. Mar 7 01:32:51.090618 systemd[1]: Started sshd@9-10.200.20.32:22-10.200.16.10:46506.service - OpenSSH per-connection server daemon (10.200.16.10:46506). Mar 7 01:32:51.582801 sshd[6610]: Accepted publickey for core from 10.200.16.10 port 46506 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:32:51.584243 sshd[6610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:32:51.588200 systemd-logind[1680]: New session 12 of user core. Mar 7 01:32:51.597723 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 7 01:32:52.017091 sshd[6610]: pam_unix(sshd:session): session closed for user core Mar 7 01:32:52.021332 systemd[1]: sshd@9-10.200.20.32:22-10.200.16.10:46506.service: Deactivated successfully. Mar 7 01:32:52.023059 systemd[1]: session-12.scope: Deactivated successfully. Mar 7 01:32:52.024318 systemd-logind[1680]: Session 12 logged out. Waiting for processes to exit. Mar 7 01:32:52.025296 systemd-logind[1680]: Removed session 12. Mar 7 01:32:57.106185 systemd[1]: Started sshd@10-10.200.20.32:22-10.200.16.10:46512.service - OpenSSH per-connection server daemon (10.200.16.10:46512). Mar 7 01:32:57.605695 sshd[6663]: Accepted publickey for core from 10.200.16.10 port 46512 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:32:57.636396 sshd[6663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:32:57.640787 systemd-logind[1680]: New session 13 of user core. Mar 7 01:32:57.645675 systemd[1]: Started session-13.scope - Session 13 of User core. 
Mar 7 01:32:58.022390 sshd[6663]: pam_unix(sshd:session): session closed for user core Mar 7 01:32:58.025775 systemd[1]: sshd@10-10.200.20.32:22-10.200.16.10:46512.service: Deactivated successfully. Mar 7 01:32:58.027459 systemd[1]: session-13.scope: Deactivated successfully. Mar 7 01:32:58.028480 systemd-logind[1680]: Session 13 logged out. Waiting for processes to exit. Mar 7 01:32:58.029306 systemd-logind[1680]: Removed session 13. Mar 7 01:32:58.110232 systemd[1]: Started sshd@11-10.200.20.32:22-10.200.16.10:46514.service - OpenSSH per-connection server daemon (10.200.16.10:46514). Mar 7 01:32:58.607015 sshd[6695]: Accepted publickey for core from 10.200.16.10 port 46514 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:32:58.608882 sshd[6695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:32:58.612750 systemd-logind[1680]: New session 14 of user core. Mar 7 01:32:58.621695 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 7 01:32:59.060667 sshd[6695]: pam_unix(sshd:session): session closed for user core Mar 7 01:32:59.065812 systemd-logind[1680]: Session 14 logged out. Waiting for processes to exit. Mar 7 01:32:59.066406 systemd[1]: sshd@11-10.200.20.32:22-10.200.16.10:46514.service: Deactivated successfully. Mar 7 01:32:59.068761 systemd[1]: session-14.scope: Deactivated successfully. Mar 7 01:32:59.070098 systemd-logind[1680]: Removed session 14. Mar 7 01:32:59.157007 systemd[1]: Started sshd@12-10.200.20.32:22-10.200.16.10:46530.service - OpenSSH per-connection server daemon (10.200.16.10:46530). Mar 7 01:32:59.646953 sshd[6707]: Accepted publickey for core from 10.200.16.10 port 46530 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:32:59.648314 sshd[6707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:32:59.652714 systemd-logind[1680]: New session 15 of user core. 
Mar 7 01:32:59.656720 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 7 01:33:00.063953 sshd[6707]: pam_unix(sshd:session): session closed for user core Mar 7 01:33:00.067721 systemd[1]: sshd@12-10.200.20.32:22-10.200.16.10:46530.service: Deactivated successfully. Mar 7 01:33:00.069518 systemd[1]: session-15.scope: Deactivated successfully. Mar 7 01:33:00.070254 systemd-logind[1680]: Session 15 logged out. Waiting for processes to exit. Mar 7 01:33:00.071618 systemd-logind[1680]: Removed session 15. Mar 7 01:33:04.238739 systemd[1]: run-containerd-runc-k8s.io-a39cd0747caf4cdb7cc083dfdc5bb4e9b31ce961edca8ff0a32ef550a2a37499-runc.lPaIOp.mount: Deactivated successfully. Mar 7 01:33:05.153218 systemd[1]: Started sshd@13-10.200.20.32:22-10.200.16.10:42790.service - OpenSSH per-connection server daemon (10.200.16.10:42790). Mar 7 01:33:05.654472 sshd[6764]: Accepted publickey for core from 10.200.16.10 port 42790 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo Mar 7 01:33:05.655339 sshd[6764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:33:05.659311 systemd-logind[1680]: New session 16 of user core. Mar 7 01:33:05.664744 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 7 01:33:06.067273 sshd[6764]: pam_unix(sshd:session): session closed for user core Mar 7 01:33:06.071689 systemd-logind[1680]: Session 16 logged out. Waiting for processes to exit. Mar 7 01:33:06.072488 systemd[1]: sshd@13-10.200.20.32:22-10.200.16.10:42790.service: Deactivated successfully. Mar 7 01:33:06.075067 systemd[1]: session-16.scope: Deactivated successfully. Mar 7 01:33:06.076109 systemd-logind[1680]: Removed session 16. Mar 7 01:33:06.160050 systemd[1]: Started sshd@14-10.200.20.32:22-10.200.16.10:42794.service - OpenSSH per-connection server daemon (10.200.16.10:42794). 
Mar 7 01:33:06.655007 sshd[6777]: Accepted publickey for core from 10.200.16.10 port 42794 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:06.655842 sshd[6777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:06.660189 systemd-logind[1680]: New session 17 of user core.
Mar 7 01:33:06.667714 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 7 01:33:07.244220 sshd[6777]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:07.248048 systemd[1]: sshd@14-10.200.20.32:22-10.200.16.10:42794.service: Deactivated successfully.
Mar 7 01:33:07.252445 systemd[1]: session-17.scope: Deactivated successfully.
Mar 7 01:33:07.253557 systemd-logind[1680]: Session 17 logged out. Waiting for processes to exit.
Mar 7 01:33:07.254479 systemd-logind[1680]: Removed session 17.
Mar 7 01:33:07.332898 systemd[1]: Started sshd@15-10.200.20.32:22-10.200.16.10:42806.service - OpenSSH per-connection server daemon (10.200.16.10:42806).
Mar 7 01:33:07.826588 sshd[6789]: Accepted publickey for core from 10.200.16.10 port 42806 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:07.829157 sshd[6789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:07.834612 systemd-logind[1680]: New session 18 of user core.
Mar 7 01:33:07.840713 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 7 01:33:08.882775 sshd[6789]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:08.886626 systemd[1]: sshd@15-10.200.20.32:22-10.200.16.10:42806.service: Deactivated successfully.
Mar 7 01:33:08.891304 systemd[1]: session-18.scope: Deactivated successfully.
Mar 7 01:33:08.893366 systemd-logind[1680]: Session 18 logged out. Waiting for processes to exit.
Mar 7 01:33:08.895014 systemd-logind[1680]: Removed session 18.
Mar 7 01:33:08.978870 systemd[1]: Started sshd@16-10.200.20.32:22-10.200.16.10:42818.service - OpenSSH per-connection server daemon (10.200.16.10:42818).
Mar 7 01:33:09.466017 sshd[6836]: Accepted publickey for core from 10.200.16.10 port 42818 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:09.467351 sshd[6836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:09.471652 systemd-logind[1680]: New session 19 of user core.
Mar 7 01:33:09.475683 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 7 01:33:10.002239 sshd[6836]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:10.005763 systemd[1]: sshd@16-10.200.20.32:22-10.200.16.10:42818.service: Deactivated successfully.
Mar 7 01:33:10.009473 systemd[1]: session-19.scope: Deactivated successfully.
Mar 7 01:33:10.010323 systemd-logind[1680]: Session 19 logged out. Waiting for processes to exit.
Mar 7 01:33:10.011356 systemd-logind[1680]: Removed session 19.
Mar 7 01:33:10.096779 systemd[1]: Started sshd@17-10.200.20.32:22-10.200.16.10:49546.service - OpenSSH per-connection server daemon (10.200.16.10:49546).
Mar 7 01:33:10.586049 sshd[6849]: Accepted publickey for core from 10.200.16.10 port 49546 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:10.587482 sshd[6849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:10.591228 systemd-logind[1680]: New session 20 of user core.
Mar 7 01:33:10.598716 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 7 01:33:10.997321 sshd[6849]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:11.001022 systemd[1]: sshd@17-10.200.20.32:22-10.200.16.10:49546.service: Deactivated successfully.
Mar 7 01:33:11.002793 systemd[1]: session-20.scope: Deactivated successfully.
Mar 7 01:33:11.003448 systemd-logind[1680]: Session 20 logged out. Waiting for processes to exit.
Mar 7 01:33:11.004646 systemd-logind[1680]: Removed session 20.
Mar 7 01:33:16.086569 systemd[1]: Started sshd@18-10.200.20.32:22-10.200.16.10:49554.service - OpenSSH per-connection server daemon (10.200.16.10:49554).
Mar 7 01:33:16.575893 sshd[6863]: Accepted publickey for core from 10.200.16.10 port 49554 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:16.576930 sshd[6863]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:16.582082 systemd-logind[1680]: New session 21 of user core.
Mar 7 01:33:16.586732 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 7 01:33:16.982308 sshd[6863]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:16.986336 systemd[1]: sshd@18-10.200.20.32:22-10.200.16.10:49554.service: Deactivated successfully.
Mar 7 01:33:16.989643 systemd[1]: session-21.scope: Deactivated successfully.
Mar 7 01:33:16.991293 systemd-logind[1680]: Session 21 logged out. Waiting for processes to exit.
Mar 7 01:33:16.992256 systemd-logind[1680]: Removed session 21.
Mar 7 01:33:22.087282 systemd[1]: Started sshd@19-10.200.20.32:22-10.200.16.10:58286.service - OpenSSH per-connection server daemon (10.200.16.10:58286).
Mar 7 01:33:22.575830 sshd[6894]: Accepted publickey for core from 10.200.16.10 port 58286 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:22.576665 sshd[6894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:22.580168 systemd-logind[1680]: New session 22 of user core.
Mar 7 01:33:22.583697 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 7 01:33:22.982078 sshd[6894]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:22.985848 systemd[1]: sshd@19-10.200.20.32:22-10.200.16.10:58286.service: Deactivated successfully.
Mar 7 01:33:22.988077 systemd[1]: session-22.scope: Deactivated successfully.
Mar 7 01:33:22.990227 systemd-logind[1680]: Session 22 logged out. Waiting for processes to exit.
Mar 7 01:33:22.991351 systemd-logind[1680]: Removed session 22.
Mar 7 01:33:28.077281 systemd[1]: Started sshd@20-10.200.20.32:22-10.200.16.10:58302.service - OpenSSH per-connection server daemon (10.200.16.10:58302).
Mar 7 01:33:28.561802 sshd[6928]: Accepted publickey for core from 10.200.16.10 port 58302 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:28.563123 sshd[6928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:28.567804 systemd-logind[1680]: New session 23 of user core.
Mar 7 01:33:28.574697 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 7 01:33:28.970424 sshd[6928]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:28.974402 systemd[1]: sshd@20-10.200.20.32:22-10.200.16.10:58302.service: Deactivated successfully.
Mar 7 01:33:28.979404 systemd[1]: session-23.scope: Deactivated successfully.
Mar 7 01:33:28.981759 systemd-logind[1680]: Session 23 logged out. Waiting for processes to exit.
Mar 7 01:33:28.982616 systemd-logind[1680]: Removed session 23.
Mar 7 01:33:34.059924 systemd[1]: Started sshd@21-10.200.20.32:22-10.200.16.10:32898.service - OpenSSH per-connection server daemon (10.200.16.10:32898).
Mar 7 01:33:34.560415 sshd[6953]: Accepted publickey for core from 10.200.16.10 port 32898 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:34.561763 sshd[6953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:34.566282 systemd-logind[1680]: New session 24 of user core.
Mar 7 01:33:34.569678 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 7 01:33:34.966656 sshd[6953]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:34.970148 systemd[1]: sshd@21-10.200.20.32:22-10.200.16.10:32898.service: Deactivated successfully.
Mar 7 01:33:34.971826 systemd[1]: session-24.scope: Deactivated successfully.
Mar 7 01:33:34.972444 systemd-logind[1680]: Session 24 logged out. Waiting for processes to exit.
Mar 7 01:33:34.973538 systemd-logind[1680]: Removed session 24.
Mar 7 01:33:40.061968 systemd[1]: Started sshd@22-10.200.20.32:22-10.200.16.10:51780.service - OpenSSH per-connection server daemon (10.200.16.10:51780).
Mar 7 01:33:40.556425 sshd[6994]: Accepted publickey for core from 10.200.16.10 port 51780 ssh2: RSA SHA256:Tb1i62CHrltPPFOOnyFLIDbf3+3IpEiMvqzbTlMp5Qo
Mar 7 01:33:40.580910 sshd[6994]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:33:40.585597 systemd-logind[1680]: New session 25 of user core.
Mar 7 01:33:40.591713 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 7 01:33:40.969947 sshd[6994]: pam_unix(sshd:session): session closed for user core
Mar 7 01:33:40.974016 systemd[1]: sshd@22-10.200.20.32:22-10.200.16.10:51780.service: Deactivated successfully.
Mar 7 01:33:40.976313 systemd[1]: session-25.scope: Deactivated successfully.
Mar 7 01:33:40.977292 systemd-logind[1680]: Session 25 logged out. Waiting for processes to exit.
Mar 7 01:33:40.978373 systemd-logind[1680]: Removed session 25.