Mar 6 02:52:46.082046 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Mar 6 02:52:46.082063 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Mar 5 23:10:47 -00 2026
Mar 6 02:52:46.082070 kernel: KASLR enabled
Mar 6 02:52:46.082074 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Mar 6 02:52:46.082077 kernel: printk: legacy bootconsole [pl11] enabled
Mar 6 02:52:46.082082 kernel: efi: EFI v2.7 by EDK II
Mar 6 02:52:46.082088 kernel: efi: ACPI 2.0=0x3f979018 SMBIOS=0x3f8a0000 SMBIOS 3.0=0x3f880000 MEMATTR=0x3e89c018 RNG=0x3f979998 MEMRESERVE=0x3db83598
Mar 6 02:52:46.082092 kernel: random: crng init done
Mar 6 02:52:46.082095 kernel: secureboot: Secure boot disabled
Mar 6 02:52:46.082099 kernel: ACPI: Early table checksum verification disabled
Mar 6 02:52:46.082103 kernel: ACPI: RSDP 0x000000003F979018 000024 (v02 VRTUAL)
Mar 6 02:52:46.082107 kernel: ACPI: XSDT 0x000000003F979F18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082111 kernel: ACPI: FACP 0x000000003F979C18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082115 kernel: ACPI: DSDT 0x000000003F95A018 01E046 (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Mar 6 02:52:46.082121 kernel: ACPI: DBG2 0x000000003F979B18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082125 kernel: ACPI: GTDT 0x000000003F979D98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082130 kernel: ACPI: OEM0 0x000000003F979098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082134 kernel: ACPI: SPCR 0x000000003F979A98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082138 kernel: ACPI: APIC 0x000000003F979818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082143 kernel: ACPI: SRAT 0x000000003F979198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082147 kernel: ACPI: PPTT 0x000000003F979418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Mar 6 02:52:46.082152 kernel: ACPI: BGRT 0x000000003F979E98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:52:46.082156 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Mar 6 02:52:46.082160 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 6 02:52:46.082164 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Mar 6 02:52:46.082168 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Mar 6 02:52:46.082172 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Mar 6 02:52:46.082177 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Mar 6 02:52:46.082181 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Mar 6 02:52:46.082185 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Mar 6 02:52:46.082190 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Mar 6 02:52:46.082194 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Mar 6 02:52:46.082198 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Mar 6 02:52:46.082202 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Mar 6 02:52:46.082207 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Mar 6 02:52:46.082211 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Mar 6 02:52:46.082236 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Mar 6 02:52:46.082240 kernel: NODE_DATA(0) allocated [mem 0x1bf7ffa00-0x1bf806fff]
Mar 6 02:52:46.082244 kernel: Zone ranges:
Mar 6 02:52:46.082249 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Mar 6 02:52:46.082256 kernel: DMA32 empty
Mar 6 02:52:46.082260 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Mar 6 02:52:46.082265 kernel: Device empty
Mar 6 02:52:46.082269 kernel: Movable zone start for each node
Mar 6 02:52:46.082273 kernel: Early memory node ranges
Mar 6 02:52:46.082278 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Mar 6 02:52:46.082283 kernel: node 0: [mem 0x0000000000824000-0x000000003f38ffff]
Mar 6 02:52:46.082288 kernel: node 0: [mem 0x000000003f390000-0x000000003f93ffff]
Mar 6 02:52:46.082292 kernel: node 0: [mem 0x000000003f940000-0x000000003f9effff]
Mar 6 02:52:46.082296 kernel: node 0: [mem 0x000000003f9f0000-0x000000003fdeffff]
Mar 6 02:52:46.082301 kernel: node 0: [mem 0x000000003fdf0000-0x000000003fffffff]
Mar 6 02:52:46.082305 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Mar 6 02:52:46.082309 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Mar 6 02:52:46.082314 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Mar 6 02:52:46.082318 kernel: cma: Reserved 16 MiB at 0x000000003ca00000 on node -1
Mar 6 02:52:46.082323 kernel: psci: probing for conduit method from ACPI.
Mar 6 02:52:46.082327 kernel: psci: PSCIv1.3 detected in firmware.
Mar 6 02:52:46.082331 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 6 02:52:46.082336 kernel: psci: MIGRATE_INFO_TYPE not supported.
Mar 6 02:52:46.082341 kernel: psci: SMC Calling Convention v1.4
Mar 6 02:52:46.082345 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 6 02:52:46.082350 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 6 02:52:46.082354 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 6 02:52:46.082359 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 6 02:52:46.082363 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 6 02:52:46.082368 kernel: Detected PIPT I-cache on CPU0
Mar 6 02:52:46.082372 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Mar 6 02:52:46.082377 kernel: CPU features: detected: GIC system register CPU interface
Mar 6 02:52:46.082381 kernel: CPU features: detected: Spectre-v4
Mar 6 02:52:46.082385 kernel: CPU features: detected: Spectre-BHB
Mar 6 02:52:46.082391 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 6 02:52:46.082395 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 6 02:52:46.082399 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Mar 6 02:52:46.082404 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 6 02:52:46.082408 kernel: alternatives: applying boot alternatives
Mar 6 02:52:46.082413 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=68c9ef230e3eed1360dd8114dada95b6a934f07952c3a5d42725f3006977f027
Mar 6 02:52:46.082418 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 6 02:52:46.082423 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 6 02:52:46.082427 kernel: Fallback order for Node 0: 0
Mar 6 02:52:46.082431 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Mar 6 02:52:46.082437 kernel: Policy zone: Normal
Mar 6 02:52:46.082441 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 6 02:52:46.082446 kernel: software IO TLB: area num 2.
Mar 6 02:52:46.082450 kernel: software IO TLB: mapped [mem 0x0000000035900000-0x0000000039900000] (64MB)
Mar 6 02:52:46.082454 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 6 02:52:46.082459 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 6 02:52:46.082464 kernel: rcu: RCU event tracing is enabled.
Mar 6 02:52:46.082468 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 6 02:52:46.082473 kernel: Trampoline variant of Tasks RCU enabled.
Mar 6 02:52:46.082477 kernel: Tracing variant of Tasks RCU enabled.
Mar 6 02:52:46.082482 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 6 02:52:46.082486 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 6 02:52:46.082492 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 6 02:52:46.082496 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 6 02:52:46.082500 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 6 02:52:46.082505 kernel: GICv3: 960 SPIs implemented
Mar 6 02:52:46.082509 kernel: GICv3: 0 Extended SPIs implemented
Mar 6 02:52:46.082513 kernel: Root IRQ handler: gic_handle_irq
Mar 6 02:52:46.082518 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Mar 6 02:52:46.082522 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Mar 6 02:52:46.082527 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Mar 6 02:52:46.082531 kernel: ITS: No ITS available, not enabling LPIs
Mar 6 02:52:46.082535 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 6 02:52:46.082541 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Mar 6 02:52:46.082545 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 6 02:52:46.082550 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Mar 6 02:52:46.082554 kernel: Console: colour dummy device 80x25
Mar 6 02:52:46.082559 kernel: printk: legacy console [tty1] enabled
Mar 6 02:52:46.082563 kernel: ACPI: Core revision 20240827
Mar 6 02:52:46.082568 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Mar 6 02:52:46.082573 kernel: pid_max: default: 32768 minimum: 301
Mar 6 02:52:46.082577 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 6 02:52:46.082582 kernel: landlock: Up and running.
Mar 6 02:52:46.082587 kernel: SELinux: Initializing.
Mar 6 02:52:46.082591 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 6 02:52:46.082596 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 6 02:52:46.082601 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0xa0000e, misc 0x31e1
Mar 6 02:52:46.082605 kernel: Hyper-V: Host Build 10.0.26102.1212-1-0
Mar 6 02:52:46.082613 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 6 02:52:46.082618 kernel: rcu: Hierarchical SRCU implementation.
Mar 6 02:52:46.082623 kernel: rcu: Max phase no-delay instances is 400.
Mar 6 02:52:46.082628 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 6 02:52:46.082633 kernel: Remapping and enabling EFI services.
Mar 6 02:52:46.082637 kernel: smp: Bringing up secondary CPUs ...
Mar 6 02:52:46.082642 kernel: Detected PIPT I-cache on CPU1
Mar 6 02:52:46.082648 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Mar 6 02:52:46.082653 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Mar 6 02:52:46.082657 kernel: smp: Brought up 1 node, 2 CPUs
Mar 6 02:52:46.082662 kernel: SMP: Total of 2 processors activated.
Mar 6 02:52:46.082667 kernel: CPU: All CPU(s) started at EL1
Mar 6 02:52:46.082673 kernel: CPU features: detected: 32-bit EL0 Support
Mar 6 02:52:46.082678 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Mar 6 02:52:46.082682 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 6 02:52:46.082687 kernel: CPU features: detected: Common not Private translations
Mar 6 02:52:46.082692 kernel: CPU features: detected: CRC32 instructions
Mar 6 02:52:46.082697 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Mar 6 02:52:46.082702 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 6 02:52:46.082706 kernel: CPU features: detected: LSE atomic instructions
Mar 6 02:52:46.082711 kernel: CPU features: detected: Privileged Access Never
Mar 6 02:52:46.082717 kernel: CPU features: detected: Speculation barrier (SB)
Mar 6 02:52:46.082722 kernel: CPU features: detected: TLB range maintenance instructions
Mar 6 02:52:46.082727 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 6 02:52:46.082731 kernel: CPU features: detected: Scalable Vector Extension
Mar 6 02:52:46.082736 kernel: alternatives: applying system-wide alternatives
Mar 6 02:52:46.082741 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Mar 6 02:52:46.082746 kernel: SVE: maximum available vector length 16 bytes per vector
Mar 6 02:52:46.082750 kernel: SVE: default vector length 16 bytes per vector
Mar 6 02:52:46.082756 kernel: Memory: 3952828K/4194160K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 220144K reserved, 16384K cma-reserved)
Mar 6 02:52:46.082761 kernel: devtmpfs: initialized
Mar 6 02:52:46.082766 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 6 02:52:46.082771 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 6 02:52:46.082776 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 6 02:52:46.082781 kernel: 0 pages in range for non-PLT usage
Mar 6 02:52:46.082785 kernel: 508400 pages in range for PLT usage
Mar 6 02:52:46.082790 kernel: pinctrl core: initialized pinctrl subsystem
Mar 6 02:52:46.082795 kernel: SMBIOS 3.1.0 present.
Mar 6 02:52:46.082800 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 06/10/2025
Mar 6 02:52:46.082805 kernel: DMI: Memory slots populated: 2/2
Mar 6 02:52:46.082810 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 6 02:52:46.082815 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 6 02:52:46.082819 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 6 02:52:46.082824 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 6 02:52:46.082829 kernel: audit: initializing netlink subsys (disabled)
Mar 6 02:52:46.082834 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Mar 6 02:52:46.082839 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 6 02:52:46.082844 kernel: cpuidle: using governor menu
Mar 6 02:52:46.082849 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 6 02:52:46.082854 kernel: ASID allocator initialised with 32768 entries
Mar 6 02:52:46.082859 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 6 02:52:46.082863 kernel: Serial: AMBA PL011 UART driver
Mar 6 02:52:46.082868 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 6 02:52:46.082873 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 6 02:52:46.082878 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 6 02:52:46.082882 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 6 02:52:46.082888 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 6 02:52:46.082893 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 6 02:52:46.082897 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 6 02:52:46.082902 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 6 02:52:46.082907 kernel: ACPI: Added _OSI(Module Device)
Mar 6 02:52:46.082912 kernel: ACPI: Added _OSI(Processor Device)
Mar 6 02:52:46.082916 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 6 02:52:46.082921 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 6 02:52:46.082926 kernel: ACPI: Interpreter enabled
Mar 6 02:52:46.082931 kernel: ACPI: Using GIC for interrupt routing
Mar 6 02:52:46.082936 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Mar 6 02:52:46.082941 kernel: printk: legacy console [ttyAMA0] enabled
Mar 6 02:52:46.082946 kernel: printk: legacy bootconsole [pl11] disabled
Mar 6 02:52:46.082951 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Mar 6 02:52:46.082955 kernel: ACPI: CPU0 has been hot-added
Mar 6 02:52:46.082960 kernel: ACPI: CPU1 has been hot-added
Mar 6 02:52:46.082965 kernel: iommu: Default domain type: Translated
Mar 6 02:52:46.082969 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 6 02:52:46.082975 kernel: efivars: Registered efivars operations
Mar 6 02:52:46.082980 kernel: vgaarb: loaded
Mar 6 02:52:46.082984 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 6 02:52:46.082989 kernel: VFS: Disk quotas dquot_6.6.0
Mar 6 02:52:46.082994 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 6 02:52:46.082999 kernel: pnp: PnP ACPI init
Mar 6 02:52:46.083003 kernel: pnp: PnP ACPI: found 0 devices
Mar 6 02:52:46.083008 kernel: NET: Registered PF_INET protocol family
Mar 6 02:52:46.083013 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 6 02:52:46.083018 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 6 02:52:46.083023 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 6 02:52:46.083028 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 6 02:52:46.083033 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 6 02:52:46.083038 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 6 02:52:46.083043 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 6 02:52:46.083048 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 6 02:52:46.083052 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 6 02:52:46.083057 kernel: PCI: CLS 0 bytes, default 64
Mar 6 02:52:46.083062 kernel: kvm [1]: HYP mode not available
Mar 6 02:52:46.083067 kernel: Initialise system trusted keyrings
Mar 6 02:52:46.083072 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 6 02:52:46.083077 kernel: Key type asymmetric registered
Mar 6 02:52:46.083082 kernel: Asymmetric key parser 'x509' registered
Mar 6 02:52:46.083086 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Mar 6 02:52:46.083091 kernel: io scheduler mq-deadline registered
Mar 6 02:52:46.083096 kernel: io scheduler kyber registered
Mar 6 02:52:46.083101 kernel: io scheduler bfq registered
Mar 6 02:52:46.083105 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 6 02:52:46.083111 kernel: thunder_xcv, ver 1.0
Mar 6 02:52:46.083116 kernel: thunder_bgx, ver 1.0
Mar 6 02:52:46.083120 kernel: nicpf, ver 1.0
Mar 6 02:52:46.083125 kernel: nicvf, ver 1.0
Mar 6 02:52:46.083255 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 6 02:52:46.083311 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-06T02:52:45 UTC (1772765565)
Mar 6 02:52:46.083318 kernel: efifb: probing for efifb
Mar 6 02:52:46.083325 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 6 02:52:46.083330 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 6 02:52:46.083335 kernel: efifb: scrolling: redraw
Mar 6 02:52:46.083339 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 6 02:52:46.083344 kernel: Console: switching to colour frame buffer device 128x48
Mar 6 02:52:46.083349 kernel: fb0: EFI VGA frame buffer device
Mar 6 02:52:46.083354 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Mar 6 02:52:46.083359 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 6 02:52:46.083364 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Mar 6 02:52:46.083369 kernel: watchdog: NMI not fully supported
Mar 6 02:52:46.083374 kernel: watchdog: Hard watchdog permanently disabled
Mar 6 02:52:46.083379 kernel: NET: Registered PF_INET6 protocol family
Mar 6 02:52:46.083384 kernel: Segment Routing with IPv6
Mar 6 02:52:46.083388 kernel: In-situ OAM (IOAM) with IPv6
Mar 6 02:52:46.083393 kernel: NET: Registered PF_PACKET protocol family
Mar 6 02:52:46.083398 kernel: Key type dns_resolver registered
Mar 6 02:52:46.083403 kernel: registered taskstats version 1
Mar 6 02:52:46.083407 kernel: Loading compiled-in X.509 certificates
Mar 6 02:52:46.083412 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 3a2ba669b0bb3660035f2ce1faaa856d46d520ff'
Mar 6 02:52:46.083418 kernel: Demotion targets for Node 0: null
Mar 6 02:52:46.083423 kernel: Key type .fscrypt registered
Mar 6 02:52:46.083427 kernel: Key type fscrypt-provisioning registered
Mar 6 02:52:46.083432 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 6 02:52:46.083437 kernel: ima: Allocated hash algorithm: sha1
Mar 6 02:52:46.083441 kernel: ima: No architecture policies found
Mar 6 02:52:46.083446 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 6 02:52:46.083451 kernel: clk: Disabling unused clocks
Mar 6 02:52:46.083456 kernel: PM: genpd: Disabling unused power domains
Mar 6 02:52:46.083462 kernel: Warning: unable to open an initial console.
Mar 6 02:52:46.083466 kernel: Freeing unused kernel memory: 39552K
Mar 6 02:52:46.083471 kernel: Run /init as init process
Mar 6 02:52:46.083476 kernel: with arguments:
Mar 6 02:52:46.083481 kernel: /init
Mar 6 02:52:46.083485 kernel: with environment:
Mar 6 02:52:46.083490 kernel: HOME=/
Mar 6 02:52:46.083494 kernel: TERM=linux
Mar 6 02:52:46.083500 systemd[1]: Successfully made /usr/ read-only.
Mar 6 02:52:46.083508 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 6 02:52:46.083514 systemd[1]: Detected virtualization microsoft.
Mar 6 02:52:46.083519 systemd[1]: Detected architecture arm64.
Mar 6 02:52:46.083523 systemd[1]: Running in initrd.
Mar 6 02:52:46.083528 systemd[1]: No hostname configured, using default hostname.
Mar 6 02:52:46.083534 systemd[1]: Hostname set to .
Mar 6 02:52:46.083539 systemd[1]: Initializing machine ID from random generator.
Mar 6 02:52:46.083545 systemd[1]: Queued start job for default target initrd.target.
Mar 6 02:52:46.083550 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 02:52:46.083555 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:52:46.083561 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 6 02:52:46.083566 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 6 02:52:46.083571 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 6 02:52:46.083577 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 6 02:52:46.083584 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 6 02:52:46.083590 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 6 02:52:46.083595 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 02:52:46.083600 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 6 02:52:46.083605 systemd[1]: Reached target paths.target - Path Units.
Mar 6 02:52:46.083610 systemd[1]: Reached target slices.target - Slice Units.
Mar 6 02:52:46.083615 systemd[1]: Reached target swap.target - Swaps.
Mar 6 02:52:46.083620 systemd[1]: Reached target timers.target - Timer Units.
Mar 6 02:52:46.083627 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 6 02:52:46.083632 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 6 02:52:46.083637 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 6 02:52:46.083642 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 6 02:52:46.083647 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 02:52:46.083653 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 6 02:52:46.083658 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 02:52:46.083663 systemd[1]: Reached target sockets.target - Socket Units.
Mar 6 02:52:46.083669 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 6 02:52:46.083675 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 6 02:52:46.083680 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 6 02:52:46.083685 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 6 02:52:46.083691 systemd[1]: Starting systemd-fsck-usr.service...
Mar 6 02:52:46.083696 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 6 02:52:46.083701 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 6 02:52:46.083716 systemd-journald[225]: Collecting audit messages is disabled.
Mar 6 02:52:46.083731 systemd-journald[225]: Journal started
Mar 6 02:52:46.083744 systemd-journald[225]: Runtime Journal (/run/log/journal/9a56cf954a3b4f8ea29847994cecee89) is 8M, max 78.3M, 70.3M free.
Mar 6 02:52:46.087247 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:52:46.092099 systemd-modules-load[227]: Inserted module 'overlay'
Mar 6 02:52:46.115082 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 6 02:52:46.115120 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 6 02:52:46.122231 kernel: Bridge firewalling registered
Mar 6 02:52:46.122321 systemd-modules-load[227]: Inserted module 'br_netfilter'
Mar 6 02:52:46.123693 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 6 02:52:46.137924 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 02:52:46.143795 systemd[1]: Finished systemd-fsck-usr.service.
Mar 6 02:52:46.152328 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 6 02:52:46.159602 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:52:46.170387 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 6 02:52:46.188676 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 6 02:52:46.193145 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 6 02:52:46.204715 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 6 02:52:46.221910 systemd-tmpfiles[247]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 6 02:52:46.234182 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 6 02:52:46.239147 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 6 02:52:46.248775 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 6 02:52:46.259057 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 02:52:46.271618 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 6 02:52:46.287931 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 6 02:52:46.299672 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 6 02:52:46.315195 dracut-cmdline[262]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=68c9ef230e3eed1360dd8114dada95b6a934f07952c3a5d42725f3006977f027
Mar 6 02:52:46.318491 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 02:52:46.370559 systemd-resolved[263]: Positive Trust Anchors:
Mar 6 02:52:46.370576 systemd-resolved[263]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 6 02:52:46.370596 systemd-resolved[263]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 6 02:52:46.372199 systemd-resolved[263]: Defaulting to hostname 'linux'.
Mar 6 02:52:46.373747 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 6 02:52:46.385095 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 6 02:52:46.471240 kernel: SCSI subsystem initialized
Mar 6 02:52:46.477228 kernel: Loading iSCSI transport class v2.0-870.
Mar 6 02:52:46.484344 kernel: iscsi: registered transport (tcp)
Mar 6 02:52:46.497217 kernel: iscsi: registered transport (qla4xxx)
Mar 6 02:52:46.497236 kernel: QLogic iSCSI HBA Driver
Mar 6 02:52:46.510456 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 6 02:52:46.538298 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 02:52:46.544515 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 6 02:52:46.590299 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 6 02:52:46.597352 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 6 02:52:46.662235 kernel: raid6: neonx8 gen() 18530 MB/s
Mar 6 02:52:46.678225 kernel: raid6: neonx4 gen() 18547 MB/s
Mar 6 02:52:46.697226 kernel: raid6: neonx2 gen() 17077 MB/s
Mar 6 02:52:46.717224 kernel: raid6: neonx1 gen() 15007 MB/s
Mar 6 02:52:46.736314 kernel: raid6: int64x8 gen() 10529 MB/s
Mar 6 02:52:46.755312 kernel: raid6: int64x4 gen() 10609 MB/s
Mar 6 02:52:46.775245 kernel: raid6: int64x2 gen() 8978 MB/s
Mar 6 02:52:46.796945 kernel: raid6: int64x1 gen() 7001 MB/s
Mar 6 02:52:46.796957 kernel: raid6: using algorithm neonx4 gen() 18547 MB/s
Mar 6 02:52:46.819479 kernel: raid6: .... xor() 15142 MB/s, rmw enabled
Mar 6 02:52:46.819521 kernel: raid6: using neon recovery algorithm
Mar 6 02:52:46.827564 kernel: xor: measuring software checksum speed
Mar 6 02:52:46.827573 kernel: 8regs : 28605 MB/sec
Mar 6 02:52:46.830785 kernel: 32regs : 28753 MB/sec
Mar 6 02:52:46.833544 kernel: arm64_neon : 37242 MB/sec
Mar 6 02:52:46.836841 kernel: xor: using function: arm64_neon (37242 MB/sec)
Mar 6 02:52:46.875233 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 6 02:52:46.880454 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 6 02:52:46.889354 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 02:52:46.915730 systemd-udevd[474]: Using default interface naming scheme 'v255'.
Mar 6 02:52:46.920006 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 02:52:46.926837 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 6 02:52:46.958464 dracut-pre-trigger[483]: rd.md=0: removing MD RAID activation
Mar 6 02:52:46.978607 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 6 02:52:46.989048 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 6 02:52:47.029929 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 02:52:47.043047 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 6 02:52:47.102324 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 02:52:47.113979 kernel: hv_vmbus: Vmbus version:5.3
Mar 6 02:52:47.106796 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:52:47.119357 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:52:47.137223 kernel: hv_vmbus: registering driver hid_hyperv
Mar 6 02:52:47.137247 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0
Mar 6 02:52:47.138023 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:52:47.164243 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Mar 6 02:52:47.164373 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 6 02:52:47.164348 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 02:52:47.174249 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Mar 6 02:52:47.166567 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:52:47.196484 kernel: hv_vmbus: registering driver hv_netvsc
Mar 6 02:52:47.196501 kernel: hv_vmbus: registering driver hv_storvsc
Mar 6 02:52:47.196508 kernel: PTP clock support registered
Mar 6 02:52:47.196515 kernel: hv_vmbus: registering driver hyperv_keyboard
Mar 6 02:52:47.195995 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 6 02:52:47.221993 kernel: scsi host0: storvsc_host_t
Mar 6 02:52:47.222138 kernel: scsi host1: storvsc_host_t
Mar 6 02:52:47.222211 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Mar 6 02:52:47.222248 kernel: hv_utils: Registering HyperV Utility Driver
Mar 6 02:52:47.222256 kernel: hv_vmbus: registering driver hv_utils
Mar 6 02:52:47.198198 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:52:47.193446 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Mar 6 02:52:47.200397 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1
Mar 6 02:52:47.200411 kernel: hv_utils: Heartbeat IC version 3.0
Mar 6 02:52:47.200416 kernel: hv_utils: Shutdown IC version 3.2
Mar 6 02:52:47.200424 kernel: hv_utils: TimeSync IC version 4.0
Mar 6 02:52:47.200429 systemd-journald[225]: Time jumped backwards, rotating.
Mar 6 02:52:47.180668 systemd-resolved[263]: Clock change detected. Flushing caches.
Mar 6 02:52:47.212384 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Mar 6 02:52:47.212530 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 6 02:52:47.215595 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 6 02:52:47.223078 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Mar 6 02:52:47.223205 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Mar 6 02:52:47.223270 kernel: hv_netvsc 000d3ac6-007e-000d-3ac6-007e000d3ac6 eth0: VF slot 1 added
Mar 6 02:52:47.239758 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 6 02:52:47.239791 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 6 02:52:47.236953 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:52:47.259255 kernel: hv_vmbus: registering driver hv_pci Mar 6 02:52:47.259273 kernel: hv_pci 75dbd391-dd39-4111-9506-9918e75bf3a2: PCI VMBus probing: Using version 0x10004 Mar 6 02:52:47.261854 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Mar 6 02:52:47.262018 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 6 02:52:47.268916 kernel: hv_pci 75dbd391-dd39-4111-9506-9918e75bf3a2: PCI host bridge to bus dd39:00 Mar 6 02:52:47.274557 kernel: pci_bus dd39:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Mar 6 02:52:47.279235 kernel: pci_bus dd39:00: No busn resource found for root bus, will use [bus 00-ff] Mar 6 02:52:47.280010 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Mar 6 02:52:47.285912 kernel: pci dd39:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint Mar 6 02:52:47.291931 kernel: pci dd39:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref] Mar 6 02:52:47.296923 kernel: pci dd39:00:02.0: enabling Extended Tags Mar 6 02:52:47.311924 kernel: pci dd39:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at dd39:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link) Mar 6 02:52:47.321752 kernel: pci_bus dd39:00: busn_res: [bus 00-ff] end is updated to 00 Mar 6 02:52:47.321919 kernel: pci dd39:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned Mar 6 02:52:47.345930 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#40 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 6 02:52:47.368921 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#117 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 6 02:52:47.408045 kernel: mlx5_core dd39:00:02.0: enabling device (0000 -> 0002) Mar 6 02:52:47.417685 kernel: mlx5_core dd39:00:02.0: PTM is not supported by PCIe Mar 6 02:52:47.417841 kernel: mlx5_core dd39:00:02.0: firmware version: 16.30.5026 Mar 6 02:52:47.593888 kernel: hv_netvsc 000d3ac6-007e-000d-3ac6-007e000d3ac6 eth0: VF registering: eth1 Mar 6 02:52:47.594095 kernel: 
mlx5_core dd39:00:02.0 eth1: joined to eth0 Mar 6 02:52:47.595925 kernel: mlx5_core dd39:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Mar 6 02:52:47.614915 kernel: mlx5_core dd39:00:02.0 enP56633s1: renamed from eth1 Mar 6 02:52:47.747462 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Mar 6 02:52:47.837045 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Mar 6 02:52:47.842390 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Mar 6 02:52:47.868083 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 6 02:52:47.891561 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Mar 6 02:52:47.912906 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Mar 6 02:52:47.926527 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 6 02:52:47.925675 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 6 02:52:47.934308 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 6 02:52:47.945659 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 6 02:52:47.952515 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 6 02:52:47.965842 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 6 02:52:47.993877 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 6 02:52:48.957588 disk-uuid[654]: The operation has completed successfully. Mar 6 02:52:48.961544 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 6 02:52:49.031452 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 6 02:52:49.031548 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. 
Mar 6 02:52:49.055379 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 6 02:52:49.075502 sh[823]: Success Mar 6 02:52:49.111282 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 6 02:52:49.111330 kernel: device-mapper: uevent: version 1.0.3 Mar 6 02:52:49.116512 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 6 02:52:49.126916 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Mar 6 02:52:49.376404 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 6 02:52:49.387278 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 6 02:52:49.392395 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 6 02:52:49.426462 kernel: BTRFS: device fsid fcb4e7bf-1206-4803-90fb-6606b15e3aea devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (841) Mar 6 02:52:49.426506 kernel: BTRFS info (device dm-0): first mount of filesystem fcb4e7bf-1206-4803-90fb-6606b15e3aea Mar 6 02:52:49.431293 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 6 02:52:49.670460 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 6 02:52:49.670536 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 6 02:52:49.705029 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 6 02:52:49.709038 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 6 02:52:49.716527 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 6 02:52:49.717172 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 6 02:52:49.736413 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Mar 6 02:52:49.766909 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (864) Mar 6 02:52:49.779351 kernel: BTRFS info (device sda6): first mount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:52:49.779388 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 6 02:52:49.806068 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:52:49.806120 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:52:49.814918 kernel: BTRFS info (device sda6): last unmount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:52:49.816281 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 6 02:52:49.828950 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 6 02:52:49.869363 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 6 02:52:49.881131 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 6 02:52:49.909454 systemd-networkd[1010]: lo: Link UP Mar 6 02:52:49.909465 systemd-networkd[1010]: lo: Gained carrier Mar 6 02:52:49.910234 systemd-networkd[1010]: Enumeration completed Mar 6 02:52:49.912464 systemd-networkd[1010]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:52:49.912466 systemd-networkd[1010]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 6 02:52:49.912542 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 6 02:52:49.920388 systemd[1]: Reached target network.target - Network. 
Mar 6 02:52:49.993918 kernel: mlx5_core dd39:00:02.0 enP56633s1: Link up Mar 6 02:52:50.028920 kernel: hv_netvsc 000d3ac6-007e-000d-3ac6-007e000d3ac6 eth0: Data path switched to VF: enP56633s1 Mar 6 02:52:50.029280 systemd-networkd[1010]: enP56633s1: Link UP Mar 6 02:52:50.029340 systemd-networkd[1010]: eth0: Link UP Mar 6 02:52:50.029405 systemd-networkd[1010]: eth0: Gained carrier Mar 6 02:52:50.029428 systemd-networkd[1010]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:52:50.039072 systemd-networkd[1010]: enP56633s1: Gained carrier Mar 6 02:52:50.065931 systemd-networkd[1010]: eth0: DHCPv4 address 10.200.20.34/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 6 02:52:51.084568 ignition[955]: Ignition 2.22.0 Mar 6 02:52:51.084584 ignition[955]: Stage: fetch-offline Mar 6 02:52:51.089038 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 6 02:52:51.084677 ignition[955]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:52:51.097963 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Mar 6 02:52:51.084685 ignition[955]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:52:51.084754 ignition[955]: parsed url from cmdline: "" Mar 6 02:52:51.084756 ignition[955]: no config URL provided Mar 6 02:52:51.084760 ignition[955]: reading system config file "/usr/lib/ignition/user.ign" Mar 6 02:52:51.084766 ignition[955]: no config at "/usr/lib/ignition/user.ign" Mar 6 02:52:51.084772 ignition[955]: failed to fetch config: resource requires networking Mar 6 02:52:51.085050 ignition[955]: Ignition finished successfully Mar 6 02:52:51.132171 ignition[1019]: Ignition 2.22.0 Mar 6 02:52:51.132176 ignition[1019]: Stage: fetch Mar 6 02:52:51.132397 ignition[1019]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:52:51.132404 ignition[1019]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:52:51.132473 ignition[1019]: parsed url from cmdline: "" Mar 6 02:52:51.132475 ignition[1019]: no config URL provided Mar 6 02:52:51.132479 ignition[1019]: reading system config file "/usr/lib/ignition/user.ign" Mar 6 02:52:51.132485 ignition[1019]: no config at "/usr/lib/ignition/user.ign" Mar 6 02:52:51.132501 ignition[1019]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Mar 6 02:52:51.244579 ignition[1019]: GET result: OK Mar 6 02:52:51.244641 ignition[1019]: config has been read from IMDS userdata Mar 6 02:52:51.244663 ignition[1019]: parsing config with SHA512: 839f67ff18947487f2949513afdf09cbffdf85f2dcc30d51aecb508a02da28502cd47ecba6b1aab20e99703fd6c8b7fe75d13fd65e614cb2f95cd90d8f978a01 Mar 6 02:52:51.247944 unknown[1019]: fetched base config from "system" Mar 6 02:52:51.248256 ignition[1019]: fetch: fetch complete Mar 6 02:52:51.247949 unknown[1019]: fetched base config from "system" Mar 6 02:52:51.248259 ignition[1019]: fetch: fetch passed Mar 6 02:52:51.247953 unknown[1019]: fetched user config from "azure" Mar 6 02:52:51.248304 ignition[1019]: Ignition finished 
successfully Mar 6 02:52:51.252556 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 6 02:52:51.261982 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 6 02:52:51.308431 ignition[1026]: Ignition 2.22.0 Mar 6 02:52:51.311287 ignition[1026]: Stage: kargs Mar 6 02:52:51.311475 ignition[1026]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:52:51.315999 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 6 02:52:51.311482 ignition[1026]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:52:51.321705 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 6 02:52:51.312019 ignition[1026]: kargs: kargs passed Mar 6 02:52:51.312066 ignition[1026]: Ignition finished successfully Mar 6 02:52:51.356268 ignition[1032]: Ignition 2.22.0 Mar 6 02:52:51.356285 ignition[1032]: Stage: disks Mar 6 02:52:51.356463 ignition[1032]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:52:51.362267 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 6 02:52:51.356470 ignition[1032]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:52:51.367458 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 6 02:52:51.356973 ignition[1032]: disks: disks passed Mar 6 02:52:51.375930 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 6 02:52:51.357012 ignition[1032]: Ignition finished successfully Mar 6 02:52:51.385724 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 6 02:52:51.395596 systemd[1]: Reached target sysinit.target - System Initialization. Mar 6 02:52:51.405386 systemd[1]: Reached target basic.target - Basic System. Mar 6 02:52:51.413306 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Mar 6 02:52:51.503927 systemd-fsck[1040]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Mar 6 02:52:51.513189 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 6 02:52:51.520916 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 6 02:52:51.683053 systemd-networkd[1010]: eth0: Gained IPv6LL Mar 6 02:52:51.766914 kernel: EXT4-fs (sda9): mounted filesystem f0884ab3-756d-49e8-9d95-af187b4f35fb r/w with ordered data mode. Quota mode: none. Mar 6 02:52:51.768013 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 6 02:52:51.771958 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 6 02:52:51.794767 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 6 02:52:51.802593 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 6 02:52:51.811673 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Mar 6 02:52:51.820207 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 6 02:52:51.820234 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 6 02:52:51.832235 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 6 02:52:51.847561 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 6 02:52:51.871913 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1054) Mar 6 02:52:51.883164 kernel: BTRFS info (device sda6): first mount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:52:51.883199 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 6 02:52:51.892985 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:52:51.893023 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:52:51.894211 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 6 02:52:52.352766 coreos-metadata[1056]: Mar 06 02:52:52.352 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 6 02:52:52.361113 coreos-metadata[1056]: Mar 06 02:52:52.361 INFO Fetch successful Mar 6 02:52:52.361113 coreos-metadata[1056]: Mar 06 02:52:52.361 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Mar 6 02:52:52.374803 coreos-metadata[1056]: Mar 06 02:52:52.374 INFO Fetch successful Mar 6 02:52:52.389738 coreos-metadata[1056]: Mar 06 02:52:52.388 INFO wrote hostname ci-4459.2.3-n-bf8f1184ca to /sysroot/etc/hostname Mar 6 02:52:52.391917 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 6 02:52:52.509099 initrd-setup-root[1084]: cut: /sysroot/etc/passwd: No such file or directory Mar 6 02:52:52.546955 initrd-setup-root[1091]: cut: /sysroot/etc/group: No such file or directory Mar 6 02:52:52.554997 initrd-setup-root[1098]: cut: /sysroot/etc/shadow: No such file or directory Mar 6 02:52:52.562321 initrd-setup-root[1105]: cut: /sysroot/etc/gshadow: No such file or directory Mar 6 02:52:53.568934 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 6 02:52:53.574345 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 6 02:52:53.595604 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 6 02:52:53.607072 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Mar 6 02:52:53.616354 kernel: BTRFS info (device sda6): last unmount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:52:53.640646 ignition[1174]: INFO : Ignition 2.22.0 Mar 6 02:52:53.645173 ignition[1174]: INFO : Stage: mount Mar 6 02:52:53.645173 ignition[1174]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 02:52:53.645173 ignition[1174]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:52:53.645173 ignition[1174]: INFO : mount: mount passed Mar 6 02:52:53.645173 ignition[1174]: INFO : Ignition finished successfully Mar 6 02:52:53.645036 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 6 02:52:53.651546 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 6 02:52:53.659917 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 6 02:52:53.693011 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 6 02:52:53.724078 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1186) Mar 6 02:52:53.724127 kernel: BTRFS info (device sda6): first mount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:52:53.729176 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 6 02:52:53.738256 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:52:53.738301 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:52:53.739879 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 6 02:52:53.765572 ignition[1204]: INFO : Ignition 2.22.0 Mar 6 02:52:53.768943 ignition[1204]: INFO : Stage: files Mar 6 02:52:53.768943 ignition[1204]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 02:52:53.768943 ignition[1204]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:52:53.768943 ignition[1204]: DEBUG : files: compiled without relabeling support, skipping Mar 6 02:52:53.787642 ignition[1204]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 6 02:52:53.787642 ignition[1204]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 6 02:52:53.845767 ignition[1204]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 6 02:52:53.852223 ignition[1204]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 6 02:52:53.852223 ignition[1204]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 6 02:52:53.846161 unknown[1204]: wrote ssh authorized keys file for user: core Mar 6 02:52:53.878834 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 6 02:52:53.887145 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Mar 6 02:52:53.911270 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 6 02:52:54.094738 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 6 02:52:54.094738 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 6 02:52:54.094738 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 6 
02:52:54.094738 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 6 02:52:54.126301 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-arm64.raw: attempt #1 Mar 6 02:52:54.543686 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 6 02:52:55.008180 ignition[1204]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 6 02:52:55.008180 ignition[1204]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 6 02:52:55.057406 ignition[1204]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 6 02:52:55.072857 ignition[1204]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 6 02:52:55.072857 ignition[1204]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 6 02:52:55.072857 ignition[1204]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 6 02:52:55.101238 ignition[1204]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 6 02:52:55.101238 ignition[1204]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 6 02:52:55.101238 ignition[1204]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 6 02:52:55.101238 ignition[1204]: INFO : files: files passed Mar 6 02:52:55.101238 ignition[1204]: INFO : Ignition finished successfully Mar 6 02:52:55.074330 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 6 02:52:55.086679 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 6 02:52:55.109441 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 6 02:52:55.123173 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 6 02:52:55.134276 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 6 02:52:55.166832 initrd-setup-root-after-ignition[1232]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:52:55.166832 initrd-setup-root-after-ignition[1232]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:52:55.180894 initrd-setup-root-after-ignition[1236]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:52:55.174527 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 6 02:52:55.186305 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 6 02:52:55.197618 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 6 02:52:55.245348 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 6 02:52:55.245443 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 6 02:52:55.255251 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 6 02:52:55.264817 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 6 02:52:55.273912 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 6 02:52:55.274657 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 6 02:52:55.312169 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 6 02:52:55.318931 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 6 02:52:55.341295 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 6 02:52:55.346306 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Mar 6 02:52:55.355917 systemd[1]: Stopped target timers.target - Timer Units. Mar 6 02:52:55.365002 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 6 02:52:55.365111 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 6 02:52:55.377342 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 6 02:52:55.382522 systemd[1]: Stopped target basic.target - Basic System. Mar 6 02:52:55.391221 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 6 02:52:55.400230 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 6 02:52:55.408563 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 6 02:52:55.417362 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Mar 6 02:52:55.426492 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 6 02:52:55.435890 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 6 02:52:55.446620 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 6 02:52:55.455109 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 6 02:52:55.464002 systemd[1]: Stopped target swap.target - Swaps. Mar 6 02:52:55.471631 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 6 02:52:55.471747 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 6 02:52:55.483214 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 6 02:52:55.487685 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 6 02:52:55.497166 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 6 02:52:55.497232 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 6 02:52:55.506560 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Mar 6 02:52:55.506654 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 6 02:52:55.519321 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 6 02:52:55.519402 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 6 02:52:55.525533 systemd[1]: ignition-files.service: Deactivated successfully. Mar 6 02:52:55.525603 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 6 02:52:55.534206 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 6 02:52:55.534271 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 6 02:52:55.606142 ignition[1256]: INFO : Ignition 2.22.0 Mar 6 02:52:55.606142 ignition[1256]: INFO : Stage: umount Mar 6 02:52:55.606142 ignition[1256]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 02:52:55.606142 ignition[1256]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:52:55.606142 ignition[1256]: INFO : umount: umount passed Mar 6 02:52:55.606142 ignition[1256]: INFO : Ignition finished successfully Mar 6 02:52:55.546476 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 6 02:52:55.577083 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 6 02:52:55.593014 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 6 02:52:55.593163 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 6 02:52:55.605099 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 6 02:52:55.605184 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 6 02:52:55.620078 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 6 02:52:55.620172 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 6 02:52:55.629168 systemd[1]: ignition-disks.service: Deactivated successfully. 
Mar 6 02:52:55.629391 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 6 02:52:55.641219 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 6 02:52:55.641269 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 6 02:52:55.649065 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 6 02:52:55.649094 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 6 02:52:55.657202 systemd[1]: Stopped target network.target - Network.
Mar 6 02:52:55.665035 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 6 02:52:55.665079 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 6 02:52:55.673710 systemd[1]: Stopped target paths.target - Path Units.
Mar 6 02:52:55.682510 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 6 02:52:55.685918 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:52:55.691814 systemd[1]: Stopped target slices.target - Slice Units.
Mar 6 02:52:55.701151 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 6 02:52:55.708757 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 6 02:52:55.708805 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 6 02:52:55.716863 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 6 02:52:55.716904 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 6 02:52:55.724926 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 6 02:52:55.724976 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 6 02:52:55.733146 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 6 02:52:55.733174 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 6 02:52:55.741532 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 6 02:52:55.749557 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 6 02:52:55.763264 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 6 02:52:55.763794 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 6 02:52:55.763885 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 6 02:52:55.775746 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 6 02:52:55.775925 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 6 02:52:55.776007 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 6 02:52:55.790094 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 6 02:52:55.977182 kernel: hv_netvsc 000d3ac6-007e-000d-3ac6-007e000d3ac6 eth0: Data path switched from VF: enP56633s1
Mar 6 02:52:55.790278 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 6 02:52:55.790352 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 6 02:52:55.801617 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 6 02:52:55.808883 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 6 02:52:55.814855 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 02:52:55.825004 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 6 02:52:55.838101 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 6 02:52:55.838157 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 6 02:52:55.847797 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 6 02:52:55.847839 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 6 02:52:55.860062 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 6 02:52:55.860099 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 6 02:52:55.865012 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 6 02:52:55.865040 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 02:52:55.880669 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 02:52:55.890520 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 6 02:52:55.890585 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 6 02:52:55.904211 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 6 02:52:55.908841 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 02:52:55.918103 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 6 02:52:55.918170 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 6 02:52:55.927265 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 6 02:52:55.927295 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 02:52:55.938807 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 6 02:52:55.938848 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 6 02:52:55.954312 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 6 02:52:55.954354 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 6 02:52:55.977367 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 6 02:52:55.977421 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 6 02:52:55.995155 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 6 02:52:56.010667 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 6 02:52:56.010728 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 02:52:56.021260 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 6 02:52:56.021296 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 02:52:56.026873 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 6 02:52:56.026919 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 6 02:52:56.042814 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 6 02:52:56.042856 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 02:52:56.048937 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 02:52:56.048974 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:52:56.066495 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 6 02:52:56.066540 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Mar 6 02:52:56.066563 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 6 02:52:56.066588 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 6 02:52:56.066870 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 6 02:52:56.067075 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 6 02:52:56.093255 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 6 02:52:56.093373 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 6 02:52:57.633684 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 6 02:52:57.633780 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 6 02:52:57.637822 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 6 02:52:57.645808 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 6 02:52:57.645861 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 6 02:52:57.654220 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 6 02:52:57.677474 systemd[1]: Switching root.
Mar 6 02:52:57.788876 systemd-journald[225]: Journal stopped
Mar 6 02:53:10.399880 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Mar 6 02:53:10.399911 kernel: SELinux: policy capability network_peer_controls=1
Mar 6 02:53:10.399920 kernel: SELinux: policy capability open_perms=1
Mar 6 02:53:10.399929 kernel: SELinux: policy capability extended_socket_class=1
Mar 6 02:53:10.399936 kernel: SELinux: policy capability always_check_network=0
Mar 6 02:53:10.399941 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 6 02:53:10.399947 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 6 02:53:10.399952 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 6 02:53:10.399958 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 6 02:53:10.399963 kernel: SELinux: policy capability userspace_initial_context=0
Mar 6 02:53:10.399968 kernel: audit: type=1403 audit(1772765579.566:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 6 02:53:10.399975 systemd[1]: Successfully loaded SELinux policy in 790.029ms.
Mar 6 02:53:10.399981 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.373ms.
Mar 6 02:53:10.399988 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 6 02:53:10.399994 systemd[1]: Detected virtualization microsoft.
Mar 6 02:53:10.400001 systemd[1]: Detected architecture arm64.
Mar 6 02:53:10.400007 systemd[1]: Detected first boot.
Mar 6 02:53:10.400013 systemd[1]: Hostname set to .
Mar 6 02:53:10.400019 systemd[1]: Initializing machine ID from random generator.
Mar 6 02:53:10.400025 zram_generator::config[1299]: No configuration found.
Mar 6 02:53:10.400031 kernel: NET: Registered PF_VSOCK protocol family
Mar 6 02:53:10.400037 systemd[1]: Populated /etc with preset unit settings.
Mar 6 02:53:10.400043 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 6 02:53:10.400050 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 6 02:53:10.400057 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 6 02:53:10.400062 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 6 02:53:10.400068 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 6 02:53:10.400075 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 6 02:53:10.400081 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 6 02:53:10.400087 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 6 02:53:10.400094 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 6 02:53:10.400100 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 6 02:53:10.400106 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 6 02:53:10.400112 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 6 02:53:10.400118 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 02:53:10.400124 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:53:10.400130 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 6 02:53:10.400136 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 6 02:53:10.400143 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 6 02:53:10.400149 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 6 02:53:10.400157 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 6 02:53:10.400163 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 02:53:10.400169 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 6 02:53:10.400175 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 6 02:53:10.400181 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 6 02:53:10.400188 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 6 02:53:10.400195 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 6 02:53:10.400201 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 6 02:53:10.400207 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 6 02:53:10.400213 systemd[1]: Reached target slices.target - Slice Units.
Mar 6 02:53:10.400219 systemd[1]: Reached target swap.target - Swaps.
Mar 6 02:53:10.400225 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 6 02:53:10.400231 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 6 02:53:10.400238 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 6 02:53:10.400244 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 02:53:10.400251 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 6 02:53:10.400257 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 02:53:10.400263 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 6 02:53:10.400269 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 6 02:53:10.400276 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 6 02:53:10.400282 systemd[1]: Mounting media.mount - External Media Directory...
Mar 6 02:53:10.400288 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 6 02:53:10.400294 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 6 02:53:10.400300 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 6 02:53:10.400307 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 6 02:53:10.400313 systemd[1]: Reached target machines.target - Containers.
Mar 6 02:53:10.400320 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 6 02:53:10.400327 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 6 02:53:10.400333 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 6 02:53:10.400339 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 6 02:53:10.400346 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 6 02:53:10.400352 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 6 02:53:10.400358 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 6 02:53:10.400364 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 6 02:53:10.400371 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 6 02:53:10.400378 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 6 02:53:10.400384 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 6 02:53:10.400390 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 6 02:53:10.400396 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 6 02:53:10.400403 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 6 02:53:10.400409 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 6 02:53:10.400415 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 6 02:53:10.400434 systemd-journald[1374]: Collecting audit messages is disabled.
Mar 6 02:53:10.400448 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 6 02:53:10.400456 systemd-journald[1374]: Journal started
Mar 6 02:53:10.400471 systemd-journald[1374]: Runtime Journal (/run/log/journal/5fc08c466cce4c0ebd6bf96ee4373e45) is 8M, max 78.3M, 70.3M free.
Mar 6 02:53:09.441806 systemd[1]: Queued start job for default target multi-user.target.
Mar 6 02:53:09.446330 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 6 02:53:09.446617 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 6 02:53:09.446868 systemd[1]: systemd-journald.service: Consumed 2.602s CPU time.
Mar 6 02:53:10.429909 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 6 02:53:10.442760 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 6 02:53:10.462296 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 6 02:53:10.482282 kernel: fuse: init (API version 7.41)
Mar 6 02:53:10.482338 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 6 02:53:10.482350 kernel: loop: module loaded
Mar 6 02:53:10.484040 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 6 02:53:10.489369 systemd[1]: Stopped verity-setup.service.
Mar 6 02:53:10.502435 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 6 02:53:10.502486 kernel: ACPI: bus type drm_connector registered
Mar 6 02:53:10.507462 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 6 02:53:10.512186 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 6 02:53:10.517305 systemd[1]: Mounted media.mount - External Media Directory.
Mar 6 02:53:10.521088 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 6 02:53:10.526012 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 6 02:53:10.530863 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 6 02:53:10.536922 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 02:53:10.542382 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 6 02:53:10.542544 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 6 02:53:10.547491 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 6 02:53:10.547616 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 6 02:53:10.552527 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 6 02:53:10.552655 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 6 02:53:10.557329 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 6 02:53:10.557449 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 6 02:53:10.562568 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 6 02:53:10.562694 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 6 02:53:10.567353 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 6 02:53:10.567472 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 6 02:53:10.572187 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 02:53:10.577502 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 6 02:53:10.589084 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 6 02:53:10.594931 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 6 02:53:10.607948 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 6 02:53:10.612884 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 6 02:53:10.612923 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 6 02:53:10.618052 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 6 02:53:10.624332 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 6 02:53:10.628469 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 6 02:53:10.634522 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 6 02:53:10.643009 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 6 02:53:10.647445 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 6 02:53:10.648175 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 6 02:53:10.652699 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 6 02:53:10.653461 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 6 02:53:10.658745 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 6 02:53:10.664572 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 6 02:53:10.669625 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 6 02:53:10.787938 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 6 02:53:10.797414 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 6 02:53:10.804071 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 6 02:53:10.811125 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 6 02:53:10.820434 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 6 02:53:10.844953 systemd-journald[1374]: Time spent on flushing to /var/log/journal/5fc08c466cce4c0ebd6bf96ee4373e45 is 570.955ms for 934 entries.
Mar 6 02:53:10.844953 systemd-journald[1374]: System Journal (/var/log/journal/5fc08c466cce4c0ebd6bf96ee4373e45) is 11.8M, max 2.6G, 2.6G free.
Mar 6 02:53:12.175214 kernel: loop0: detected capacity change from 0 to 27936
Mar 6 02:53:12.175274 systemd-journald[1374]: Received client request to flush runtime journal.
Mar 6 02:53:12.175302 systemd-journald[1374]: /var/log/journal/5fc08c466cce4c0ebd6bf96ee4373e45/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating.
Mar 6 02:53:12.175319 systemd-journald[1374]: Rotating system journal.
Mar 6 02:53:10.831045 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 6 02:53:10.839284 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 6 02:53:10.847009 systemd-tmpfiles[1420]: ACLs are not supported, ignoring.
Mar 6 02:53:10.847017 systemd-tmpfiles[1420]: ACLs are not supported, ignoring.
Mar 6 02:53:10.858175 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 6 02:53:10.868025 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 6 02:53:10.881034 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 02:53:11.688086 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 6 02:53:11.697130 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 6 02:53:11.713025 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 6 02:53:11.714934 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 6 02:53:11.718231 systemd-tmpfiles[1453]: ACLs are not supported, ignoring.
Mar 6 02:53:11.718238 systemd-tmpfiles[1453]: ACLs are not supported, ignoring.
Mar 6 02:53:11.721289 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 6 02:53:11.725959 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 02:53:12.177263 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 6 02:53:13.696930 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 6 02:53:13.942909 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 6 02:53:13.950626 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 02:53:13.968977 kernel: loop1: detected capacity change from 0 to 200864
Mar 6 02:53:13.976413 systemd-udevd[1464]: Using default interface naming scheme 'v255'.
Mar 6 02:53:14.157922 kernel: loop2: detected capacity change from 0 to 100632
Mar 6 02:53:14.382282 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 02:53:14.393631 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 6 02:53:14.432284 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 6 02:53:14.442810 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 6 02:53:14.548077 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 6 02:53:14.567351 kernel: mousedev: PS/2 mouse device common for all mice
Mar 6 02:53:14.567431 kernel: hv_vmbus: registering driver hv_balloon
Mar 6 02:53:14.567446 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 6 02:53:14.570477 kernel: hv_balloon: Memory hot add disabled on ARM64
Mar 6 02:53:14.581966 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#13 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 6 02:53:14.620937 kernel: hv_vmbus: registering driver hyperv_fb
Mar 6 02:53:14.622001 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 6 02:53:14.630861 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 6 02:53:14.635029 kernel: Console: switching to colour dummy device 80x25
Mar 6 02:53:14.643380 kernel: Console: switching to colour frame buffer device 128x48
Mar 6 02:53:14.666967 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:53:14.693791 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 02:53:14.694976 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:53:14.702018 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:53:14.711913 kernel: loop3: detected capacity change from 0 to 119840
Mar 6 02:53:14.722917 kernel: MACsec IEEE 802.1AE
Mar 6 02:53:14.800117 systemd-networkd[1485]: lo: Link UP
Mar 6 02:53:14.800124 systemd-networkd[1485]: lo: Gained carrier
Mar 6 02:53:14.803452 systemd-networkd[1485]: Enumeration completed
Mar 6 02:53:14.803630 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 6 02:53:14.803852 systemd-networkd[1485]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 6 02:53:14.804025 systemd-networkd[1485]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 6 02:53:14.812253 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 6 02:53:14.818863 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 6 02:53:14.860338 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 6 02:53:14.869029 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 6 02:53:14.877921 kernel: mlx5_core dd39:00:02.0 enP56633s1: Link up
Mar 6 02:53:14.901918 kernel: hv_netvsc 000d3ac6-007e-000d-3ac6-007e000d3ac6 eth0: Data path switched to VF: enP56633s1
Mar 6 02:53:14.903242 systemd-networkd[1485]: enP56633s1: Link UP
Mar 6 02:53:14.903410 systemd-networkd[1485]: eth0: Link UP
Mar 6 02:53:14.903415 systemd-networkd[1485]: eth0: Gained carrier
Mar 6 02:53:14.903434 systemd-networkd[1485]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 6 02:53:14.904781 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 6 02:53:14.913384 systemd-networkd[1485]: enP56633s1: Gained carrier
Mar 6 02:53:14.919949 systemd-networkd[1485]: eth0: DHCPv4 address 10.200.20.34/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 6 02:53:14.921590 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 6 02:53:15.081940 kernel: loop4: detected capacity change from 0 to 27936
Mar 6 02:53:15.094918 kernel: loop5: detected capacity change from 0 to 200864
Mar 6 02:53:15.110929 kernel: loop6: detected capacity change from 0 to 100632
Mar 6 02:53:15.123928 kernel: loop7: detected capacity change from 0 to 119840
Mar 6 02:53:15.133297 (sd-merge)[1613]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Mar 6 02:53:15.133650 (sd-merge)[1613]: Merged extensions into '/usr'.
Mar 6 02:53:15.137503 systemd[1]: Reload requested from client PID 1419 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 6 02:53:15.137697 systemd[1]: Reloading...
Mar 6 02:53:15.187049 zram_generator::config[1648]: No configuration found.
Mar 6 02:53:15.357559 systemd[1]: Reloading finished in 219 ms.
Mar 6 02:53:15.382958 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:53:15.388938 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 6 02:53:15.400847 systemd[1]: Starting ensure-sysext.service...
Mar 6 02:53:15.409397 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 6 02:53:15.422023 systemd[1]: Reload requested from client PID 1701 ('systemctl') (unit ensure-sysext.service)...
Mar 6 02:53:15.422035 systemd[1]: Reloading...
Mar 6 02:53:15.422152 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 6 02:53:15.422193 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 6 02:53:15.422367 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 6 02:53:15.422498 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 6 02:53:15.423239 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 6 02:53:15.423478 systemd-tmpfiles[1702]: ACLs are not supported, ignoring.
Mar 6 02:53:15.423588 systemd-tmpfiles[1702]: ACLs are not supported, ignoring.
Mar 6 02:53:15.426177 systemd-tmpfiles[1702]: Detected autofs mount point /boot during canonicalization of boot.
Mar 6 02:53:15.426273 systemd-tmpfiles[1702]: Skipping /boot
Mar 6 02:53:15.432413 systemd-tmpfiles[1702]: Detected autofs mount point /boot during canonicalization of boot.
Mar 6 02:53:15.432523 systemd-tmpfiles[1702]: Skipping /boot
Mar 6 02:53:15.475954 zram_generator::config[1729]: No configuration found.
Mar 6 02:53:15.625416 systemd[1]: Reloading finished in 203 ms.
Mar 6 02:53:15.637921 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 02:53:15.660041 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 6 02:53:15.667576 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 6 02:53:15.672873 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 6 02:53:15.676151 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 6 02:53:15.683066 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 6 02:53:15.693466 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 6 02:53:15.700252 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 6 02:53:15.700351 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 6 02:53:15.701167 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 6 02:53:15.718074 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 6 02:53:15.723952 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 6 02:53:15.730507 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 6 02:53:15.734335 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 6 02:53:15.739795 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 6 02:53:15.740016 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 6 02:53:15.745507 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 6 02:53:15.745722 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 6 02:53:15.759678 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 6 02:53:15.762184 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 6 02:53:15.771128 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 6 02:53:15.781809 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 6 02:53:15.789042 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 6 02:53:15.789147 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 6 02:53:15.792637 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 6 02:53:15.802556 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 6 02:53:15.803198 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 6 02:53:15.811772 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 6 02:53:15.812951 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 6 02:53:15.819547 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 6 02:53:15.819835 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 6 02:53:15.827608 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 6 02:53:15.829441 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 6 02:53:15.833321 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 6 02:53:15.836075 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 6 02:53:15.846417 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 6 02:53:15.862761 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 6 02:53:15.871095 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 6 02:53:15.875651 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 6 02:53:15.875792 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 6 02:53:15.876032 systemd[1]: Reached target time-set.target - System Time Set. Mar 6 02:53:15.876340 systemd-resolved[1797]: Positive Trust Anchors: Mar 6 02:53:15.876544 systemd-resolved[1797]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 6 02:53:15.876567 systemd-resolved[1797]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 6 02:53:15.881960 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 6 02:53:15.887387 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 6 02:53:15.887521 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 6 02:53:15.893448 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 6 02:53:15.893577 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 6 02:53:15.898598 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 6 02:53:15.898749 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 6 02:53:15.904535 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 6 02:53:15.905943 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Mar 6 02:53:15.912470 systemd[1]: Finished ensure-sysext.service. Mar 6 02:53:15.915069 augenrules[1832]: No rules Mar 6 02:53:15.916294 systemd[1]: audit-rules.service: Deactivated successfully. Mar 6 02:53:15.916437 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 6 02:53:15.917394 systemd-resolved[1797]: Using system hostname 'ci-4459.2.3-n-bf8f1184ca'. Mar 6 02:53:15.920938 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 6 02:53:15.930252 systemd[1]: Reached target network.target - Network. Mar 6 02:53:15.934241 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 6 02:53:15.939181 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 6 02:53:15.939217 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 6 02:53:16.259031 systemd-networkd[1485]: eth0: Gained IPv6LL Mar 6 02:53:16.261178 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 6 02:53:16.267000 systemd[1]: Reached target network-online.target - Network is Online. Mar 6 02:53:16.405746 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 6 02:53:16.412268 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 6 02:53:21.765505 ldconfig[1415]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 6 02:53:21.781797 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 6 02:53:21.788190 systemd[1]: Starting systemd-update-done.service - Update is Completed... 
Mar 6 02:53:22.034451 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 6 02:53:22.039361 systemd[1]: Reached target sysinit.target - System Initialization. Mar 6 02:53:22.044008 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 6 02:53:22.049278 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 6 02:53:22.054881 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 6 02:53:22.059585 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 6 02:53:22.065894 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 6 02:53:22.071354 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 6 02:53:22.071389 systemd[1]: Reached target paths.target - Path Units. Mar 6 02:53:22.075237 systemd[1]: Reached target timers.target - Timer Units. Mar 6 02:53:22.081126 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 6 02:53:22.087020 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 6 02:53:22.092526 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 6 02:53:22.098339 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 6 02:53:22.103681 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 6 02:53:22.110439 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 6 02:53:22.115112 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 6 02:53:22.120482 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 6 02:53:22.126072 systemd[1]: Reached target sockets.target - Socket Units. 
Mar 6 02:53:22.130197 systemd[1]: Reached target basic.target - Basic System. Mar 6 02:53:22.133806 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 6 02:53:22.133831 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 6 02:53:22.135935 systemd[1]: Starting chronyd.service - NTP client/server... Mar 6 02:53:22.148856 systemd[1]: Starting containerd.service - containerd container runtime... Mar 6 02:53:22.154564 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 6 02:53:22.161333 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 6 02:53:22.167960 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 6 02:53:22.184997 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 6 02:53:22.190335 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 6 02:53:22.196233 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 6 02:53:22.198961 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Mar 6 02:53:22.203963 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Mar 6 02:53:22.204861 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:53:22.212938 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 6 02:53:22.218076 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 6 02:53:22.233022 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 6 02:53:22.240058 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Mar 6 02:53:22.255455 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 6 02:53:22.261999 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 6 02:53:22.267183 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 6 02:53:22.267520 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 6 02:53:22.270041 systemd[1]: Starting update-engine.service - Update Engine... Mar 6 02:53:22.274777 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 6 02:53:22.427264 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 6 02:53:22.443238 KVP[1861]: KVP starting; pid is:1861 Mar 6 02:53:22.447669 chronyd[1851]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Mar 6 02:53:22.447935 KVP[1861]: KVP LIC Version: 3.1 Mar 6 02:53:22.448937 kernel: hv_utils: KVP IC version 4.0 Mar 6 02:53:22.539870 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 6 02:53:22.540062 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 6 02:53:22.562541 jq[1859]: false Mar 6 02:53:22.563073 jq[1870]: true Mar 6 02:53:22.564312 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 6 02:53:22.565954 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 6 02:53:22.577195 jq[1889]: true Mar 6 02:53:22.850075 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 6 02:53:22.855584 (kubelet)[1912]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 02:53:22.984317 systemd[1]: motdgen.service: Deactivated successfully. Mar 6 02:53:22.987217 (ntainerd)[1920]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 6 02:53:22.987455 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 6 02:53:22.993008 chronyd[1851]: Timezone right/UTC failed leap second check, ignoring Mar 6 02:53:22.993175 chronyd[1851]: Loaded seccomp filter (level 2) Mar 6 02:53:22.995436 systemd[1]: Started chronyd.service - NTP client/server. Mar 6 02:53:23.005652 systemd-logind[1868]: New seat seat0. Mar 6 02:53:23.007982 systemd-logind[1868]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Mar 6 02:53:23.008561 systemd[1]: Started systemd-logind.service - User Login Management. 
Mar 6 02:53:23.015572 tar[1873]: linux-arm64/LICENSE Mar 6 02:53:23.015572 tar[1873]: linux-arm64/helm Mar 6 02:53:23.085245 extend-filesystems[1860]: Found /dev/sda6 Mar 6 02:53:23.182250 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 02:53:23.530479 extend-filesystems[1860]: Found /dev/sda9 Mar 6 02:53:23.530479 extend-filesystems[1860]: Checking size of /dev/sda9 Mar 6 02:53:23.537312 update_engine[1869]: I20260306 02:53:23.089175 1869 main.cc:92] Flatcar Update Engine starting Mar 6 02:53:23.537312 update_engine[1869]: I20260306 02:53:23.464548 1869 update_check_scheduler.cc:74] Next update check in 3m37s Mar 6 02:53:23.452657 dbus-daemon[1854]: [system] SELinux support is enabled Mar 6 02:53:23.538003 kubelet[1912]: E0306 02:53:23.180429 1912 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 02:53:23.182356 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 02:53:23.459560 dbus-daemon[1854]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 6 02:53:23.182610 systemd[1]: kubelet.service: Consumed 489ms CPU time, 248.9M memory peak. Mar 6 02:53:23.249053 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 6 02:53:23.452837 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 6 02:53:23.459362 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 6 02:53:23.459394 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Mar 6 02:53:23.464965 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 6 02:53:23.464979 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 6 02:53:23.470036 systemd[1]: Started update-engine.service - Update Engine. Mar 6 02:53:23.475254 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 6 02:53:23.582414 extend-filesystems[1860]: Old size kept for /dev/sda9 Mar 6 02:53:23.586251 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 6 02:53:23.586424 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 6 02:53:23.689986 coreos-metadata[1853]: Mar 06 02:53:23.689 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 6 02:53:23.734254 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 6 02:53:24.083004 coreos-metadata[1853]: Mar 06 02:53:23.694 INFO Fetch successful Mar 6 02:53:24.083004 coreos-metadata[1853]: Mar 06 02:53:23.694 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Mar 6 02:53:24.083004 coreos-metadata[1853]: Mar 06 02:53:23.698 INFO Fetch successful Mar 6 02:53:24.083004 coreos-metadata[1853]: Mar 06 02:53:23.698 INFO Fetching http://168.63.129.16/machine/e3bdb23d-46b9-4143-81c7-0dce6b1fa10b/213514f2%2D5c3d%2D4dea%2D8238%2Dcf079c667dea.%5Fci%2D4459.2.3%2Dn%2Dbf8f1184ca?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Mar 6 02:53:24.083004 coreos-metadata[1853]: Mar 06 02:53:23.702 INFO Fetch successful Mar 6 02:53:24.083004 coreos-metadata[1853]: Mar 06 02:53:23.703 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Mar 6 02:53:24.083004 coreos-metadata[1853]: Mar 06 02:53:23.711 INFO Fetch successful Mar 6 02:53:23.740091 systemd[1]: packet-phone-home.service - Report Success to Packet 
was skipped because no trigger condition checks were met. Mar 6 02:53:24.338985 tar[1873]: linux-arm64/README.md Mar 6 02:53:24.353421 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 6 02:53:24.695984 locksmithd[1953]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 6 02:53:25.122660 sshd_keygen[1885]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 6 02:53:25.129098 bash[1906]: Updated "/home/core/.ssh/authorized_keys" Mar 6 02:53:25.130995 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 6 02:53:25.138149 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Mar 6 02:53:25.141946 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 6 02:53:25.148620 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 6 02:53:25.157064 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Mar 6 02:53:25.164795 systemd[1]: issuegen.service: Deactivated successfully. Mar 6 02:53:25.174355 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 6 02:53:25.181301 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 6 02:53:25.196045 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Mar 6 02:53:25.203040 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 6 02:53:25.209635 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 6 02:53:25.217590 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 6 02:53:25.222561 systemd[1]: Reached target getty.target - Login Prompts. 
Mar 6 02:53:26.046216 containerd[1920]: time="2026-03-06T02:53:26Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 6 02:53:26.047013 containerd[1920]: time="2026-03-06T02:53:26.046988196Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 6 02:53:26.052371 containerd[1920]: time="2026-03-06T02:53:26.052341548Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.128µs" Mar 6 02:53:26.052439 containerd[1920]: time="2026-03-06T02:53:26.052427612Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 6 02:53:26.052478 containerd[1920]: time="2026-03-06T02:53:26.052469892Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 6 02:53:26.052658 containerd[1920]: time="2026-03-06T02:53:26.052644356Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 6 02:53:26.052718 containerd[1920]: time="2026-03-06T02:53:26.052705748Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 6 02:53:26.052833 containerd[1920]: time="2026-03-06T02:53:26.052821076Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 6 02:53:26.052957 containerd[1920]: time="2026-03-06T02:53:26.052941236Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053007 containerd[1920]: time="2026-03-06T02:53:26.052994676Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053245 
containerd[1920]: time="2026-03-06T02:53:26.053220668Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053300 containerd[1920]: time="2026-03-06T02:53:26.053287444Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053356 containerd[1920]: time="2026-03-06T02:53:26.053343628Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053402 containerd[1920]: time="2026-03-06T02:53:26.053389988Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053524 containerd[1920]: time="2026-03-06T02:53:26.053509892Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053766 containerd[1920]: time="2026-03-06T02:53:26.053745380Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053854 containerd[1920]: time="2026-03-06T02:53:26.053842060Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 6 02:53:26.053939 containerd[1920]: time="2026-03-06T02:53:26.053927316Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 6 02:53:26.054024 containerd[1920]: time="2026-03-06T02:53:26.054012252Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 6 02:53:26.054242 containerd[1920]: 
time="2026-03-06T02:53:26.054223012Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 6 02:53:26.054355 containerd[1920]: time="2026-03-06T02:53:26.054340628Z" level=info msg="metadata content store policy set" policy=shared Mar 6 02:53:26.434915 containerd[1920]: time="2026-03-06T02:53:26.434792956Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 6 02:53:26.435112 containerd[1920]: time="2026-03-06T02:53:26.435096668Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435164868Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435180508Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435189508Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435198308Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435208172Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435215980Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435224100Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435231844Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435239292Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435255804Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435408844Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435423916Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435434380Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 6 02:53:26.436310 containerd[1920]: time="2026-03-06T02:53:26.435442364Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435450276Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435457012Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435463748Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435471900Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435479444Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 6 
02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435486268Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435492444Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435538628Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435553876Z" level=info msg="Start snapshots syncer" Mar 6 02:53:26.436619 containerd[1920]: time="2026-03-06T02:53:26.435580356Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 6 02:53:26.436744 containerd[1920]: time="2026-03-06T02:53:26.435764612Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":fals
e,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 6 02:53:26.436744 containerd[1920]: time="2026-03-06T02:53:26.435803380Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.435837212Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.435940004Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.435960348Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.435969180Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.435977228Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.435984804Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 6 02:53:26.436823 containerd[1920]: 
time="2026-03-06T02:53:26.435991908Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.435998388Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.436016180Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.436024476Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.436031076Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.436059332Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.436071988Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 6 02:53:26.436823 containerd[1920]: time="2026-03-06T02:53:26.436078068Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436083684Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436089532Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436097812Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436104260Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436117404Z" level=info msg="runtime interface created" Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436120676Z" level=info msg="created NRI interface" Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436125916Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436136188Z" level=info msg="Connect containerd service" Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436151140Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 6 02:53:26.437087 containerd[1920]: time="2026-03-06T02:53:26.436735788Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 6 02:53:28.975912 containerd[1920]: time="2026-03-06T02:53:28.975839012Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 6 02:53:28.975912 containerd[1920]: time="2026-03-06T02:53:28.975924124Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 6 02:53:28.976241 containerd[1920]: time="2026-03-06T02:53:28.976100196Z" level=info msg="Start subscribing containerd event" Mar 6 02:53:28.976241 containerd[1920]: time="2026-03-06T02:53:28.976141204Z" level=info msg="Start recovering state" Mar 6 02:53:28.976241 containerd[1920]: time="2026-03-06T02:53:28.976229644Z" level=info msg="Start event monitor" Mar 6 02:53:28.976241 containerd[1920]: time="2026-03-06T02:53:28.976238916Z" level=info msg="Start cni network conf syncer for default" Mar 6 02:53:28.976303 containerd[1920]: time="2026-03-06T02:53:28.976243868Z" level=info msg="Start streaming server" Mar 6 02:53:28.976303 containerd[1920]: time="2026-03-06T02:53:28.976252292Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 6 02:53:28.976303 containerd[1920]: time="2026-03-06T02:53:28.976257932Z" level=info msg="runtime interface starting up..." Mar 6 02:53:28.976303 containerd[1920]: time="2026-03-06T02:53:28.976261604Z" level=info msg="starting plugins..." Mar 6 02:53:28.976303 containerd[1920]: time="2026-03-06T02:53:28.976274564Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 6 02:53:28.977924 containerd[1920]: time="2026-03-06T02:53:28.976377908Z" level=info msg="containerd successfully booted in 2.930533s" Mar 6 02:53:28.976535 systemd[1]: Started containerd.service - containerd container runtime. Mar 6 02:53:28.982390 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 6 02:53:28.991964 systemd[1]: Startup finished in 1.718s (kernel) + 13.216s (initrd) + 30.213s (userspace) = 45.149s. Mar 6 02:53:32.582959 login[2050]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Mar 6 02:53:32.630162 login[2049]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:53:32.635833 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
Mar 6 02:53:32.637018 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 6 02:53:32.643167 systemd-logind[1868]: New session 1 of user core. Mar 6 02:53:32.701094 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 6 02:53:32.705129 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 6 02:53:32.714866 (systemd)[2073]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 6 02:53:32.716754 systemd-logind[1868]: New session c1 of user core. Mar 6 02:53:33.155355 systemd[2073]: Queued start job for default target default.target. Mar 6 02:53:33.166072 systemd[2073]: Created slice app.slice - User Application Slice. Mar 6 02:53:33.166097 systemd[2073]: Reached target paths.target - Paths. Mar 6 02:53:33.166129 systemd[2073]: Reached target timers.target - Timers. Mar 6 02:53:33.167096 systemd[2073]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 6 02:53:33.175053 systemd[2073]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 6 02:53:33.175172 systemd[2073]: Reached target sockets.target - Sockets. Mar 6 02:53:33.175280 systemd[2073]: Reached target basic.target - Basic System. Mar 6 02:53:33.175380 systemd[2073]: Reached target default.target - Main User Target. Mar 6 02:53:33.175399 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 6 02:53:33.175499 systemd[2073]: Startup finished in 454ms. Mar 6 02:53:33.176328 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 6 02:53:33.208075 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 6 02:53:33.209259 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:53:33.584184 login[2050]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:53:33.587836 systemd-logind[1868]: New session 2 of user core. 
Mar 6 02:53:33.599021 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 6 02:53:34.980870 waagent[2047]: 2026-03-06T02:53:34.980792Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Mar 6 02:53:34.985021 waagent[2047]: 2026-03-06T02:53:34.984971Z INFO Daemon Daemon OS: flatcar 4459.2.3 Mar 6 02:53:34.995025 waagent[2047]: 2026-03-06T02:53:34.988667Z INFO Daemon Daemon Python: 3.11.13 Mar 6 02:53:34.995025 waagent[2047]: 2026-03-06T02:53:34.992128Z INFO Daemon Daemon Run daemon Mar 6 02:53:34.995602 waagent[2047]: 2026-03-06T02:53:34.995511Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4459.2.3' Mar 6 02:53:35.002793 waagent[2047]: 2026-03-06T02:53:35.002756Z INFO Daemon Daemon Using waagent for provisioning Mar 6 02:53:35.006748 waagent[2047]: 2026-03-06T02:53:35.006715Z INFO Daemon Daemon Activate resource disk Mar 6 02:53:35.010344 waagent[2047]: 2026-03-06T02:53:35.010302Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Mar 6 02:53:35.018692 waagent[2047]: 2026-03-06T02:53:35.018652Z INFO Daemon Daemon Found device: None Mar 6 02:53:35.022747 waagent[2047]: 2026-03-06T02:53:35.022708Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Mar 6 02:53:35.029733 waagent[2047]: 2026-03-06T02:53:35.029605Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Mar 6 02:53:35.038318 waagent[2047]: 2026-03-06T02:53:35.038275Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 6 02:53:35.042813 waagent[2047]: 2026-03-06T02:53:35.042778Z INFO Daemon Daemon Running default provisioning handler Mar 6 02:53:35.052337 waagent[2047]: 2026-03-06T02:53:35.052290Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned 
non-zero exit status 4. Mar 6 02:53:35.063180 waagent[2047]: 2026-03-06T02:53:35.063140Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 6 02:53:35.070641 waagent[2047]: 2026-03-06T02:53:35.070607Z INFO Daemon Daemon cloud-init is enabled: False Mar 6 02:53:35.074487 waagent[2047]: 2026-03-06T02:53:35.074456Z INFO Daemon Daemon Copying ovf-env.xml Mar 6 02:53:35.195278 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 02:53:35.198080 (kubelet)[2117]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 02:53:35.229785 kubelet[2117]: E0306 02:53:35.229722 2117 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 02:53:35.232559 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 02:53:35.232773 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 02:53:35.233214 systemd[1]: kubelet.service: Consumed 113ms CPU time, 107.6M memory peak. Mar 6 02:53:35.471744 waagent[2047]: 2026-03-06T02:53:35.471674Z INFO Daemon Daemon Successfully mounted dvd Mar 6 02:53:35.508994 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Mar 6 02:53:35.511173 waagent[2047]: 2026-03-06T02:53:35.511111Z INFO Daemon Daemon Detect protocol endpoint Mar 6 02:53:35.514866 waagent[2047]: 2026-03-06T02:53:35.514832Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 6 02:53:35.519110 waagent[2047]: 2026-03-06T02:53:35.519077Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler Mar 6 02:53:35.523950 waagent[2047]: 2026-03-06T02:53:35.523921Z INFO Daemon Daemon Test for route to 168.63.129.16 Mar 6 02:53:35.528354 waagent[2047]: 2026-03-06T02:53:35.528319Z INFO Daemon Daemon Route to 168.63.129.16 exists Mar 6 02:53:35.532186 waagent[2047]: 2026-03-06T02:53:35.532161Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Mar 6 02:53:35.598894 waagent[2047]: 2026-03-06T02:53:35.598854Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Mar 6 02:53:35.603778 waagent[2047]: 2026-03-06T02:53:35.603758Z INFO Daemon Daemon Wire protocol version:2012-11-30 Mar 6 02:53:35.607820 waagent[2047]: 2026-03-06T02:53:35.607797Z INFO Daemon Daemon Server preferred version:2015-04-05 Mar 6 02:53:35.725004 waagent[2047]: 2026-03-06T02:53:35.724923Z INFO Daemon Daemon Initializing goal state during protocol detection Mar 6 02:53:35.730052 waagent[2047]: 2026-03-06T02:53:35.730021Z INFO Daemon Daemon Forcing an update of the goal state. Mar 6 02:53:35.737949 waagent[2047]: 2026-03-06T02:53:35.737911Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 6 02:53:35.753978 waagent[2047]: 2026-03-06T02:53:35.753949Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.179 Mar 6 02:53:35.758309 waagent[2047]: 2026-03-06T02:53:35.758277Z INFO Daemon Mar 6 02:53:35.760568 waagent[2047]: 2026-03-06T02:53:35.760514Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 10819b73-e7ef-4b06-a295-2951d72259c1 eTag: 6755371444665161702 source: Fabric] Mar 6 02:53:35.769066 waagent[2047]: 2026-03-06T02:53:35.769037Z INFO Daemon The vmSettings originated via Fabric; will ignore them. 
Mar 6 02:53:35.774006 waagent[2047]: 2026-03-06T02:53:35.773977Z INFO Daemon Mar 6 02:53:35.776100 waagent[2047]: 2026-03-06T02:53:35.776073Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Mar 6 02:53:35.785568 waagent[2047]: 2026-03-06T02:53:35.785537Z INFO Daemon Daemon Downloading artifacts profile blob Mar 6 02:53:35.841285 waagent[2047]: 2026-03-06T02:53:35.841221Z INFO Daemon Downloaded certificate {'thumbprint': 'FE57253C2E76E38DD097205509CC6E1420797CBD', 'hasPrivateKey': True} Mar 6 02:53:35.848981 waagent[2047]: 2026-03-06T02:53:35.848942Z INFO Daemon Fetch goal state completed Mar 6 02:53:35.858756 waagent[2047]: 2026-03-06T02:53:35.858723Z INFO Daemon Daemon Starting provisioning Mar 6 02:53:35.862723 waagent[2047]: 2026-03-06T02:53:35.862689Z INFO Daemon Daemon Handle ovf-env.xml. Mar 6 02:53:35.866321 waagent[2047]: 2026-03-06T02:53:35.866296Z INFO Daemon Daemon Set hostname [ci-4459.2.3-n-bf8f1184ca] Mar 6 02:53:35.872235 waagent[2047]: 2026-03-06T02:53:35.872192Z INFO Daemon Daemon Publish hostname [ci-4459.2.3-n-bf8f1184ca] Mar 6 02:53:35.877658 waagent[2047]: 2026-03-06T02:53:35.877621Z INFO Daemon Daemon Examine /proc/net/route for primary interface Mar 6 02:53:35.882394 waagent[2047]: 2026-03-06T02:53:35.882365Z INFO Daemon Daemon Primary interface is [eth0] Mar 6 02:53:35.891817 systemd-networkd[1485]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:53:35.891825 systemd-networkd[1485]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Mar 6 02:53:35.891856 systemd-networkd[1485]: eth0: DHCP lease lost Mar 6 02:53:35.892730 waagent[2047]: 2026-03-06T02:53:35.892689Z INFO Daemon Daemon Create user account if not exists Mar 6 02:53:35.897301 waagent[2047]: 2026-03-06T02:53:35.897267Z INFO Daemon Daemon User core already exists, skip useradd Mar 6 02:53:35.901587 waagent[2047]: 2026-03-06T02:53:35.901549Z INFO Daemon Daemon Configure sudoer Mar 6 02:53:35.909818 waagent[2047]: 2026-03-06T02:53:35.909775Z INFO Daemon Daemon Configure sshd Mar 6 02:53:35.914941 systemd-networkd[1485]: eth0: DHCPv4 address 10.200.20.34/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 6 02:53:35.916877 waagent[2047]: 2026-03-06T02:53:35.916837Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Mar 6 02:53:35.926973 waagent[2047]: 2026-03-06T02:53:35.926933Z INFO Daemon Daemon Deploy ssh public key. Mar 6 02:53:37.058198 waagent[2047]: 2026-03-06T02:53:37.054524Z INFO Daemon Daemon Provisioning complete Mar 6 02:53:37.068720 waagent[2047]: 2026-03-06T02:53:37.068673Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Mar 6 02:53:37.073801 waagent[2047]: 2026-03-06T02:53:37.073758Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. 
Mar 6 02:53:37.081936 waagent[2047]: 2026-03-06T02:53:37.081892Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Mar 6 02:53:37.181940 waagent[2140]: 2026-03-06T02:53:37.181663Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Mar 6 02:53:37.181940 waagent[2140]: 2026-03-06T02:53:37.181798Z INFO ExtHandler ExtHandler OS: flatcar 4459.2.3 Mar 6 02:53:37.181940 waagent[2140]: 2026-03-06T02:53:37.181834Z INFO ExtHandler ExtHandler Python: 3.11.13 Mar 6 02:53:37.181940 waagent[2140]: 2026-03-06T02:53:37.181868Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Mar 6 02:53:37.224937 waagent[2140]: 2026-03-06T02:53:37.224575Z INFO ExtHandler ExtHandler Distro: flatcar-4459.2.3; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Mar 6 02:53:37.224937 waagent[2140]: 2026-03-06T02:53:37.224778Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 6 02:53:37.224937 waagent[2140]: 2026-03-06T02:53:37.224820Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 6 02:53:37.230883 waagent[2140]: 2026-03-06T02:53:37.230834Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 6 02:53:37.236148 waagent[2140]: 2026-03-06T02:53:37.236110Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.179 Mar 6 02:53:37.236616 waagent[2140]: 2026-03-06T02:53:37.236583Z INFO ExtHandler Mar 6 02:53:37.236749 waagent[2140]: 2026-03-06T02:53:37.236726Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: d89df7d5-6d15-4b1b-babb-86cd46fe4fae eTag: 6755371444665161702 source: Fabric] Mar 6 02:53:37.237089 waagent[2140]: 2026-03-06T02:53:37.237059Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
Mar 6 02:53:37.237593 waagent[2140]: 2026-03-06T02:53:37.237561Z INFO ExtHandler Mar 6 02:53:37.237705 waagent[2140]: 2026-03-06T02:53:37.237683Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Mar 6 02:53:37.241078 waagent[2140]: 2026-03-06T02:53:37.241052Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 6 02:53:37.293958 waagent[2140]: 2026-03-06T02:53:37.293352Z INFO ExtHandler Downloaded certificate {'thumbprint': 'FE57253C2E76E38DD097205509CC6E1420797CBD', 'hasPrivateKey': True} Mar 6 02:53:37.293958 waagent[2140]: 2026-03-06T02:53:37.293729Z INFO ExtHandler Fetch goal state completed Mar 6 02:53:37.305856 waagent[2140]: 2026-03-06T02:53:37.305807Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.4 27 Jan 2026 (Library: OpenSSL 3.4.4 27 Jan 2026) Mar 6 02:53:37.309126 waagent[2140]: 2026-03-06T02:53:37.309051Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2140 Mar 6 02:53:37.309195 waagent[2140]: 2026-03-06T02:53:37.309169Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Mar 6 02:53:37.309427 waagent[2140]: 2026-03-06T02:53:37.309399Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Mar 6 02:53:37.310499 waagent[2140]: 2026-03-06T02:53:37.310463Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4459.2.3', '', 'Flatcar Container Linux by Kinvolk'] Mar 6 02:53:37.310812 waagent[2140]: 2026-03-06T02:53:37.310782Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4459.2.3', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Mar 6 02:53:37.310949 waagent[2140]: 2026-03-06T02:53:37.310923Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Mar 6 02:53:37.311366 waagent[2140]: 2026-03-06T02:53:37.311335Z INFO ExtHandler ExtHandler Starting setup 
for Persistent firewall rules Mar 6 02:53:37.458405 waagent[2140]: 2026-03-06T02:53:37.458365Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 6 02:53:37.458583 waagent[2140]: 2026-03-06T02:53:37.458554Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 6 02:53:37.463359 waagent[2140]: 2026-03-06T02:53:37.463328Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Mar 6 02:53:37.467840 systemd[1]: Reload requested from client PID 2155 ('systemctl') (unit waagent.service)... Mar 6 02:53:37.468077 systemd[1]: Reloading... Mar 6 02:53:37.546088 zram_generator::config[2193]: No configuration found. Mar 6 02:53:37.691194 systemd[1]: Reloading finished in 222 ms. Mar 6 02:53:37.703510 waagent[2140]: 2026-03-06T02:53:37.703129Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Mar 6 02:53:37.703510 waagent[2140]: 2026-03-06T02:53:37.703269Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Mar 6 02:53:37.988329 waagent[2140]: 2026-03-06T02:53:37.988256Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Mar 6 02:53:37.988595 waagent[2140]: 2026-03-06T02:53:37.988563Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Mar 6 02:53:37.989242 waagent[2140]: 2026-03-06T02:53:37.989200Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 6 02:53:37.989555 waagent[2140]: 2026-03-06T02:53:37.989480Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. 
Mar 6 02:53:37.989956 waagent[2140]: 2026-03-06T02:53:37.989713Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 6 02:53:37.989956 waagent[2140]: 2026-03-06T02:53:37.989781Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 6 02:53:37.990250 waagent[2140]: 2026-03-06T02:53:37.990203Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Mar 6 02:53:37.990311 waagent[2140]: 2026-03-06T02:53:37.990243Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Mar 6 02:53:37.990346 waagent[2140]: 2026-03-06T02:53:37.990137Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Mar 6 02:53:37.990468 waagent[2140]: 2026-03-06T02:53:37.990435Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 6 02:53:37.990585 waagent[2140]: 2026-03-06T02:53:37.990488Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
Mar 6 02:53:37.991178 waagent[2140]: 2026-03-06T02:53:37.991123Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 6 02:53:37.991306 waagent[2140]: 2026-03-06T02:53:37.991276Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 6 02:53:37.991653 waagent[2140]: 2026-03-06T02:53:37.991631Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 6 02:53:37.991653 waagent[2140]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 6 02:53:37.991653 waagent[2140]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Mar 6 02:53:37.991653 waagent[2140]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 6 02:53:37.991653 waagent[2140]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 6 02:53:37.991653 waagent[2140]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 6 02:53:37.991653 waagent[2140]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 6 02:53:37.991868 waagent[2140]: 2026-03-06T02:53:37.991816Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 6 02:53:37.992355 waagent[2140]: 2026-03-06T02:53:37.992259Z INFO EnvHandler ExtHandler Configure routes Mar 6 02:53:37.992861 waagent[2140]: 2026-03-06T02:53:37.992820Z INFO EnvHandler ExtHandler Gateway:None Mar 6 02:53:37.993456 waagent[2140]: 2026-03-06T02:53:37.993275Z INFO EnvHandler ExtHandler Routes:None Mar 6 02:53:37.998236 waagent[2140]: 2026-03-06T02:53:37.998205Z INFO ExtHandler ExtHandler Mar 6 02:53:37.998374 waagent[2140]: 2026-03-06T02:53:37.998350Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: dbdd4a2c-edf4-4b3f-a9ab-a381ee2ae0f6 correlation 02faf538-66fb-4aeb-8305-8f3edb918977 created: 2026-03-06T02:52:13.322037Z] Mar 6 02:53:37.998726 waagent[2140]: 2026-03-06T02:53:37.998696Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 6 02:53:37.999229 waagent[2140]: 2026-03-06T02:53:37.999198Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Mar 6 02:53:38.022655 waagent[2140]: 2026-03-06T02:53:38.022603Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Mar 6 02:53:38.022655 waagent[2140]: Try `iptables -h' or 'iptables --help' for more information.) Mar 6 02:53:38.023126 waagent[2140]: 2026-03-06T02:53:38.023091Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: BDB83B0C-B3F7-440E-9EE1-1A6F000AE4FF;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Mar 6 02:53:38.043734 waagent[2140]: 2026-03-06T02:53:38.043377Z INFO MonitorHandler ExtHandler Network interfaces: Mar 6 02:53:38.043734 waagent[2140]: Executing ['ip', '-a', '-o', 'link']: Mar 6 02:53:38.043734 waagent[2140]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 6 02:53:38.043734 waagent[2140]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c6:00:7e brd ff:ff:ff:ff:ff:ff Mar 6 02:53:38.043734 waagent[2140]: 3: enP56633s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c6:00:7e brd ff:ff:ff:ff:ff:ff\ altname enP56633p0s2 Mar 6 02:53:38.043734 waagent[2140]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 6 02:53:38.043734 waagent[2140]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 6 02:53:38.043734 waagent[2140]: 2: eth0 inet 10.200.20.34/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 6 02:53:38.043734 waagent[2140]: Executing ['ip', '-6', '-a', '-o', 
'address']: Mar 6 02:53:38.043734 waagent[2140]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Mar 6 02:53:38.043734 waagent[2140]: 2: eth0 inet6 fe80::20d:3aff:fec6:7e/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Mar 6 02:53:38.101649 waagent[2140]: 2026-03-06T02:53:38.101608Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Mar 6 02:53:38.101649 waagent[2140]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:53:38.101649 waagent[2140]: pkts bytes target prot opt in out source destination Mar 6 02:53:38.101649 waagent[2140]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:53:38.101649 waagent[2140]: pkts bytes target prot opt in out source destination Mar 6 02:53:38.101649 waagent[2140]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:53:38.101649 waagent[2140]: pkts bytes target prot opt in out source destination Mar 6 02:53:38.101649 waagent[2140]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 6 02:53:38.101649 waagent[2140]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 6 02:53:38.101649 waagent[2140]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 6 02:53:38.104249 waagent[2140]: 2026-03-06T02:53:38.104216Z INFO EnvHandler ExtHandler Current Firewall rules: Mar 6 02:53:38.104249 waagent[2140]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:53:38.104249 waagent[2140]: pkts bytes target prot opt in out source destination Mar 6 02:53:38.104249 waagent[2140]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:53:38.104249 waagent[2140]: pkts bytes target prot opt in out source destination Mar 6 02:53:38.104249 waagent[2140]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:53:38.104249 waagent[2140]: pkts bytes target prot opt in out source destination Mar 6 02:53:38.104249 waagent[2140]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 6 02:53:38.104249 
waagent[2140]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 6 02:53:38.104249 waagent[2140]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 6 02:53:38.104640 waagent[2140]: 2026-03-06T02:53:38.104616Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Mar 6 02:53:45.458018 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 6 02:53:45.459277 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:53:46.818492 chronyd[1851]: Selected source PHC0 Mar 6 02:53:47.951982 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 02:53:47.955268 (kubelet)[2291]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 02:53:47.978850 kubelet[2291]: E0306 02:53:47.978805 2291 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 02:53:47.980810 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 02:53:47.980942 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 02:53:47.981378 systemd[1]: kubelet.service: Consumed 105ms CPU time, 106.9M memory peak. Mar 6 02:53:58.208056 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 6 02:53:58.209703 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:53:58.309523 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 6 02:53:58.317109 (kubelet)[2305]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 02:53:58.344073 kubelet[2305]: E0306 02:53:58.344018 2305 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 02:53:58.346119 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 02:53:58.346331 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 02:53:58.347981 systemd[1]: kubelet.service: Consumed 105ms CPU time, 106.7M memory peak. Mar 6 02:53:59.581814 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 6 02:53:59.583366 systemd[1]: Started sshd@0-10.200.20.34:22-10.200.16.10:60190.service - OpenSSH per-connection server daemon (10.200.16.10:60190). Mar 6 02:54:01.039603 sshd[2313]: Accepted publickey for core from 10.200.16.10 port 60190 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:54:01.040335 sshd-session[2313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:54:01.043586 systemd-logind[1868]: New session 3 of user core. Mar 6 02:54:01.054194 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 6 02:54:01.330123 systemd[1]: Started sshd@1-10.200.20.34:22-10.200.16.10:42026.service - OpenSSH per-connection server daemon (10.200.16.10:42026). Mar 6 02:54:01.743863 sshd[2319]: Accepted publickey for core from 10.200.16.10 port 42026 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:54:01.744957 sshd-session[2319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:54:01.748663 systemd-logind[1868]: New session 4 of user core. 
Mar 6 02:54:01.756045 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 6 02:54:01.964858 sshd[2322]: Connection closed by 10.200.16.10 port 42026 Mar 6 02:54:01.965469 sshd-session[2319]: pam_unix(sshd:session): session closed for user core Mar 6 02:54:01.969082 systemd-logind[1868]: Session 4 logged out. Waiting for processes to exit. Mar 6 02:54:01.969373 systemd[1]: sshd@1-10.200.20.34:22-10.200.16.10:42026.service: Deactivated successfully. Mar 6 02:54:01.970677 systemd[1]: session-4.scope: Deactivated successfully. Mar 6 02:54:01.972311 systemd-logind[1868]: Removed session 4. Mar 6 02:54:02.055551 systemd[1]: Started sshd@2-10.200.20.34:22-10.200.16.10:42028.service - OpenSSH per-connection server daemon (10.200.16.10:42028). Mar 6 02:54:02.451028 sshd[2328]: Accepted publickey for core from 10.200.16.10 port 42028 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:54:02.708361 kernel: hv_balloon: Max. dynamic memory size: 4096 MB Mar 6 02:54:02.880836 sshd-session[2328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:54:02.884320 systemd-logind[1868]: New session 5 of user core. Mar 6 02:54:02.891027 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 6 02:54:03.039667 sshd[2331]: Connection closed by 10.200.16.10 port 42028 Mar 6 02:54:03.039221 sshd-session[2328]: pam_unix(sshd:session): session closed for user core Mar 6 02:54:03.042189 systemd[1]: sshd@2-10.200.20.34:22-10.200.16.10:42028.service: Deactivated successfully. Mar 6 02:54:03.043667 systemd[1]: session-5.scope: Deactivated successfully. Mar 6 02:54:03.044353 systemd-logind[1868]: Session 5 logged out. Waiting for processes to exit. Mar 6 02:54:03.045667 systemd-logind[1868]: Removed session 5. Mar 6 02:54:03.115524 systemd[1]: Started sshd@3-10.200.20.34:22-10.200.16.10:42036.service - OpenSSH per-connection server daemon (10.200.16.10:42036). 
Mar 6 02:54:03.494335 sshd[2337]: Accepted publickey for core from 10.200.16.10 port 42036 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:54:03.495346 sshd-session[2337]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:54:03.498790 systemd-logind[1868]: New session 6 of user core.
Mar 6 02:54:03.506119 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 6 02:54:03.699432 sshd[2340]: Connection closed by 10.200.16.10 port 42036
Mar 6 02:54:03.700023 sshd-session[2337]: pam_unix(sshd:session): session closed for user core
Mar 6 02:54:03.703358 systemd[1]: sshd@3-10.200.20.34:22-10.200.16.10:42036.service: Deactivated successfully.
Mar 6 02:54:03.704656 systemd[1]: session-6.scope: Deactivated successfully.
Mar 6 02:54:03.705330 systemd-logind[1868]: Session 6 logged out. Waiting for processes to exit.
Mar 6 02:54:03.706464 systemd-logind[1868]: Removed session 6.
Mar 6 02:54:03.806398 systemd[1]: Started sshd@4-10.200.20.34:22-10.200.16.10:42038.service - OpenSSH per-connection server daemon (10.200.16.10:42038).
Mar 6 02:54:04.267519 sshd[2346]: Accepted publickey for core from 10.200.16.10 port 42038 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:54:04.268581 sshd-session[2346]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:54:04.272024 systemd-logind[1868]: New session 7 of user core.
Mar 6 02:54:04.283214 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 6 02:54:05.020607 sudo[2350]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 6 02:54:05.020829 sudo[2350]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 02:54:05.033236 sudo[2350]: pam_unix(sudo:session): session closed for user root
Mar 6 02:54:05.119057 sshd[2349]: Connection closed by 10.200.16.10 port 42038
Mar 6 02:54:05.119752 sshd-session[2346]: pam_unix(sshd:session): session closed for user core
Mar 6 02:54:05.123183 systemd[1]: sshd@4-10.200.20.34:22-10.200.16.10:42038.service: Deactivated successfully.
Mar 6 02:54:05.124814 systemd[1]: session-7.scope: Deactivated successfully.
Mar 6 02:54:05.125649 systemd-logind[1868]: Session 7 logged out. Waiting for processes to exit.
Mar 6 02:54:05.127120 systemd-logind[1868]: Removed session 7.
Mar 6 02:54:05.197113 systemd[1]: Started sshd@5-10.200.20.34:22-10.200.16.10:42040.service - OpenSSH per-connection server daemon (10.200.16.10:42040).
Mar 6 02:54:05.615830 sshd[2356]: Accepted publickey for core from 10.200.16.10 port 42040 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:54:05.616582 sshd-session[2356]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:54:05.620465 systemd-logind[1868]: New session 8 of user core.
Mar 6 02:54:05.631034 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 6 02:54:05.768526 sudo[2361]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 6 02:54:05.769107 sudo[2361]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 02:54:06.030034 sudo[2361]: pam_unix(sudo:session): session closed for user root
Mar 6 02:54:06.034261 sudo[2360]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 6 02:54:06.034468 sudo[2360]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 02:54:06.041190 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 6 02:54:06.067148 augenrules[2383]: No rules
Mar 6 02:54:06.068269 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 6 02:54:06.068455 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 6 02:54:06.069403 sudo[2360]: pam_unix(sudo:session): session closed for user root
Mar 6 02:54:06.144947 sshd[2359]: Connection closed by 10.200.16.10 port 42040
Mar 6 02:54:06.145626 sshd-session[2356]: pam_unix(sshd:session): session closed for user core
Mar 6 02:54:06.148981 systemd[1]: sshd@5-10.200.20.34:22-10.200.16.10:42040.service: Deactivated successfully.
Mar 6 02:54:06.150585 systemd[1]: session-8.scope: Deactivated successfully.
Mar 6 02:54:06.151424 systemd-logind[1868]: Session 8 logged out. Waiting for processes to exit.
Mar 6 02:54:06.153313 systemd-logind[1868]: Removed session 8.
Mar 6 02:54:06.222130 systemd[1]: Started sshd@6-10.200.20.34:22-10.200.16.10:42050.service - OpenSSH per-connection server daemon (10.200.16.10:42050).
Mar 6 02:54:06.600849 sshd[2392]: Accepted publickey for core from 10.200.16.10 port 42050 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:54:06.601581 sshd-session[2392]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:54:06.604943 systemd-logind[1868]: New session 9 of user core.
Mar 6 02:54:06.614018 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 6 02:54:06.739237 sudo[2396]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 6 02:54:06.739452 sudo[2396]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 02:54:08.459772 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 6 02:54:08.461088 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:54:08.845186 update_engine[1869]: I20260306 02:54:08.845043 1869 update_attempter.cc:509] Updating boot flags...
Mar 6 02:54:11.324119 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 6 02:54:11.340192 (dockerd)[2584]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 6 02:54:13.139052 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:54:13.144157 (kubelet)[2594]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:54:13.170942 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:54:13.236132 kubelet[2594]: E0306 02:54:13.169264 2594 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:54:13.171034 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:54:13.171374 systemd[1]: kubelet.service: Consumed 108ms CPU time, 105M memory peak.
Mar 6 02:54:14.060515 dockerd[2584]: time="2026-03-06T02:54:14.059592390Z" level=info msg="Starting up"
Mar 6 02:54:14.061085 dockerd[2584]: time="2026-03-06T02:54:14.061058294Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 6 02:54:14.069196 dockerd[2584]: time="2026-03-06T02:54:14.069159716Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Mar 6 02:54:15.833797 dockerd[2584]: time="2026-03-06T02:54:15.833750952Z" level=info msg="Loading containers: start."
Mar 6 02:54:15.910931 kernel: Initializing XFRM netlink socket
Mar 6 02:54:16.297062 systemd-networkd[1485]: docker0: Link UP
Mar 6 02:54:16.334858 dockerd[2584]: time="2026-03-06T02:54:16.334816376Z" level=info msg="Loading containers: done."
Mar 6 02:54:16.773736 dockerd[2584]: time="2026-03-06T02:54:16.773387454Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 6 02:54:16.773736 dockerd[2584]: time="2026-03-06T02:54:16.773486937Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Mar 6 02:54:16.773736 dockerd[2584]: time="2026-03-06T02:54:16.773582340Z" level=info msg="Initializing buildkit"
Mar 6 02:54:16.984931 dockerd[2584]: time="2026-03-06T02:54:16.984880774Z" level=info msg="Completed buildkit initialization"
Mar 6 02:54:16.990122 dockerd[2584]: time="2026-03-06T02:54:16.990082438Z" level=info msg="Daemon has completed initialization"
Mar 6 02:54:16.990351 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 6 02:54:16.991022 dockerd[2584]: time="2026-03-06T02:54:16.990847807Z" level=info msg="API listen on /run/docker.sock"
Mar 6 02:54:17.315292 containerd[1920]: time="2026-03-06T02:54:17.315074154Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\""
Mar 6 02:54:18.372761 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3445026740.mount: Deactivated successfully.
Mar 6 02:54:21.237723 containerd[1920]: time="2026-03-06T02:54:21.237667177Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:21.241380 containerd[1920]: time="2026-03-06T02:54:21.241330159Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=24583252"
Mar 6 02:54:21.284708 containerd[1920]: time="2026-03-06T02:54:21.284663594Z" level=info msg="ImageCreate event name:\"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:21.290935 containerd[1920]: time="2026-03-06T02:54:21.290247493Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:21.290935 containerd[1920]: time="2026-03-06T02:54:21.290833327Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"24579851\" in 3.975721556s"
Mar 6 02:54:21.290935 containerd[1920]: time="2026-03-06T02:54:21.290856264Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\""
Mar 6 02:54:21.291585 containerd[1920]: time="2026-03-06T02:54:21.291544806Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\""
Mar 6 02:54:23.208045 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Mar 6 02:54:23.209908 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:54:23.305591 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:54:23.314288 (kubelet)[2869]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:54:23.340543 kubelet[2869]: E0306 02:54:23.340501 2869 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:54:23.342414 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:54:23.342527 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:54:23.343967 systemd[1]: kubelet.service: Consumed 100ms CPU time, 107M memory peak.
Mar 6 02:54:28.286863 containerd[1920]: time="2026-03-06T02:54:28.286283538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:28.289957 containerd[1920]: time="2026-03-06T02:54:28.289931239Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=19139641"
Mar 6 02:54:28.333361 containerd[1920]: time="2026-03-06T02:54:28.333334102Z" level=info msg="ImageCreate event name:\"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:28.379335 containerd[1920]: time="2026-03-06T02:54:28.379278207Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:28.380205 containerd[1920]: time="2026-03-06T02:54:28.380083337Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"20724045\" in 7.088467353s"
Mar 6 02:54:28.380205 containerd[1920]: time="2026-03-06T02:54:28.380109618Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\""
Mar 6 02:54:28.381507 containerd[1920]: time="2026-03-06T02:54:28.381446125Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\""
Mar 6 02:54:31.425422 containerd[1920]: time="2026-03-06T02:54:31.425359987Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:31.472320 containerd[1920]: time="2026-03-06T02:54:31.472138383Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=14195544"
Mar 6 02:54:31.475913 containerd[1920]: time="2026-03-06T02:54:31.475701057Z" level=info msg="ImageCreate event name:\"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:31.534700 containerd[1920]: time="2026-03-06T02:54:31.534656419Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:31.535455 containerd[1920]: time="2026-03-06T02:54:31.535240238Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"15779966\" in 3.153764569s"
Mar 6 02:54:31.535455 containerd[1920]: time="2026-03-06T02:54:31.535269871Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\""
Mar 6 02:54:31.535863 containerd[1920]: time="2026-03-06T02:54:31.535698813Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\""
Mar 6 02:54:33.458169 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Mar 6 02:54:33.459599 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:54:33.557887 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:54:33.564155 (kubelet)[2892]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:54:33.589316 kubelet[2892]: E0306 02:54:33.589270 2892 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:54:33.591370 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:54:33.591491 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:54:33.591979 systemd[1]: kubelet.service: Consumed 103ms CPU time, 105.2M memory peak.
Mar 6 02:54:37.912853 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2204686235.mount: Deactivated successfully.
Mar 6 02:54:38.256171 containerd[1920]: time="2026-03-06T02:54:38.256119199Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:38.259772 containerd[1920]: time="2026-03-06T02:54:38.259682737Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=22697088"
Mar 6 02:54:38.264358 containerd[1920]: time="2026-03-06T02:54:38.264324094Z" level=info msg="ImageCreate event name:\"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:38.268195 containerd[1920]: time="2026-03-06T02:54:38.268151056Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:38.268773 containerd[1920]: time="2026-03-06T02:54:38.268376920Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"22696107\" in 6.732569767s"
Mar 6 02:54:38.268773 containerd[1920]: time="2026-03-06T02:54:38.268405040Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\""
Mar 6 02:54:38.269009 containerd[1920]: time="2026-03-06T02:54:38.268981299Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Mar 6 02:54:40.791624 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3206342507.mount: Deactivated successfully.
Mar 6 02:54:43.708042 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Mar 6 02:54:43.711077 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:54:45.059884 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:54:45.062918 (kubelet)[2922]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:54:45.087269 kubelet[2922]: E0306 02:54:45.087208 2922 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:54:45.089317 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:54:45.089532 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:54:45.090052 systemd[1]: kubelet.service: Consumed 103ms CPU time, 104.5M memory peak.
Mar 6 02:54:54.729667 containerd[1920]: time="2026-03-06T02:54:54.729002945Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:54.732308 containerd[1920]: time="2026-03-06T02:54:54.732282202Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395406"
Mar 6 02:54:54.776685 containerd[1920]: time="2026-03-06T02:54:54.776652004Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:54.781150 containerd[1920]: time="2026-03-06T02:54:54.781114059Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:54.781850 containerd[1920]: time="2026-03-06T02:54:54.781824249Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 16.512815741s"
Mar 6 02:54:54.782020 containerd[1920]: time="2026-03-06T02:54:54.781992559Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\""
Mar 6 02:54:54.782740 containerd[1920]: time="2026-03-06T02:54:54.782719886Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 6 02:54:55.208174 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
Mar 6 02:54:55.210312 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:54:55.310603 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:54:55.313153 (kubelet)[2981]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:54:55.335607 kubelet[2981]: E0306 02:54:55.335546 2981 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:54:55.337430 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:54:55.337536 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:54:55.337798 systemd[1]: kubelet.service: Consumed 100ms CPU time, 107M memory peak.
Mar 6 02:54:58.219351 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1187909703.mount: Deactivated successfully.
Mar 6 02:54:58.239791 containerd[1920]: time="2026-03-06T02:54:58.239323535Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:58.242136 containerd[1920]: time="2026-03-06T02:54:58.242111872Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709"
Mar 6 02:54:58.245376 containerd[1920]: time="2026-03-06T02:54:58.245356456Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:58.249530 containerd[1920]: time="2026-03-06T02:54:58.249508541Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:54:58.250019 containerd[1920]: time="2026-03-06T02:54:58.249782750Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 3.467037407s"
Mar 6 02:54:58.250019 containerd[1920]: time="2026-03-06T02:54:58.249810727Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Mar 6 02:54:58.250225 containerd[1920]: time="2026-03-06T02:54:58.250203683Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
Mar 6 02:55:00.397677 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount728042546.mount: Deactivated successfully.
Mar 6 02:55:05.457992 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9.
Mar 6 02:55:05.459563 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:55:05.564848 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:55:05.567701 (kubelet)[3021]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:55:05.593450 kubelet[3021]: E0306 02:55:05.593397 3021 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:55:05.595313 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:55:05.595419 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:55:05.595700 systemd[1]: kubelet.service: Consumed 104ms CPU time, 104.8M memory peak.
Mar 6 02:55:15.486482 containerd[1920]: time="2026-03-06T02:55:15.486339353Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:55:15.534083 containerd[1920]: time="2026-03-06T02:55:15.533868396Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21125515"
Mar 6 02:55:15.581737 containerd[1920]: time="2026-03-06T02:55:15.581703506Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:55:15.629331 containerd[1920]: time="2026-03-06T02:55:15.629269372Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:55:15.630138 containerd[1920]: time="2026-03-06T02:55:15.629976202Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 17.379729086s"
Mar 6 02:55:15.630138 containerd[1920]: time="2026-03-06T02:55:15.630004595Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\""
Mar 6 02:55:15.708644 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10.
Mar 6 02:55:15.712049 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:55:15.871040 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:55:15.876284 (kubelet)[3096]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:55:15.901811 kubelet[3096]: E0306 02:55:15.901755 3096 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:55:15.904592 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:55:15.904801 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:55:15.906970 systemd[1]: kubelet.service: Consumed 101ms CPU time, 106.9M memory peak.
Mar 6 02:55:18.188402 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:55:18.188849 systemd[1]: kubelet.service: Consumed 101ms CPU time, 106.9M memory peak.
Mar 6 02:55:18.190754 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:55:18.214626 systemd[1]: Reload requested from client PID 3110 ('systemctl') (unit session-9.scope)...
Mar 6 02:55:18.214752 systemd[1]: Reloading...
Mar 6 02:55:18.313991 zram_generator::config[3160]: No configuration found.
Mar 6 02:55:18.459194 systemd[1]: Reloading finished in 244 ms.
Mar 6 02:55:18.496244 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 6 02:55:18.496302 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 6 02:55:18.497942 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:55:18.497981 systemd[1]: kubelet.service: Consumed 72ms CPU time, 94.9M memory peak.
Mar 6 02:55:18.499240 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:55:24.061840 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:55:24.065044 (kubelet)[3224]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 6 02:55:24.088937 kubelet[3224]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 6 02:55:24.088937 kubelet[3224]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 6 02:55:24.089604 kubelet[3224]: I0306 02:55:24.089561 3224 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 6 02:55:24.352614 kubelet[3224]: I0306 02:55:24.352507 3224 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 6 02:55:24.352614 kubelet[3224]: I0306 02:55:24.352537 3224 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 6 02:55:24.352614 kubelet[3224]: I0306 02:55:24.352557 3224 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 6 02:55:24.352614 kubelet[3224]: I0306 02:55:24.352562 3224 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 6 02:55:24.352760 kubelet[3224]: I0306 02:55:24.352724 3224 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 6 02:55:28.931525 kubelet[3224]: E0306 02:55:28.931490 3224 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.34:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 6 02:55:28.932491 kubelet[3224]: I0306 02:55:28.931951 3224 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 6 02:55:28.936968 kubelet[3224]: I0306 02:55:28.936466 3224 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 6 02:55:28.939055 kubelet[3224]: I0306 02:55:28.939037 3224 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 6 02:55:28.939311 kubelet[3224]: I0306 02:55:28.939291 3224 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 6 02:55:28.939483 kubelet[3224]: I0306 02:55:28.939366 3224 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.3-n-bf8f1184ca","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 6 02:55:28.939597 kubelet[3224]: I0306 02:55:28.939585 3224 topology_manager.go:138] "Creating topology manager with none policy"
Mar 6 02:55:28.939640 kubelet[3224]: I0306 02:55:28.939634 3224 container_manager_linux.go:306] "Creating device plugin manager"
Mar 6 02:55:28.939760 kubelet[3224]: I0306 02:55:28.939752 3224 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 6 02:55:28.945006 kubelet[3224]: I0306 02:55:28.944989 3224 state_mem.go:36] "Initialized new in-memory state store"
Mar 6 02:55:28.946114 kubelet[3224]: I0306 02:55:28.946099 3224 kubelet.go:475] "Attempting to sync node with API server"
Mar 6 02:55:28.946203 kubelet[3224]: I0306 02:55:28.946192 3224 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 6 02:55:28.946571 kubelet[3224]: E0306 02:55:28.946537 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.3-n-bf8f1184ca&limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 6 02:55:28.946874 kubelet[3224]: I0306 02:55:28.946861 3224 kubelet.go:387] "Adding apiserver pod source"
Mar 6 02:55:28.946956 kubelet[3224]: I0306 02:55:28.946948 3224 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 6 02:55:28.947622 kubelet[3224]: E0306 02:55:28.947590 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.34:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 6 02:55:28.947820 kubelet[3224]: I0306 02:55:28.947803 3224 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 6 02:55:28.948171 kubelet[3224]: I0306 02:55:28.948155 3224 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 6 02:55:28.948208 kubelet[3224]: I0306 02:55:28.948178 3224 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 6 02:55:28.948226 kubelet[3224]: W0306 02:55:28.948208 3224 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 6 02:55:28.951156 kubelet[3224]: I0306 02:55:28.951076 3224 server.go:1262] "Started kubelet"
Mar 6 02:55:28.953948 kubelet[3224]: I0306 02:55:28.953438 3224 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 6 02:55:28.953948 kubelet[3224]: I0306 02:55:28.953753 3224 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 6 02:55:28.953948 kubelet[3224]: I0306 02:55:28.953795 3224 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 6 02:55:28.954037 kubelet[3224]: I0306 02:55:28.953983 3224 server.go:310] "Adding debug handlers to kubelet server"
Mar 6 02:55:28.954203 kubelet[3224]: I0306 02:55:28.954188 3224 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 6 02:55:28.955258 kubelet[3224]: E0306 02:55:28.954337 3224 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.34:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.34:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.3-n-bf8f1184ca.189a2109b81d333c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.3-n-bf8f1184ca,UID:ci-4459.2.3-n-bf8f1184ca,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.3-n-bf8f1184ca,},FirstTimestamp:2026-03-06 02:55:28.951055164 +0000 UTC m=+4.883505700,LastTimestamp:2026-03-06 02:55:28.951055164 +0000 UTC m=+4.883505700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.3-n-bf8f1184ca,}"
Mar 6 02:55:28.956380 kubelet[3224]: I0306 02:55:28.956363 3224 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 6 02:55:28.956859 kubelet[3224]: I0306 02:55:28.956832 3224 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 6 02:55:28.958607 kubelet[3224]: E0306 02:55:28.958581 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:28.958607 kubelet[3224]: I0306 02:55:28.958606 3224 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 6 02:55:28.958711 kubelet[3224]: I0306 02:55:28.958698 3224 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 6 02:55:28.958749 kubelet[3224]: I0306 02:55:28.958741 3224 reconciler.go:29] "Reconciler: start to sync state"
Mar 6 02:55:28.959000 kubelet[3224]: E0306 02:55:28.958981 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 6 02:55:28.959170 kubelet[3224]: E0306 02:55:28.959139 3224 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-bf8f1184ca?timeout=10s\": dial tcp 10.200.20.34:6443: connect: connection refused" interval="200ms"
Mar 6 02:55:28.959722 kubelet[3224]: I0306 02:55:28.959701 3224 factory.go:223] Registration of the systemd container factory successfully
Mar 6 02:55:28.959864 kubelet[3224]: I0306 02:55:28.959842 3224 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 6 02:55:28.961089 kubelet[3224]: E0306 02:55:28.961067 3224 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 6 02:55:28.961455 kubelet[3224]: I0306 02:55:28.961177 3224 factory.go:223] Registration of the containerd container factory successfully
Mar 6 02:55:28.981566 kubelet[3224]: I0306 02:55:28.981543 3224 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 6 02:55:28.982401 kubelet[3224]: I0306 02:55:28.982388 3224 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 6 02:55:28.982483 kubelet[3224]: I0306 02:55:28.982474 3224 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 6 02:55:28.982537 kubelet[3224]: I0306 02:55:28.982530 3224 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 6 02:55:28.982608 kubelet[3224]: E0306 02:55:28.982595 3224 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 6 02:55:28.987025 kubelet[3224]: E0306 02:55:28.987003 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 6 02:55:28.987700 kubelet[3224]: I0306 02:55:28.987688 3224 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 6 02:55:28.987934 kubelet[3224]: I0306 02:55:28.987789 3224 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 6 02:55:28.987934 kubelet[3224]: I0306 02:55:28.987808 3224 state_mem.go:36] "Initialized new in-memory state store"
Mar 6 02:55:28.994211 kubelet[3224]: I0306 02:55:28.994196 3224 policy_none.go:49] "None policy: Start"
Mar 6 02:55:28.994477 kubelet[3224]: I0306 02:55:28.994293 3224 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 6 02:55:28.994477 kubelet[3224]: I0306 02:55:28.994308 3224 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 6 02:55:29.038386 kubelet[3224]: I0306 02:55:29.038363 3224 policy_none.go:47] "Start"
Mar 6 02:55:29.042118 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 6 02:55:29.050522 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 6 02:55:29.053301 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 6 02:55:29.059191 kubelet[3224]: E0306 02:55:29.059173 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:29.062463 kubelet[3224]: E0306 02:55:29.062433 3224 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 6 02:55:29.063593 kubelet[3224]: I0306 02:55:29.063580 3224 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 6 02:55:29.063913 kubelet[3224]: I0306 02:55:29.063731 3224 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 6 02:55:29.064214 kubelet[3224]: I0306 02:55:29.063983 3224 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 6 02:55:29.064984 kubelet[3224]: E0306 02:55:29.064972 3224 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 6 02:55:29.065188 kubelet[3224]: E0306 02:55:29.065175 3224 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:29.094279 systemd[1]: Created slice kubepods-burstable-pod8a41ed82907f38dcff8d0b68a106500f.slice - libcontainer container kubepods-burstable-pod8a41ed82907f38dcff8d0b68a106500f.slice.
Mar 6 02:55:29.101595 kubelet[3224]: E0306 02:55:29.101473 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.111425 systemd[1]: Created slice kubepods-burstable-podc070085da33dc32023820ece9ad94bab.slice - libcontainer container kubepods-burstable-podc070085da33dc32023820ece9ad94bab.slice.
Mar 6 02:55:29.132224 kubelet[3224]: E0306 02:55:29.112981 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.145144 systemd[1]: Created slice kubepods-burstable-pod00d369a21fe5720fb3b898595335fd28.slice - libcontainer container kubepods-burstable-pod00d369a21fe5720fb3b898595335fd28.slice.
Mar 6 02:55:29.146714 kubelet[3224]: E0306 02:55:29.146657 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.160070 kubelet[3224]: E0306 02:55:29.160034 3224 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-bf8f1184ca?timeout=10s\": dial tcp 10.200.20.34:6443: connect: connection refused" interval="400ms"
Mar 6 02:55:29.165408 kubelet[3224]: I0306 02:55:29.165388 3224 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.165714 kubelet[3224]: E0306 02:55:29.165692 3224 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.34:6443/api/v1/nodes\": dial tcp 10.200.20.34:6443: connect: connection refused" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260604 kubelet[3224]: I0306 02:55:29.260497 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/00d369a21fe5720fb3b898595335fd28-kubeconfig\") pod \"kube-scheduler-ci-4459.2.3-n-bf8f1184ca\" (UID: \"00d369a21fe5720fb3b898595335fd28\") " pod="kube-system/kube-scheduler-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260604 kubelet[3224]: I0306 02:55:29.260535 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8a41ed82907f38dcff8d0b68a106500f-ca-certs\") pod \"kube-apiserver-ci-4459.2.3-n-bf8f1184ca\" (UID: \"8a41ed82907f38dcff8d0b68a106500f\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260604 kubelet[3224]: I0306 02:55:29.260553 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8a41ed82907f38dcff8d0b68a106500f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.3-n-bf8f1184ca\" (UID: \"8a41ed82907f38dcff8d0b68a106500f\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260604 kubelet[3224]: I0306 02:55:29.260565 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260604 kubelet[3224]: I0306 02:55:29.260575 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260798 kubelet[3224]: I0306 02:55:29.260630 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260798 kubelet[3224]: I0306 02:55:29.260659 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8a41ed82907f38dcff8d0b68a106500f-k8s-certs\") pod \"kube-apiserver-ci-4459.2.3-n-bf8f1184ca\" (UID: \"8a41ed82907f38dcff8d0b68a106500f\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260798 kubelet[3224]: I0306 02:55:29.260672 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-ca-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.260798 kubelet[3224]: I0306 02:55:29.260685 3224 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.321060 kubelet[3224]: E0306 02:55:29.320957 3224 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.34:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.34:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.3-n-bf8f1184ca.189a2109b81d333c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.3-n-bf8f1184ca,UID:ci-4459.2.3-n-bf8f1184ca,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.3-n-bf8f1184ca,},FirstTimestamp:2026-03-06 02:55:28.951055164 +0000 UTC m=+4.883505700,LastTimestamp:2026-03-06 02:55:28.951055164 +0000 UTC m=+4.883505700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.3-n-bf8f1184ca,}"
Mar 6 02:55:29.368310 kubelet[3224]: I0306 02:55:29.368057 3224 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.368513 kubelet[3224]: E0306 02:55:29.368493 3224 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.34:6443/api/v1/nodes\": dial tcp 10.200.20.34:6443: connect: connection refused" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.440281 containerd[1920]: time="2026-03-06T02:55:29.439804767Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.3-n-bf8f1184ca,Uid:8a41ed82907f38dcff8d0b68a106500f,Namespace:kube-system,Attempt:0,}"
Mar 6 02:55:29.481818 containerd[1920]: time="2026-03-06T02:55:29.481777282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.3-n-bf8f1184ca,Uid:c070085da33dc32023820ece9ad94bab,Namespace:kube-system,Attempt:0,}"
Mar 6 02:55:29.531382 containerd[1920]: time="2026-03-06T02:55:29.531171720Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.3-n-bf8f1184ca,Uid:00d369a21fe5720fb3b898595335fd28,Namespace:kube-system,Attempt:0,}"
Mar 6 02:55:29.560825 kubelet[3224]: E0306 02:55:29.560782 3224 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-bf8f1184ca?timeout=10s\": dial tcp 10.200.20.34:6443: connect: connection refused" interval="800ms"
Mar 6 02:55:29.770962 kubelet[3224]: I0306 02:55:29.770934 3224 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.771298 kubelet[3224]: E0306 02:55:29.771273 3224 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.34:6443/api/v1/nodes\": dial tcp 10.200.20.34:6443: connect: connection refused" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:29.829824 kubelet[3224]: E0306 02:55:29.829720 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.3-n-bf8f1184ca&limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 6 02:55:29.908718 kubelet[3224]: E0306 02:55:29.908669 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 6 02:55:30.250397 kubelet[3224]: E0306 02:55:30.250349 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.34:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 6 02:55:30.340015 kubelet[3224]: E0306 02:55:30.339961 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 6 02:55:30.361465 kubelet[3224]: E0306 02:55:30.361426 3224 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-bf8f1184ca?timeout=10s\": dial tcp 10.200.20.34:6443: connect: connection refused" interval="1.6s"
Mar 6 02:55:30.573193 kubelet[3224]: I0306 02:55:30.573095 3224 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:30.573560 kubelet[3224]: E0306 02:55:30.573530 3224 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.34:6443/api/v1/nodes\": dial tcp 10.200.20.34:6443: connect: connection refused" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:31.017782 kubelet[3224]: E0306 02:55:31.017738 3224 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.34:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 6 02:55:31.388979 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1047029151.mount: Deactivated successfully.
Mar 6 02:55:31.586922 containerd[1920]: time="2026-03-06T02:55:31.586788024Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 6 02:55:31.630346 containerd[1920]: time="2026-03-06T02:55:31.630298921Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Mar 6 02:55:31.725910 containerd[1920]: time="2026-03-06T02:55:31.725844526Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 6 02:55:31.792791 containerd[1920]: time="2026-03-06T02:55:31.792467584Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 6 02:55:31.836562 containerd[1920]: time="2026-03-06T02:55:31.836527636Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 6 02:55:31.884398 containerd[1920]: time="2026-03-06T02:55:31.884344752Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 6 02:55:31.884802 containerd[1920]: time="2026-03-06T02:55:31.884775461Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 2.403650919s"
Mar 6 02:55:31.929967 containerd[1920]: time="2026-03-06T02:55:31.929914349Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 6 02:55:31.933468 containerd[1920]: time="2026-03-06T02:55:31.933113635Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 6 02:55:31.962084 kubelet[3224]: E0306 02:55:31.962048 3224 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-bf8f1184ca?timeout=10s\": dial tcp 10.200.20.34:6443: connect: connection refused" interval="3.2s"
Mar 6 02:55:32.042151 containerd[1920]: time="2026-03-06T02:55:32.042022519Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 2.401746514s"
Mar 6 02:55:32.135669 containerd[1920]: time="2026-03-06T02:55:32.135427871Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 2.559206037s"
Mar 6 02:55:32.175155 kubelet[3224]: I0306 02:55:32.175109 3224 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:32.175696 kubelet[3224]: E0306 02:55:32.175669 3224 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.34:6443/api/v1/nodes\": dial tcp 10.200.20.34:6443: connect: connection refused" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:32.405597 kubelet[3224]: E0306 02:55:32.405460 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 6 02:55:32.679541 kubelet[3224]: E0306 02:55:32.679413 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 6 02:55:32.951382 kubelet[3224]: E0306 02:55:32.951204 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.34:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 6 02:55:32.980126 kubelet[3224]: E0306 02:55:32.980082 3224 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.3-n-bf8f1184ca&limit=500&resourceVersion=0\": dial tcp 10.200.20.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 6 02:55:33.091370 containerd[1920]: time="2026-03-06T02:55:33.091292892Z" level=info msg="connecting to shim e0e7eeb06cfc5cdf0b9e320f9e644a9df10fd27d173f7bf2bf0bce305054a733" address="unix:///run/containerd/s/2c4c3656f812d5c31512e530727a6eadea337fa9ea4ecd90964779b2e9c52bc3" namespace=k8s.io protocol=ttrpc version=3
Mar 6 02:55:33.109049 systemd[1]: Started cri-containerd-e0e7eeb06cfc5cdf0b9e320f9e644a9df10fd27d173f7bf2bf0bce305054a733.scope - libcontainer container e0e7eeb06cfc5cdf0b9e320f9e644a9df10fd27d173f7bf2bf0bce305054a733.
Mar 6 02:55:33.189121 containerd[1920]: time="2026-03-06T02:55:33.189051671Z" level=info msg="connecting to shim 15be857a1f5d885de924130224e361647170f34a718ca9ccf5df3363a1ef7fc7" address="unix:///run/containerd/s/359a50dd8a94705ef998417fca41401de1ba5f4ac42a753e5b6f2968960f6ddf" namespace=k8s.io protocol=ttrpc version=3
Mar 6 02:55:33.209038 systemd[1]: Started cri-containerd-15be857a1f5d885de924130224e361647170f34a718ca9ccf5df3363a1ef7fc7.scope - libcontainer container 15be857a1f5d885de924130224e361647170f34a718ca9ccf5df3363a1ef7fc7.
Mar 6 02:55:33.291573 containerd[1920]: time="2026-03-06T02:55:33.291432981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.3-n-bf8f1184ca,Uid:8a41ed82907f38dcff8d0b68a106500f,Namespace:kube-system,Attempt:0,} returns sandbox id \"e0e7eeb06cfc5cdf0b9e320f9e644a9df10fd27d173f7bf2bf0bce305054a733\""
Mar 6 02:55:33.340946 containerd[1920]: time="2026-03-06T02:55:33.340859403Z" level=info msg="CreateContainer within sandbox \"e0e7eeb06cfc5cdf0b9e320f9e644a9df10fd27d173f7bf2bf0bce305054a733\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 6 02:55:33.397100 containerd[1920]: time="2026-03-06T02:55:33.397007416Z" level=info msg="connecting to shim e6f64ae9135ca28ef5bc454fbb3c8cd8509dd3fdb87eda771218606a8c93aec1" address="unix:///run/containerd/s/977e02249690102a08717e0b945fd4c7a51344749a0f37676116b33e985b0798" namespace=k8s.io protocol=ttrpc version=3
Mar 6 02:55:33.413179 systemd[1]: Started cri-containerd-e6f64ae9135ca28ef5bc454fbb3c8cd8509dd3fdb87eda771218606a8c93aec1.scope - libcontainer container e6f64ae9135ca28ef5bc454fbb3c8cd8509dd3fdb87eda771218606a8c93aec1.
Mar 6 02:55:33.433403 containerd[1920]: time="2026-03-06T02:55:33.433365606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.3-n-bf8f1184ca,Uid:00d369a21fe5720fb3b898595335fd28,Namespace:kube-system,Attempt:0,} returns sandbox id \"15be857a1f5d885de924130224e361647170f34a718ca9ccf5df3363a1ef7fc7\""
Mar 6 02:55:33.529170 containerd[1920]: time="2026-03-06T02:55:33.529004893Z" level=info msg="CreateContainer within sandbox \"15be857a1f5d885de924130224e361647170f34a718ca9ccf5df3363a1ef7fc7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 6 02:55:33.592045 containerd[1920]: time="2026-03-06T02:55:33.592000901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.3-n-bf8f1184ca,Uid:c070085da33dc32023820ece9ad94bab,Namespace:kube-system,Attempt:0,} returns sandbox id \"e6f64ae9135ca28ef5bc454fbb3c8cd8509dd3fdb87eda771218606a8c93aec1\""
Mar 6 02:55:33.685315 containerd[1920]: time="2026-03-06T02:55:33.685274464Z" level=info msg="CreateContainer within sandbox \"e6f64ae9135ca28ef5bc454fbb3c8cd8509dd3fdb87eda771218606a8c93aec1\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 6 02:55:33.827885 containerd[1920]: time="2026-03-06T02:55:33.827764323Z" level=info msg="Container bb8fa2504c9b0c5e79c2ad6f4d6f1ed08d319bf4bb126f133673c7462d7c302d: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:55:34.096122 containerd[1920]: time="2026-03-06T02:55:34.095949228Z" level=info msg="Container 8b96703474330a1b28d28bbd2962b8ab0a873732552b14f31e17d4570b641de8: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:55:34.188681 containerd[1920]: time="2026-03-06T02:55:34.188635229Z" level=info msg="Container 07b2baf4853dd73b96eb01deeb708a0d2e0d4d0d474a7362d8a51f158d2cdc22: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:55:34.292431 containerd[1920]: time="2026-03-06T02:55:34.292332764Z" level=info msg="CreateContainer within sandbox \"e0e7eeb06cfc5cdf0b9e320f9e644a9df10fd27d173f7bf2bf0bce305054a733\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"bb8fa2504c9b0c5e79c2ad6f4d6f1ed08d319bf4bb126f133673c7462d7c302d\""
Mar 6 02:55:34.293350 containerd[1920]: time="2026-03-06T02:55:34.293283026Z" level=info msg="StartContainer for \"bb8fa2504c9b0c5e79c2ad6f4d6f1ed08d319bf4bb126f133673c7462d7c302d\""
Mar 6 02:55:34.294376 containerd[1920]: time="2026-03-06T02:55:34.294319532Z" level=info msg="connecting to shim bb8fa2504c9b0c5e79c2ad6f4d6f1ed08d319bf4bb126f133673c7462d7c302d" address="unix:///run/containerd/s/2c4c3656f812d5c31512e530727a6eadea337fa9ea4ecd90964779b2e9c52bc3" protocol=ttrpc version=3
Mar 6 02:55:34.309020 systemd[1]: Started cri-containerd-bb8fa2504c9b0c5e79c2ad6f4d6f1ed08d319bf4bb126f133673c7462d7c302d.scope - libcontainer container bb8fa2504c9b0c5e79c2ad6f4d6f1ed08d319bf4bb126f133673c7462d7c302d.
Mar 6 02:55:34.384235 containerd[1920]: time="2026-03-06T02:55:34.384098912Z" level=info msg="StartContainer for \"bb8fa2504c9b0c5e79c2ad6f4d6f1ed08d319bf4bb126f133673c7462d7c302d\" returns successfully"
Mar 6 02:55:34.436675 containerd[1920]: time="2026-03-06T02:55:34.436603641Z" level=info msg="CreateContainer within sandbox \"e6f64ae9135ca28ef5bc454fbb3c8cd8509dd3fdb87eda771218606a8c93aec1\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"07b2baf4853dd73b96eb01deeb708a0d2e0d4d0d474a7362d8a51f158d2cdc22\""
Mar 6 02:55:34.437136 containerd[1920]: time="2026-03-06T02:55:34.437111745Z" level=info msg="StartContainer for \"07b2baf4853dd73b96eb01deeb708a0d2e0d4d0d474a7362d8a51f158d2cdc22\""
Mar 6 02:55:34.439715 containerd[1920]: time="2026-03-06T02:55:34.439681715Z" level=info msg="connecting to shim 07b2baf4853dd73b96eb01deeb708a0d2e0d4d0d474a7362d8a51f158d2cdc22" address="unix:///run/containerd/s/977e02249690102a08717e0b945fd4c7a51344749a0f37676116b33e985b0798" protocol=ttrpc version=3
Mar 6 02:55:34.457030 systemd[1]: Started cri-containerd-07b2baf4853dd73b96eb01deeb708a0d2e0d4d0d474a7362d8a51f158d2cdc22.scope - libcontainer container 07b2baf4853dd73b96eb01deeb708a0d2e0d4d0d474a7362d8a51f158d2cdc22.
Mar 6 02:55:34.484380 containerd[1920]: time="2026-03-06T02:55:34.484341089Z" level=info msg="CreateContainer within sandbox \"15be857a1f5d885de924130224e361647170f34a718ca9ccf5df3363a1ef7fc7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8b96703474330a1b28d28bbd2962b8ab0a873732552b14f31e17d4570b641de8\""
Mar 6 02:55:34.485006 containerd[1920]: time="2026-03-06T02:55:34.484886275Z" level=info msg="StartContainer for \"8b96703474330a1b28d28bbd2962b8ab0a873732552b14f31e17d4570b641de8\""
Mar 6 02:55:34.486905 containerd[1920]: time="2026-03-06T02:55:34.485704045Z" level=info msg="connecting to shim 8b96703474330a1b28d28bbd2962b8ab0a873732552b14f31e17d4570b641de8" address="unix:///run/containerd/s/359a50dd8a94705ef998417fca41401de1ba5f4ac42a753e5b6f2968960f6ddf" protocol=ttrpc version=3
Mar 6 02:55:34.506124 containerd[1920]: time="2026-03-06T02:55:34.506090198Z" level=info msg="StartContainer for \"07b2baf4853dd73b96eb01deeb708a0d2e0d4d0d474a7362d8a51f158d2cdc22\" returns successfully"
Mar 6 02:55:34.508044 systemd[1]: Started cri-containerd-8b96703474330a1b28d28bbd2962b8ab0a873732552b14f31e17d4570b641de8.scope - libcontainer container 8b96703474330a1b28d28bbd2962b8ab0a873732552b14f31e17d4570b641de8.
Mar 6 02:55:34.545884 containerd[1920]: time="2026-03-06T02:55:34.545845977Z" level=info msg="StartContainer for \"8b96703474330a1b28d28bbd2962b8ab0a873732552b14f31e17d4570b641de8\" returns successfully"
Mar 6 02:55:35.005220 kubelet[3224]: E0306 02:55:35.005007 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:35.006455 kubelet[3224]: E0306 02:55:35.006319 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:35.010499 kubelet[3224]: E0306 02:55:35.010474 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:35.378738 kubelet[3224]: I0306 02:55:35.378655 3224 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:35.852971 kubelet[3224]: E0306 02:55:35.852927 3224 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:35.908783 kubelet[3224]: I0306 02:55:35.908688 3224 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:35.908783 kubelet[3224]: E0306 02:55:35.908724 3224 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459.2.3-n-bf8f1184ca\": node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:35.926992 kubelet[3224]: E0306 02:55:35.926954 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.013928 kubelet[3224]: E0306 02:55:36.013756 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:36.013928 kubelet[3224]: E0306 02:55:36.013819 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:36.014805 kubelet[3224]: E0306 02:55:36.014787 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:36.027071 kubelet[3224]: E0306 02:55:36.027027 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.127434 kubelet[3224]: E0306 02:55:36.127311 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.227859 kubelet[3224]: E0306 02:55:36.227810 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.328472 kubelet[3224]: E0306 02:55:36.328424 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.429015 kubelet[3224]: E0306 02:55:36.428884 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.529792 kubelet[3224]: E0306 02:55:36.529749 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.630670 kubelet[3224]: E0306 02:55:36.630623 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.731102 kubelet[3224]: E0306 02:55:36.731060 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.831602 kubelet[3224]: E0306 02:55:36.831561 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:36.932377 kubelet[3224]: E0306 02:55:36.932329 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:37.015727 kubelet[3224]: E0306 02:55:37.015597 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:37.016571 kubelet[3224]: E0306 02:55:37.016294 3224 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:37.032964 kubelet[3224]: E0306 02:55:37.032935 3224 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:37.159688 kubelet[3224]: I0306 02:55:37.159651 3224 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:37.170292 kubelet[3224]: I0306 02:55:37.170239 3224 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:55:37.170799 kubelet[3224]: I0306 02:55:37.170704 3224 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:37.183633 kubelet[3224]: I0306 02:55:37.183611 3224 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:55:37.183709 kubelet[3224]: I0306 02:55:37.183684 3224 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:37.193970 kubelet[3224]: I0306 02:55:37.193822 3224 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:55:37.953923 kubelet[3224]: I0306 02:55:37.953757 3224 apiserver.go:52] "Watching apiserver"
Mar 6 02:55:37.959080 kubelet[3224]: I0306 02:55:37.959048 3224 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 6 02:55:38.002333 systemd[1]: Reload requested from client PID 3513 ('systemctl') (unit session-9.scope)...
Mar 6 02:55:38.002350 systemd[1]: Reloading...
Mar 6 02:55:38.017936 kubelet[3224]: I0306 02:55:38.017678 3224 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.030444 kubelet[3224]: I0306 02:55:38.030357 3224 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:55:38.030813 kubelet[3224]: E0306 02:55:38.030709 3224 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.3-n-bf8f1184ca\" already exists" pod="kube-system/kube-scheduler-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.101923 zram_generator::config[3575]: No configuration found.
Mar 6 02:55:38.247186 systemd[1]: Reloading finished in 244 ms.
Mar 6 02:55:38.264281 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:55:38.274863 systemd[1]: kubelet.service: Deactivated successfully.
Mar 6 02:55:38.275090 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:55:38.275149 systemd[1]: kubelet.service: Consumed 565ms CPU time, 119.6M memory peak.
Mar 6 02:55:38.276520 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:55:38.478031 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:55:38.486179 (kubelet)[3624]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 6 02:55:38.521467 kubelet[3624]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 6 02:55:38.521467 kubelet[3624]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 6 02:55:38.521467 kubelet[3624]: I0306 02:55:38.521444 3624 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 6 02:55:38.529988 kubelet[3624]: I0306 02:55:38.529717 3624 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 6 02:55:38.529988 kubelet[3624]: I0306 02:55:38.529740 3624 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 6 02:55:38.529988 kubelet[3624]: I0306 02:55:38.529761 3624 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 6 02:55:38.529988 kubelet[3624]: I0306 02:55:38.529765 3624 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 6 02:55:38.530129 kubelet[3624]: I0306 02:55:38.530025 3624 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 6 02:55:38.531815 kubelet[3624]: I0306 02:55:38.531728 3624 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 6 02:55:38.533472 kubelet[3624]: I0306 02:55:38.533451 3624 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 6 02:55:38.536469 kubelet[3624]: I0306 02:55:38.536450 3624 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 6 02:55:38.542435 kubelet[3624]: I0306 02:55:38.542412 3624 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 6 02:55:38.542598 kubelet[3624]: I0306 02:55:38.542562 3624 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 6 02:55:38.542694 kubelet[3624]: I0306 02:55:38.542586 3624 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.3-n-bf8f1184ca","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 6 02:55:38.542694 kubelet[3624]: I0306 02:55:38.542692 3624 topology_manager.go:138] "Creating topology manager with none policy"
Mar 6 02:55:38.542773 kubelet[3624]: I0306 02:55:38.542698 3624 container_manager_linux.go:306] "Creating device plugin manager"
Mar 6 02:55:38.542773 kubelet[3624]: I0306 02:55:38.542717 3624 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 6 02:55:38.542903 kubelet[3624]: I0306 02:55:38.542885 3624 state_mem.go:36] "Initialized new in-memory state store"
Mar 6 02:55:38.543366 kubelet[3624]: I0306 02:55:38.543347 3624 kubelet.go:475] "Attempting to sync node with API server"
Mar 6 02:55:38.543366 kubelet[3624]: I0306 02:55:38.543366 3624 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 6 02:55:38.543964 kubelet[3624]: I0306 02:55:38.543946 3624 kubelet.go:387] "Adding apiserver pod source"
Mar 6 02:55:38.545918 kubelet[3624]: I0306 02:55:38.545591 3624 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 6 02:55:38.547712 kubelet[3624]: I0306 02:55:38.547689 3624 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 6 02:55:38.548523 kubelet[3624]: I0306 02:55:38.548495 3624 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 6 02:55:38.548523 kubelet[3624]: I0306 02:55:38.548523 3624 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 6 02:55:38.555202 kubelet[3624]: I0306 02:55:38.555181 3624 server.go:1262] "Started kubelet"
Mar 6 02:55:38.561582 kubelet[3624]: I0306 02:55:38.561560 3624 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 6 02:55:38.564773 kubelet[3624]: I0306 02:55:38.564034 3624 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 6 02:55:38.566807 kubelet[3624]: I0306 02:55:38.566788 3624 server.go:310] "Adding debug handlers to kubelet server"
Mar 6 02:55:38.574823 kubelet[3624]: I0306 02:55:38.574797 3624 factory.go:223] Registration of the systemd container factory successfully
Mar 6 02:55:38.574918 kubelet[3624]: I0306 02:55:38.574888 3624 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 6 02:55:38.575179 kubelet[3624]: I0306 02:55:38.575054 3624 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 6 02:55:38.575484 kubelet[3624]: I0306 02:55:38.575300 3624 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 6 02:55:38.576609 kubelet[3624]: I0306 02:55:38.575877 3624 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 6 02:55:38.580075 kubelet[3624]: I0306 02:55:38.579580 3624 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 6 02:55:38.581989 kubelet[3624]: I0306 02:55:38.581737 3624 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 6 02:55:38.582421 kubelet[3624]: E0306 02:55:38.582227 3624 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-bf8f1184ca\" not found"
Mar 6 02:55:38.585457 kubelet[3624]: E0306 02:55:38.585311 3624 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 6 02:55:38.585806 kubelet[3624]: I0306 02:55:38.581950 3624 factory.go:223] Registration of the containerd container factory successfully
Mar 6 02:55:38.586365 kubelet[3624]: I0306 02:55:38.586103 3624 reconciler.go:29] "Reconciler: start to sync state"
Mar 6 02:55:38.588386 kubelet[3624]: I0306 02:55:38.588325 3624 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 6 02:55:38.590444 kubelet[3624]: I0306 02:55:38.590328 3624 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 6 02:55:38.591780 kubelet[3624]: I0306 02:55:38.591671 3624 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 6 02:55:38.591947 kubelet[3624]: I0306 02:55:38.591934 3624 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 6 02:55:38.592209 kubelet[3624]: I0306 02:55:38.592189 3624 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 6 02:55:38.592724 kubelet[3624]: E0306 02:55:38.592301 3624 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 6 02:55:38.631936 kubelet[3624]: I0306 02:55:38.631879 3624 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 6 02:55:38.632176 kubelet[3624]: I0306 02:55:38.632107 3624 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 6 02:55:38.632176 kubelet[3624]: I0306 02:55:38.632132 3624 state_mem.go:36] "Initialized new in-memory state store"
Mar 6 02:55:38.632345 kubelet[3624]: I0306 02:55:38.632335 3624 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 6 02:55:38.632523 kubelet[3624]: I0306 02:55:38.632401 3624 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 6 02:55:38.632523 kubelet[3624]: I0306 02:55:38.632420 3624 policy_none.go:49] "None policy: Start"
Mar 6 02:55:38.632523 kubelet[3624]: I0306 02:55:38.632427 3624 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 6 02:55:38.632523 kubelet[3624]: I0306 02:55:38.632436 3624 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 6 02:55:38.632716 kubelet[3624]: I0306 02:55:38.632706 3624 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 6 02:55:38.632769 kubelet[3624]: I0306 02:55:38.632763 3624 policy_none.go:47] "Start"
Mar 6 02:55:38.637893 kubelet[3624]: E0306 02:55:38.637867 3624 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 6 02:55:38.638699 kubelet[3624]: I0306 02:55:38.638682 3624 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 6 02:55:38.638803 kubelet[3624]: I0306 02:55:38.638778 3624 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 6 02:55:38.639162 kubelet[3624]: I0306 02:55:38.639147 3624 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 6 02:55:38.642586 kubelet[3624]: E0306 02:55:38.642420 3624 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 6 02:55:38.693773 kubelet[3624]: I0306 02:55:38.693433 3624 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.693773 kubelet[3624]: I0306 02:55:38.693562 3624 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.694040 kubelet[3624]: I0306 02:55:38.694028 3624 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.708319 kubelet[3624]: I0306 02:55:38.708286 3624 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:55:38.708410 kubelet[3624]: E0306 02:55:38.708343 3624 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.3-n-bf8f1184ca\" already exists" pod="kube-system/kube-scheduler-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.708966 kubelet[3624]: I0306 02:55:38.708911 3624 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:55:38.709241 kubelet[3624]: E0306 02:55:38.709028 3624 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.3-n-bf8f1184ca\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.709241 kubelet[3624]: I0306 02:55:38.709060 3624 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:55:38.709241 kubelet[3624]: E0306 02:55:38.709087 3624 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" already exists" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.743170 kubelet[3624]: I0306 02:55:38.743137 3624 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.757576 kubelet[3624]: I0306 02:55:38.757355 3624 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.757576 kubelet[3624]: I0306 02:55:38.757444 3624 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.887818 kubelet[3624]: I0306 02:55:38.887160 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-ca-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.887818 kubelet[3624]: I0306 02:55:38.887727 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.887818 kubelet[3624]: I0306 02:55:38.887746 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.887818 kubelet[3624]: I0306 02:55:38.887757 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.887818 kubelet[3624]: I0306 02:55:38.887768 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c070085da33dc32023820ece9ad94bab-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.3-n-bf8f1184ca\" (UID: \"c070085da33dc32023820ece9ad94bab\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.888016 kubelet[3624]: I0306 02:55:38.887784 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/00d369a21fe5720fb3b898595335fd28-kubeconfig\") pod \"kube-scheduler-ci-4459.2.3-n-bf8f1184ca\" (UID: \"00d369a21fe5720fb3b898595335fd28\") " pod="kube-system/kube-scheduler-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.888016 kubelet[3624]: I0306 02:55:38.887795 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8a41ed82907f38dcff8d0b68a106500f-ca-certs\") pod \"kube-apiserver-ci-4459.2.3-n-bf8f1184ca\" (UID: \"8a41ed82907f38dcff8d0b68a106500f\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.888016 kubelet[3624]: I0306 02:55:38.887816 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8a41ed82907f38dcff8d0b68a106500f-k8s-certs\") pod \"kube-apiserver-ci-4459.2.3-n-bf8f1184ca\" (UID: \"8a41ed82907f38dcff8d0b68a106500f\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:38.888016 kubelet[3624]: I0306 02:55:38.887828 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8a41ed82907f38dcff8d0b68a106500f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.3-n-bf8f1184ca\" (UID: \"8a41ed82907f38dcff8d0b68a106500f\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:39.547196 kubelet[3624]: I0306 02:55:39.547083 3624 apiserver.go:52] "Watching apiserver"
Mar 6 02:55:39.588969 kubelet[3624]: I0306 02:55:39.588832 3624 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 6 02:55:39.621870 kubelet[3624]: I0306 02:55:39.621683 3624 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:39.632005 kubelet[3624]: I0306 02:55:39.631968 3624 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:55:39.632292 kubelet[3624]: E0306 02:55:39.632186 3624 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.3-n-bf8f1184ca\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca"
Mar 6 02:55:39.637581 kubelet[3624]: I0306 02:55:39.637440 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.2.3-n-bf8f1184ca" podStartSLOduration=2.637431607 podStartE2EDuration="2.637431607s" podCreationTimestamp="2026-03-06 02:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:55:39.637035402 +0000 UTC m=+1.147949125" watchObservedRunningTime="2026-03-06 02:55:39.637431607 +0000 UTC m=+1.148345338"
Mar 6 02:55:39.658442 kubelet[3624]: I0306 02:55:39.658405 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-bf8f1184ca" podStartSLOduration=2.658368411 podStartE2EDuration="2.658368411s" podCreationTimestamp="2026-03-06 02:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:55:39.657352834 +0000 UTC m=+1.168266573" watchObservedRunningTime="2026-03-06 02:55:39.658368411 +0000 UTC m=+1.169282142"
Mar 6 02:55:39.658840 kubelet[3624]: I0306 02:55:39.658788 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.2.3-n-bf8f1184ca" podStartSLOduration=2.658781544 podStartE2EDuration="2.658781544s" podCreationTimestamp="2026-03-06 02:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:55:39.647912173 +0000 UTC m=+1.158825896" watchObservedRunningTime="2026-03-06 02:55:39.658781544 +0000 UTC m=+1.169695307"
Mar 6 02:55:43.595300 kubelet[3624]: I0306 02:55:43.594960 3624 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 6 02:55:43.595886 containerd[1920]: time="2026-03-06T02:55:43.595255400Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 6 02:55:43.596054 kubelet[3624]: I0306 02:55:43.595436 3624 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 6 02:55:44.372776 systemd[1]: Created slice kubepods-besteffort-podb8ffa8c1_e923_430c_b131_3dd5340eefba.slice - libcontainer container kubepods-besteffort-podb8ffa8c1_e923_430c_b131_3dd5340eefba.slice.
Mar 6 02:55:44.415465 kubelet[3624]: I0306 02:55:44.415341 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9472\" (UniqueName: \"kubernetes.io/projected/b8ffa8c1-e923-430c-b131-3dd5340eefba-kube-api-access-b9472\") pod \"kube-proxy-gw8lf\" (UID: \"b8ffa8c1-e923-430c-b131-3dd5340eefba\") " pod="kube-system/kube-proxy-gw8lf"
Mar 6 02:55:44.415465 kubelet[3624]: I0306 02:55:44.415418 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b8ffa8c1-e923-430c-b131-3dd5340eefba-kube-proxy\") pod \"kube-proxy-gw8lf\" (UID: \"b8ffa8c1-e923-430c-b131-3dd5340eefba\") " pod="kube-system/kube-proxy-gw8lf"
Mar 6 02:55:44.415465 kubelet[3624]: I0306 02:55:44.415431 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b8ffa8c1-e923-430c-b131-3dd5340eefba-xtables-lock\") pod \"kube-proxy-gw8lf\" (UID: \"b8ffa8c1-e923-430c-b131-3dd5340eefba\") " pod="kube-system/kube-proxy-gw8lf"
Mar 6 02:55:44.415465 kubelet[3624]: I0306 02:55:44.415440 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8ffa8c1-e923-430c-b131-3dd5340eefba-lib-modules\") pod \"kube-proxy-gw8lf\" (UID: \"b8ffa8c1-e923-430c-b131-3dd5340eefba\") " pod="kube-system/kube-proxy-gw8lf"
Mar 6 02:55:44.686828 containerd[1920]: time="2026-03-06T02:55:44.686670396Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gw8lf,Uid:b8ffa8c1-e923-430c-b131-3dd5340eefba,Namespace:kube-system,Attempt:0,}"
Mar 6 02:55:44.885714 systemd[1]: Created slice kubepods-besteffort-pod2368daae_bb6b_4f55_862a_1993cc177c9f.slice - libcontainer container kubepods-besteffort-pod2368daae_bb6b_4f55_862a_1993cc177c9f.slice.
Mar 6 02:55:44.919240 kubelet[3624]: I0306 02:55:44.919157 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2368daae-bb6b-4f55-862a-1993cc177c9f-var-lib-calico\") pod \"tigera-operator-5588576f44-dbcwq\" (UID: \"2368daae-bb6b-4f55-862a-1993cc177c9f\") " pod="tigera-operator/tigera-operator-5588576f44-dbcwq"
Mar 6 02:55:44.919240 kubelet[3624]: I0306 02:55:44.919201 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6whw\" (UniqueName: \"kubernetes.io/projected/2368daae-bb6b-4f55-862a-1993cc177c9f-kube-api-access-l6whw\") pod \"tigera-operator-5588576f44-dbcwq\" (UID: \"2368daae-bb6b-4f55-862a-1993cc177c9f\") " pod="tigera-operator/tigera-operator-5588576f44-dbcwq"
Mar 6 02:55:44.995047 containerd[1920]: time="2026-03-06T02:55:44.995011724Z" level=info msg="connecting to shim d11b5aa0a724b49e509e56d2f42caffdf9ec0bcb19ab60d2d3acb2abb97cbdaa" address="unix:///run/containerd/s/67cd84de0d2daff26f7463e5a7b1b8b0bcdab031ff93b5d39ffdb6de364fe0fe" namespace=k8s.io protocol=ttrpc version=3
Mar 6 02:55:45.015051 systemd[1]: Started cri-containerd-d11b5aa0a724b49e509e56d2f42caffdf9ec0bcb19ab60d2d3acb2abb97cbdaa.scope - libcontainer container d11b5aa0a724b49e509e56d2f42caffdf9ec0bcb19ab60d2d3acb2abb97cbdaa.
Mar 6 02:55:45.043314 containerd[1920]: time="2026-03-06T02:55:45.043255639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gw8lf,Uid:b8ffa8c1-e923-430c-b131-3dd5340eefba,Namespace:kube-system,Attempt:0,} returns sandbox id \"d11b5aa0a724b49e509e56d2f42caffdf9ec0bcb19ab60d2d3acb2abb97cbdaa\"" Mar 6 02:55:45.053359 containerd[1920]: time="2026-03-06T02:55:45.053130122Z" level=info msg="CreateContainer within sandbox \"d11b5aa0a724b49e509e56d2f42caffdf9ec0bcb19ab60d2d3acb2abb97cbdaa\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 6 02:55:45.243080 containerd[1920]: time="2026-03-06T02:55:45.243037266Z" level=info msg="Container 69c08a38c1ee8790a6cbe62a3d321d9b2d39c96c8924c1b3c5a48f9333413212: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:55:45.244420 containerd[1920]: time="2026-03-06T02:55:45.244277913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-dbcwq,Uid:2368daae-bb6b-4f55-862a-1993cc177c9f,Namespace:tigera-operator,Attempt:0,}" Mar 6 02:55:45.432551 containerd[1920]: time="2026-03-06T02:55:45.432238971Z" level=info msg="CreateContainer within sandbox \"d11b5aa0a724b49e509e56d2f42caffdf9ec0bcb19ab60d2d3acb2abb97cbdaa\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"69c08a38c1ee8790a6cbe62a3d321d9b2d39c96c8924c1b3c5a48f9333413212\"" Mar 6 02:55:45.434232 containerd[1920]: time="2026-03-06T02:55:45.434190377Z" level=info msg="StartContainer for \"69c08a38c1ee8790a6cbe62a3d321d9b2d39c96c8924c1b3c5a48f9333413212\"" Mar 6 02:55:45.435513 containerd[1920]: time="2026-03-06T02:55:45.435477034Z" level=info msg="connecting to shim 69c08a38c1ee8790a6cbe62a3d321d9b2d39c96c8924c1b3c5a48f9333413212" address="unix:///run/containerd/s/67cd84de0d2daff26f7463e5a7b1b8b0bcdab031ff93b5d39ffdb6de364fe0fe" protocol=ttrpc version=3 Mar 6 02:55:45.451026 systemd[1]: Started cri-containerd-69c08a38c1ee8790a6cbe62a3d321d9b2d39c96c8924c1b3c5a48f9333413212.scope - 
libcontainer container 69c08a38c1ee8790a6cbe62a3d321d9b2d39c96c8924c1b3c5a48f9333413212. Mar 6 02:55:45.548094 containerd[1920]: time="2026-03-06T02:55:45.548009583Z" level=info msg="StartContainer for \"69c08a38c1ee8790a6cbe62a3d321d9b2d39c96c8924c1b3c5a48f9333413212\" returns successfully" Mar 6 02:55:45.739269 containerd[1920]: time="2026-03-06T02:55:45.739226376Z" level=info msg="connecting to shim 79d49802306ce3fa37ecce033edebf227e4f08d6cf53ffbcb222244de4ffb731" address="unix:///run/containerd/s/235a8f10aca2d02c086a24ab460b4370094c71d8638fbaf2811bd8e3dc29650f" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:55:45.757015 systemd[1]: Started cri-containerd-79d49802306ce3fa37ecce033edebf227e4f08d6cf53ffbcb222244de4ffb731.scope - libcontainer container 79d49802306ce3fa37ecce033edebf227e4f08d6cf53ffbcb222244de4ffb731. Mar 6 02:55:45.784361 containerd[1920]: time="2026-03-06T02:55:45.784325791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-dbcwq,Uid:2368daae-bb6b-4f55-862a-1993cc177c9f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"79d49802306ce3fa37ecce033edebf227e4f08d6cf53ffbcb222244de4ffb731\"" Mar 6 02:55:45.786059 containerd[1920]: time="2026-03-06T02:55:45.786035141Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 6 02:55:48.240226 kubelet[3624]: I0306 02:55:48.240126 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gw8lf" podStartSLOduration=4.240111149 podStartE2EDuration="4.240111149s" podCreationTimestamp="2026-03-06 02:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:55:45.644215467 +0000 UTC m=+7.155129190" watchObservedRunningTime="2026-03-06 02:55:48.240111149 +0000 UTC m=+9.751024872" Mar 6 02:55:49.163969 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3164407517.mount: Deactivated successfully. 
Mar 6 02:55:51.038482 containerd[1920]: time="2026-03-06T02:55:51.038426473Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:55:51.084884 containerd[1920]: time="2026-03-06T02:55:51.084808983Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565"
Mar 6 02:55:51.089013 containerd[1920]: time="2026-03-06T02:55:51.088968862Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:55:51.134060 containerd[1920]: time="2026-03-06T02:55:51.134015689Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:55:51.134601 containerd[1920]: time="2026-03-06T02:55:51.134572379Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 5.348509245s"
Mar 6 02:55:51.134837 containerd[1920]: time="2026-03-06T02:55:51.134604956Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\""
Mar 6 02:55:51.181746 containerd[1920]: time="2026-03-06T02:55:51.181706281Z" level=info msg="CreateContainer within sandbox \"79d49802306ce3fa37ecce033edebf227e4f08d6cf53ffbcb222244de4ffb731\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 6 02:55:51.389605 containerd[1920]: time="2026-03-06T02:55:51.389504697Z" level=info msg="Container 95485db79eddfa6b5b91fd712a918156d8c78a5a618613f9f57b18cc495cea3e: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:55:51.488112 containerd[1920]: time="2026-03-06T02:55:51.488062288Z" level=info msg="CreateContainer within sandbox \"79d49802306ce3fa37ecce033edebf227e4f08d6cf53ffbcb222244de4ffb731\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"95485db79eddfa6b5b91fd712a918156d8c78a5a618613f9f57b18cc495cea3e\""
Mar 6 02:55:51.488822 containerd[1920]: time="2026-03-06T02:55:51.488789568Z" level=info msg="StartContainer for \"95485db79eddfa6b5b91fd712a918156d8c78a5a618613f9f57b18cc495cea3e\""
Mar 6 02:55:51.489483 containerd[1920]: time="2026-03-06T02:55:51.489430149Z" level=info msg="connecting to shim 95485db79eddfa6b5b91fd712a918156d8c78a5a618613f9f57b18cc495cea3e" address="unix:///run/containerd/s/235a8f10aca2d02c086a24ab460b4370094c71d8638fbaf2811bd8e3dc29650f" protocol=ttrpc version=3
Mar 6 02:55:51.510026 systemd[1]: Started cri-containerd-95485db79eddfa6b5b91fd712a918156d8c78a5a618613f9f57b18cc495cea3e.scope - libcontainer container 95485db79eddfa6b5b91fd712a918156d8c78a5a618613f9f57b18cc495cea3e.
Mar 6 02:55:51.536637 containerd[1920]: time="2026-03-06T02:55:51.536573139Z" level=info msg="StartContainer for \"95485db79eddfa6b5b91fd712a918156d8c78a5a618613f9f57b18cc495cea3e\" returns successfully"
Mar 6 02:55:51.659230 kubelet[3624]: I0306 02:55:51.658648 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-dbcwq" podStartSLOduration=2.308716529 podStartE2EDuration="7.658584066s" podCreationTimestamp="2026-03-06 02:55:44 +0000 UTC" firstStartedPulling="2026-03-06 02:55:45.785597815 +0000 UTC m=+7.296511538" lastFinishedPulling="2026-03-06 02:55:51.135465352 +0000 UTC m=+12.646379075" observedRunningTime="2026-03-06 02:55:51.658248655 +0000 UTC m=+13.169162386" watchObservedRunningTime="2026-03-06 02:55:51.658584066 +0000 UTC m=+13.169497789"
Mar 6 02:55:56.600455 sudo[2396]: pam_unix(sudo:session): session closed for user root
Mar 6 02:55:56.668012 sshd[2395]: Connection closed by 10.200.16.10 port 42050
Mar 6 02:55:56.670073 sshd-session[2392]: pam_unix(sshd:session): session closed for user core
Mar 6 02:55:56.672713 systemd[1]: sshd@6-10.200.20.34:22-10.200.16.10:42050.service: Deactivated successfully.
Mar 6 02:55:56.676468 systemd[1]: session-9.scope: Deactivated successfully.
Mar 6 02:55:56.682707 systemd[1]: session-9.scope: Consumed 3.760s CPU time, 223.2M memory peak.
Mar 6 02:55:56.686076 systemd-logind[1868]: Session 9 logged out. Waiting for processes to exit.
Mar 6 02:55:56.687815 systemd-logind[1868]: Removed session 9.
Mar 6 02:56:00.396492 systemd[1]: Created slice kubepods-besteffort-podc7ad0795_75d8_43c7_a77d_af904dd61d16.slice - libcontainer container kubepods-besteffort-podc7ad0795_75d8_43c7_a77d_af904dd61d16.slice.
Mar 6 02:56:00.408565 kubelet[3624]: I0306 02:56:00.408538 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52j4d\" (UniqueName: \"kubernetes.io/projected/c7ad0795-75d8-43c7-a77d-af904dd61d16-kube-api-access-52j4d\") pod \"calico-typha-6b64c84f55-qd4f6\" (UID: \"c7ad0795-75d8-43c7-a77d-af904dd61d16\") " pod="calico-system/calico-typha-6b64c84f55-qd4f6"
Mar 6 02:56:00.408930 kubelet[3624]: I0306 02:56:00.408886 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ad0795-75d8-43c7-a77d-af904dd61d16-tigera-ca-bundle\") pod \"calico-typha-6b64c84f55-qd4f6\" (UID: \"c7ad0795-75d8-43c7-a77d-af904dd61d16\") " pod="calico-system/calico-typha-6b64c84f55-qd4f6"
Mar 6 02:56:00.409024 kubelet[3624]: I0306 02:56:00.409014 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c7ad0795-75d8-43c7-a77d-af904dd61d16-typha-certs\") pod \"calico-typha-6b64c84f55-qd4f6\" (UID: \"c7ad0795-75d8-43c7-a77d-af904dd61d16\") " pod="calico-system/calico-typha-6b64c84f55-qd4f6"
Mar 6 02:56:00.463297 systemd[1]: Created slice kubepods-besteffort-pod524b4537_fc25_4d75_bb3f_13881faa85d6.slice - libcontainer container kubepods-besteffort-pod524b4537_fc25_4d75_bb3f_13881faa85d6.slice.
Mar 6 02:56:00.509928 kubelet[3624]: I0306 02:56:00.509857 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-sys-fs\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.509928 kubelet[3624]: I0306 02:56:00.509893 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m2fv\" (UniqueName: \"kubernetes.io/projected/524b4537-fc25-4d75-bb3f-13881faa85d6-kube-api-access-9m2fv\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.509928 kubelet[3624]: I0306 02:56:00.509921 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-cni-log-dir\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.509928 kubelet[3624]: I0306 02:56:00.509933 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/524b4537-fc25-4d75-bb3f-13881faa85d6-tigera-ca-bundle\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.509928 kubelet[3624]: I0306 02:56:00.509942 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-nodeproc\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510703 kubelet[3624]: I0306 02:56:00.509952 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-bpffs\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510703 kubelet[3624]: I0306 02:56:00.509961 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-cni-net-dir\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510703 kubelet[3624]: I0306 02:56:00.509969 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-flexvol-driver-host\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510703 kubelet[3624]: I0306 02:56:00.509978 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-var-run-calico\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510703 kubelet[3624]: I0306 02:56:00.510002 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-lib-modules\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510783 kubelet[3624]: I0306 02:56:00.510022 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-cni-bin-dir\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510783 kubelet[3624]: I0306 02:56:00.510031 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/524b4537-fc25-4d75-bb3f-13881faa85d6-node-certs\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510783 kubelet[3624]: I0306 02:56:00.510038 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-policysync\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510783 kubelet[3624]: I0306 02:56:00.510047 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-var-lib-calico\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.510783 kubelet[3624]: I0306 02:56:00.510061 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/524b4537-fc25-4d75-bb3f-13881faa85d6-xtables-lock\") pod \"calico-node-xqrr7\" (UID: \"524b4537-fc25-4d75-bb3f-13881faa85d6\") " pod="calico-system/calico-node-xqrr7"
Mar 6 02:56:00.567459 kubelet[3624]: E0306 02:56:00.567415 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f"
Mar 6 02:56:00.611047 kubelet[3624]: I0306 02:56:00.610447 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5bde228-2966-4175-a103-4afd465c6d9f-kubelet-dir\") pod \"csi-node-driver-5qfnx\" (UID: \"a5bde228-2966-4175-a103-4afd465c6d9f\") " pod="calico-system/csi-node-driver-5qfnx"
Mar 6 02:56:00.611047 kubelet[3624]: I0306 02:56:00.610473 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a5bde228-2966-4175-a103-4afd465c6d9f-registration-dir\") pod \"csi-node-driver-5qfnx\" (UID: \"a5bde228-2966-4175-a103-4afd465c6d9f\") " pod="calico-system/csi-node-driver-5qfnx"
Mar 6 02:56:00.611047 kubelet[3624]: I0306 02:56:00.610523 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a5bde228-2966-4175-a103-4afd465c6d9f-socket-dir\") pod \"csi-node-driver-5qfnx\" (UID: \"a5bde228-2966-4175-a103-4afd465c6d9f\") " pod="calico-system/csi-node-driver-5qfnx"
Mar 6 02:56:00.611047 kubelet[3624]: I0306 02:56:00.610532 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a5bde228-2966-4175-a103-4afd465c6d9f-varrun\") pod \"csi-node-driver-5qfnx\" (UID: \"a5bde228-2966-4175-a103-4afd465c6d9f\") " pod="calico-system/csi-node-driver-5qfnx"
Mar 6 02:56:00.611047 kubelet[3624]: I0306 02:56:00.610580 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbfv\" (UniqueName: \"kubernetes.io/projected/a5bde228-2966-4175-a103-4afd465c6d9f-kube-api-access-mmbfv\") pod \"csi-node-driver-5qfnx\" (UID: \"a5bde228-2966-4175-a103-4afd465c6d9f\") " pod="calico-system/csi-node-driver-5qfnx"
Mar 6 02:56:00.618730 kubelet[3624]: E0306 02:56:00.618607 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.619000 kubelet[3624]: W0306 02:56:00.618970 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.619152 kubelet[3624]: E0306 02:56:00.619092 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.637662 kubelet[3624]: E0306 02:56:00.637648 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.637803 kubelet[3624]: W0306 02:56:00.637729 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.637803 kubelet[3624]: E0306 02:56:00.637747 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.708606 containerd[1920]: time="2026-03-06T02:56:00.708495355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b64c84f55-qd4f6,Uid:c7ad0795-75d8-43c7-a77d-af904dd61d16,Namespace:calico-system,Attempt:0,}"
Mar 6 02:56:00.711992 kubelet[3624]: E0306 02:56:00.711970 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.712173 kubelet[3624]: W0306 02:56:00.712079 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.712173 kubelet[3624]: E0306 02:56:00.712104 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.712408 kubelet[3624]: E0306 02:56:00.712390 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.712543 kubelet[3624]: W0306 02:56:00.712461 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.712543 kubelet[3624]: E0306 02:56:00.712477 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.712633 kubelet[3624]: E0306 02:56:00.712613 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.712633 kubelet[3624]: W0306 02:56:00.712627 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.712723 kubelet[3624]: E0306 02:56:00.712638 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.712756 kubelet[3624]: E0306 02:56:00.712743 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.712756 kubelet[3624]: W0306 02:56:00.712749 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.712823 kubelet[3624]: E0306 02:56:00.712755 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.712931 kubelet[3624]: E0306 02:56:00.712838 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.712931 kubelet[3624]: W0306 02:56:00.712843 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.712931 kubelet[3624]: E0306 02:56:00.712849 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.713154 kubelet[3624]: E0306 02:56:00.713141 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.713275 kubelet[3624]: W0306 02:56:00.713216 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.713275 kubelet[3624]: E0306 02:56:00.713239 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.713431 kubelet[3624]: E0306 02:56:00.713418 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.713431 kubelet[3624]: W0306 02:56:00.713427 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.713509 kubelet[3624]: E0306 02:56:00.713436 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.713540 kubelet[3624]: E0306 02:56:00.713529 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.713540 kubelet[3624]: W0306 02:56:00.713534 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.713540 kubelet[3624]: E0306 02:56:00.713539 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.713631 kubelet[3624]: E0306 02:56:00.713617 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.713631 kubelet[3624]: W0306 02:56:00.713621 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.713631 kubelet[3624]: E0306 02:56:00.713626 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.713889 kubelet[3624]: E0306 02:56:00.713879 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.713960 kubelet[3624]: W0306 02:56:00.713949 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.714022 kubelet[3624]: E0306 02:56:00.714012 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.714257 kubelet[3624]: E0306 02:56:00.714227 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.714257 kubelet[3624]: W0306 02:56:00.714237 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.714257 kubelet[3624]: E0306 02:56:00.714246 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.714534 kubelet[3624]: E0306 02:56:00.714522 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.714647 kubelet[3624]: W0306 02:56:00.714594 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.714647 kubelet[3624]: E0306 02:56:00.714607 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.714831 kubelet[3624]: E0306 02:56:00.714821 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.715019 kubelet[3624]: W0306 02:56:00.714892 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.715019 kubelet[3624]: E0306 02:56:00.714921 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.715189 kubelet[3624]: E0306 02:56:00.715132 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.715189 kubelet[3624]: W0306 02:56:00.715141 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.715189 kubelet[3624]: E0306 02:56:00.715149 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.715441 kubelet[3624]: E0306 02:56:00.715381 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.715441 kubelet[3624]: W0306 02:56:00.715392 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.715441 kubelet[3624]: E0306 02:56:00.715402 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.715645 kubelet[3624]: E0306 02:56:00.715635 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.715761 kubelet[3624]: W0306 02:56:00.715699 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.715761 kubelet[3624]: E0306 02:56:00.715712 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.715966 kubelet[3624]: E0306 02:56:00.715955 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.716121 kubelet[3624]: W0306 02:56:00.716026 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.716121 kubelet[3624]: E0306 02:56:00.716041 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.716282 kubelet[3624]: E0306 02:56:00.716226 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.716282 kubelet[3624]: W0306 02:56:00.716236 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.716282 kubelet[3624]: E0306 02:56:00.716243 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.716543 kubelet[3624]: E0306 02:56:00.716472 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.716543 kubelet[3624]: W0306 02:56:00.716482 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.716543 kubelet[3624]: E0306 02:56:00.716491 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:56:00.716737 kubelet[3624]: E0306 02:56:00.716727 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:56:00.716854 kubelet[3624]: W0306 02:56:00.716789 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:56:00.716854 kubelet[3624]: E0306 02:56:00.716802 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:00.717087 kubelet[3624]: E0306 02:56:00.717076 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:00.717242 kubelet[3624]: W0306 02:56:00.717143 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:00.717242 kubelet[3624]: E0306 02:56:00.717158 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:00.717396 kubelet[3624]: E0306 02:56:00.717388 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:00.717451 kubelet[3624]: W0306 02:56:00.717442 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:00.717498 kubelet[3624]: E0306 02:56:00.717489 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:00.717729 kubelet[3624]: E0306 02:56:00.717697 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:00.717729 kubelet[3624]: W0306 02:56:00.717710 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:00.717729 kubelet[3624]: E0306 02:56:00.717718 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:00.718185 kubelet[3624]: E0306 02:56:00.718005 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:00.718185 kubelet[3624]: W0306 02:56:00.718015 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:00.718185 kubelet[3624]: E0306 02:56:00.718023 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:00.718986 kubelet[3624]: E0306 02:56:00.718970 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:00.718986 kubelet[3624]: W0306 02:56:00.718981 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:00.719073 kubelet[3624]: E0306 02:56:00.718992 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:00.727877 kubelet[3624]: E0306 02:56:00.727862 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:00.728014 kubelet[3624]: W0306 02:56:00.727972 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:00.728014 kubelet[3624]: E0306 02:56:00.727991 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:00.829515 containerd[1920]: time="2026-03-06T02:56:00.829347625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xqrr7,Uid:524b4537-fc25-4d75-bb3f-13881faa85d6,Namespace:calico-system,Attempt:0,}" Mar 6 02:56:01.134781 containerd[1920]: time="2026-03-06T02:56:01.134638548Z" level=info msg="connecting to shim 3964cc1e4d1cb4bf74aebca0ef750fc73b74f92556d270763519e70a54eb59aa" address="unix:///run/containerd/s/2a183f395614b74cea262c8df89e99df4b2329f8a334f2971fb86b25bb420545" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:56:01.158064 systemd[1]: Started cri-containerd-3964cc1e4d1cb4bf74aebca0ef750fc73b74f92556d270763519e70a54eb59aa.scope - libcontainer container 3964cc1e4d1cb4bf74aebca0ef750fc73b74f92556d270763519e70a54eb59aa. Mar 6 02:56:01.284154 containerd[1920]: time="2026-03-06T02:56:01.284100388Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b64c84f55-qd4f6,Uid:c7ad0795-75d8-43c7-a77d-af904dd61d16,Namespace:calico-system,Attempt:0,} returns sandbox id \"3964cc1e4d1cb4bf74aebca0ef750fc73b74f92556d270763519e70a54eb59aa\"" Mar 6 02:56:01.285962 containerd[1920]: time="2026-03-06T02:56:01.285885809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 6 02:56:01.495928 containerd[1920]: time="2026-03-06T02:56:01.495813126Z" level=info msg="connecting to shim 0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c" address="unix:///run/containerd/s/01380828731830108b2d238646fd09c6fd5ebfcb4416d7c7503f85898838f8e1" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:56:01.512117 systemd[1]: Started cri-containerd-0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c.scope - libcontainer container 0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c. 
Mar 6 02:56:01.539372 containerd[1920]: time="2026-03-06T02:56:01.539333815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xqrr7,Uid:524b4537-fc25-4d75-bb3f-13881faa85d6,Namespace:calico-system,Attempt:0,} returns sandbox id \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\"" Mar 6 02:56:02.595596 kubelet[3624]: E0306 02:56:02.595534 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:03.202977 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2193141978.mount: Deactivated successfully. Mar 6 02:56:04.593708 kubelet[3624]: E0306 02:56:04.592968 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:05.376852 containerd[1920]: time="2026-03-06T02:56:05.376804954Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:05.379685 containerd[1920]: time="2026-03-06T02:56:05.379655293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Mar 6 02:56:05.441281 containerd[1920]: time="2026-03-06T02:56:05.441235115Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:05.487596 containerd[1920]: time="2026-03-06T02:56:05.487552456Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:05.488428 containerd[1920]: time="2026-03-06T02:56:05.488402155Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 4.202471616s" Mar 6 02:56:05.488473 containerd[1920]: time="2026-03-06T02:56:05.488432980Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Mar 6 02:56:05.490037 containerd[1920]: time="2026-03-06T02:56:05.489968293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 6 02:56:05.582569 containerd[1920]: time="2026-03-06T02:56:05.582517772Z" level=info msg="CreateContainer within sandbox \"3964cc1e4d1cb4bf74aebca0ef750fc73b74f92556d270763519e70a54eb59aa\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 6 02:56:05.788928 containerd[1920]: time="2026-03-06T02:56:05.787806159Z" level=info msg="Container 2f014b236337a0cfa2833de901c2e25849991aa41a5e3070c219a8f164b51bc4: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:56:05.932175 containerd[1920]: time="2026-03-06T02:56:05.932092400Z" level=info msg="CreateContainer within sandbox \"3964cc1e4d1cb4bf74aebca0ef750fc73b74f92556d270763519e70a54eb59aa\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2f014b236337a0cfa2833de901c2e25849991aa41a5e3070c219a8f164b51bc4\"" Mar 6 02:56:05.933879 containerd[1920]: time="2026-03-06T02:56:05.932914578Z" level=info msg="StartContainer for 
\"2f014b236337a0cfa2833de901c2e25849991aa41a5e3070c219a8f164b51bc4\"" Mar 6 02:56:05.933879 containerd[1920]: time="2026-03-06T02:56:05.933645498Z" level=info msg="connecting to shim 2f014b236337a0cfa2833de901c2e25849991aa41a5e3070c219a8f164b51bc4" address="unix:///run/containerd/s/2a183f395614b74cea262c8df89e99df4b2329f8a334f2971fb86b25bb420545" protocol=ttrpc version=3 Mar 6 02:56:06.010034 systemd[1]: Started cri-containerd-2f014b236337a0cfa2833de901c2e25849991aa41a5e3070c219a8f164b51bc4.scope - libcontainer container 2f014b236337a0cfa2833de901c2e25849991aa41a5e3070c219a8f164b51bc4. Mar 6 02:56:06.045795 containerd[1920]: time="2026-03-06T02:56:06.045647136Z" level=info msg="StartContainer for \"2f014b236337a0cfa2833de901c2e25849991aa41a5e3070c219a8f164b51bc4\" returns successfully" Mar 6 02:56:06.593847 kubelet[3624]: E0306 02:56:06.593247 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:06.694124 kubelet[3624]: I0306 02:56:06.693789 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b64c84f55-qd4f6" podStartSLOduration=2.489986843 podStartE2EDuration="6.693777006s" podCreationTimestamp="2026-03-06 02:56:00 +0000 UTC" firstStartedPulling="2026-03-06 02:56:01.285399329 +0000 UTC m=+22.796313052" lastFinishedPulling="2026-03-06 02:56:05.489189476 +0000 UTC m=+27.000103215" observedRunningTime="2026-03-06 02:56:06.693710444 +0000 UTC m=+28.204624199" watchObservedRunningTime="2026-03-06 02:56:06.693777006 +0000 UTC m=+28.204690729" Mar 6 02:56:06.729718 kubelet[3624]: E0306 02:56:06.729682 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.729718 
kubelet[3624]: W0306 02:56:06.729708 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.729718 kubelet[3624]: E0306 02:56:06.729729 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.729936 kubelet[3624]: E0306 02:56:06.729853 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.729936 kubelet[3624]: W0306 02:56:06.729860 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.729936 kubelet[3624]: E0306 02:56:06.729891 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.730038 kubelet[3624]: E0306 02:56:06.730027 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730038 kubelet[3624]: W0306 02:56:06.730036 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.730074 kubelet[3624]: E0306 02:56:06.730043 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.730159 kubelet[3624]: E0306 02:56:06.730148 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730159 kubelet[3624]: W0306 02:56:06.730156 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.730205 kubelet[3624]: E0306 02:56:06.730162 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.730289 kubelet[3624]: E0306 02:56:06.730279 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730289 kubelet[3624]: W0306 02:56:06.730286 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.730338 kubelet[3624]: E0306 02:56:06.730322 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.730430 kubelet[3624]: E0306 02:56:06.730421 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730430 kubelet[3624]: W0306 02:56:06.730428 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.730476 kubelet[3624]: E0306 02:56:06.730434 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.730541 kubelet[3624]: E0306 02:56:06.730529 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730541 kubelet[3624]: W0306 02:56:06.730537 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.730583 kubelet[3624]: E0306 02:56:06.730544 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.730656 kubelet[3624]: E0306 02:56:06.730644 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730656 kubelet[3624]: W0306 02:56:06.730651 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.730703 kubelet[3624]: E0306 02:56:06.730656 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.730772 kubelet[3624]: E0306 02:56:06.730764 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730772 kubelet[3624]: W0306 02:56:06.730770 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.730823 kubelet[3624]: E0306 02:56:06.730775 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.730878 kubelet[3624]: E0306 02:56:06.730867 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730878 kubelet[3624]: W0306 02:56:06.730874 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.730878 kubelet[3624]: E0306 02:56:06.730879 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.730977 kubelet[3624]: E0306 02:56:06.730969 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.730977 kubelet[3624]: W0306 02:56:06.730974 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.731022 kubelet[3624]: E0306 02:56:06.730986 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.731078 kubelet[3624]: E0306 02:56:06.731069 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.731078 kubelet[3624]: W0306 02:56:06.731074 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.731129 kubelet[3624]: E0306 02:56:06.731079 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.731198 kubelet[3624]: E0306 02:56:06.731187 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.731198 kubelet[3624]: W0306 02:56:06.731194 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.731244 kubelet[3624]: E0306 02:56:06.731200 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.731298 kubelet[3624]: E0306 02:56:06.731290 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.731298 kubelet[3624]: W0306 02:56:06.731295 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.731337 kubelet[3624]: E0306 02:56:06.731300 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.731458 kubelet[3624]: E0306 02:56:06.731444 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.731458 kubelet[3624]: W0306 02:56:06.731454 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.731502 kubelet[3624]: E0306 02:56:06.731461 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.752841 kubelet[3624]: E0306 02:56:06.752817 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.752841 kubelet[3624]: W0306 02:56:06.752835 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.753020 kubelet[3624]: E0306 02:56:06.752848 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.753020 kubelet[3624]: E0306 02:56:06.753016 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.753190 kubelet[3624]: W0306 02:56:06.753023 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.753190 kubelet[3624]: E0306 02:56:06.753031 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.753295 kubelet[3624]: E0306 02:56:06.753283 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.753348 kubelet[3624]: W0306 02:56:06.753338 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.753399 kubelet[3624]: E0306 02:56:06.753388 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.753602 kubelet[3624]: E0306 02:56:06.753590 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.753684 kubelet[3624]: W0306 02:56:06.753659 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.753684 kubelet[3624]: E0306 02:56:06.753674 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.753920 kubelet[3624]: E0306 02:56:06.753894 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.754063 kubelet[3624]: W0306 02:56:06.753912 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.754063 kubelet[3624]: E0306 02:56:06.753995 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.754261 kubelet[3624]: E0306 02:56:06.754248 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.754320 kubelet[3624]: W0306 02:56:06.754311 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.754410 kubelet[3624]: E0306 02:56:06.754358 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.754584 kubelet[3624]: E0306 02:56:06.754574 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.754730 kubelet[3624]: W0306 02:56:06.754641 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.754730 kubelet[3624]: E0306 02:56:06.754655 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.754944 kubelet[3624]: E0306 02:56:06.754930 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.755076 kubelet[3624]: W0306 02:56:06.755016 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.755076 kubelet[3624]: E0306 02:56:06.755031 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.755270 kubelet[3624]: E0306 02:56:06.755260 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.755440 kubelet[3624]: W0306 02:56:06.755327 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.755440 kubelet[3624]: E0306 02:56:06.755340 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.755548 kubelet[3624]: E0306 02:56:06.755540 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.755597 kubelet[3624]: W0306 02:56:06.755589 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.755635 kubelet[3624]: E0306 02:56:06.755626 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.755823 kubelet[3624]: E0306 02:56:06.755813 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.755916 kubelet[3624]: W0306 02:56:06.755869 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.755916 kubelet[3624]: E0306 02:56:06.755880 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.756178 kubelet[3624]: E0306 02:56:06.756106 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.756178 kubelet[3624]: W0306 02:56:06.756116 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.756178 kubelet[3624]: E0306 02:56:06.756124 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.756387 kubelet[3624]: E0306 02:56:06.756377 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.756496 kubelet[3624]: W0306 02:56:06.756440 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.756496 kubelet[3624]: E0306 02:56:06.756454 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.756708 kubelet[3624]: E0306 02:56:06.756677 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.756708 kubelet[3624]: W0306 02:56:06.756688 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.756708 kubelet[3624]: E0306 02:56:06.756696 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.757080 kubelet[3624]: E0306 02:56:06.757010 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.757080 kubelet[3624]: W0306 02:56:06.757021 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.757080 kubelet[3624]: E0306 02:56:06.757029 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.757405 kubelet[3624]: E0306 02:56:06.757288 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.757405 kubelet[3624]: W0306 02:56:06.757299 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.757405 kubelet[3624]: E0306 02:56:06.757308 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:06.757497 kubelet[3624]: E0306 02:56:06.757470 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.757497 kubelet[3624]: W0306 02:56:06.757479 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.757497 kubelet[3624]: E0306 02:56:06.757489 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:06.757963 kubelet[3624]: E0306 02:56:06.757921 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:06.757963 kubelet[3624]: W0306 02:56:06.757934 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:06.757963 kubelet[3624]: E0306 02:56:06.757943 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.683330 kubelet[3624]: I0306 02:56:07.683298 3624 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 6 02:56:07.737811 kubelet[3624]: E0306 02:56:07.737679 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.737811 kubelet[3624]: W0306 02:56:07.737704 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.737811 kubelet[3624]: E0306 02:56:07.737730 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.738158 kubelet[3624]: E0306 02:56:07.738059 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.738158 kubelet[3624]: W0306 02:56:07.738071 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.738158 kubelet[3624]: E0306 02:56:07.738081 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.738315 kubelet[3624]: E0306 02:56:07.738304 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.738366 kubelet[3624]: W0306 02:56:07.738356 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.738412 kubelet[3624]: E0306 02:56:07.738401 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.738602 kubelet[3624]: E0306 02:56:07.738590 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.738763 kubelet[3624]: W0306 02:56:07.738659 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.738763 kubelet[3624]: E0306 02:56:07.738673 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.738884 kubelet[3624]: E0306 02:56:07.738873 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.738990 kubelet[3624]: W0306 02:56:07.738971 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.739051 kubelet[3624]: E0306 02:56:07.739039 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.739246 kubelet[3624]: E0306 02:56:07.739236 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.990593 kubelet[3624]: W0306 02:56:07.739309 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.990593 kubelet[3624]: E0306 02:56:07.739322 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.990593 kubelet[3624]: E0306 02:56:07.739452 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.990593 kubelet[3624]: W0306 02:56:07.739460 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.990593 kubelet[3624]: E0306 02:56:07.739467 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.990593 kubelet[3624]: E0306 02:56:07.739611 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.990593 kubelet[3624]: W0306 02:56:07.739619 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.990593 kubelet[3624]: E0306 02:56:07.739629 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.990593 kubelet[3624]: E0306 02:56:07.739755 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.990593 kubelet[3624]: W0306 02:56:07.739763 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.990837 kubelet[3624]: E0306 02:56:07.739771 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.990837 kubelet[3624]: E0306 02:56:07.739881 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.990837 kubelet[3624]: W0306 02:56:07.739887 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.990837 kubelet[3624]: E0306 02:56:07.739894 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.990837 kubelet[3624]: E0306 02:56:07.740021 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.990837 kubelet[3624]: W0306 02:56:07.740027 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.990837 kubelet[3624]: E0306 02:56:07.740033 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.990837 kubelet[3624]: E0306 02:56:07.740137 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.990837 kubelet[3624]: W0306 02:56:07.740142 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.990837 kubelet[3624]: E0306 02:56:07.740149 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991002 kubelet[3624]: E0306 02:56:07.740256 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991002 kubelet[3624]: W0306 02:56:07.740261 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991002 kubelet[3624]: E0306 02:56:07.740267 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991002 kubelet[3624]: E0306 02:56:07.740373 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991002 kubelet[3624]: W0306 02:56:07.740379 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991002 kubelet[3624]: E0306 02:56:07.740384 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991002 kubelet[3624]: E0306 02:56:07.740478 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991002 kubelet[3624]: W0306 02:56:07.740483 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991002 kubelet[3624]: E0306 02:56:07.740488 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991002 kubelet[3624]: E0306 02:56:07.760923 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991153 kubelet[3624]: W0306 02:56:07.760944 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991153 kubelet[3624]: E0306 02:56:07.760961 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991153 kubelet[3624]: E0306 02:56:07.761106 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991153 kubelet[3624]: W0306 02:56:07.761113 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991153 kubelet[3624]: E0306 02:56:07.761120 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991153 kubelet[3624]: E0306 02:56:07.761316 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991153 kubelet[3624]: W0306 02:56:07.761330 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991153 kubelet[3624]: E0306 02:56:07.761340 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991153 kubelet[3624]: E0306 02:56:07.761480 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991153 kubelet[3624]: W0306 02:56:07.761488 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991299 kubelet[3624]: E0306 02:56:07.761494 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991299 kubelet[3624]: E0306 02:56:07.761635 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991299 kubelet[3624]: W0306 02:56:07.761642 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991299 kubelet[3624]: E0306 02:56:07.761648 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991299 kubelet[3624]: E0306 02:56:07.761785 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991299 kubelet[3624]: W0306 02:56:07.761791 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991299 kubelet[3624]: E0306 02:56:07.761797 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991299 kubelet[3624]: E0306 02:56:07.762070 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991299 kubelet[3624]: W0306 02:56:07.762077 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991299 kubelet[3624]: E0306 02:56:07.762084 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991438 kubelet[3624]: E0306 02:56:07.762233 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991438 kubelet[3624]: W0306 02:56:07.762241 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991438 kubelet[3624]: E0306 02:56:07.762250 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991438 kubelet[3624]: E0306 02:56:07.762370 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991438 kubelet[3624]: W0306 02:56:07.762376 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991438 kubelet[3624]: E0306 02:56:07.762382 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991438 kubelet[3624]: E0306 02:56:07.762491 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991438 kubelet[3624]: W0306 02:56:07.762496 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991438 kubelet[3624]: E0306 02:56:07.762502 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991438 kubelet[3624]: E0306 02:56:07.762626 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991574 kubelet[3624]: W0306 02:56:07.762632 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991574 kubelet[3624]: E0306 02:56:07.762638 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991574 kubelet[3624]: E0306 02:56:07.762810 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991574 kubelet[3624]: W0306 02:56:07.762819 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991574 kubelet[3624]: E0306 02:56:07.762827 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991574 kubelet[3624]: E0306 02:56:07.762975 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991574 kubelet[3624]: W0306 02:56:07.762981 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991574 kubelet[3624]: E0306 02:56:07.762988 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991574 kubelet[3624]: E0306 02:56:07.763115 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991574 kubelet[3624]: W0306 02:56:07.763121 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991706 kubelet[3624]: E0306 02:56:07.763127 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991706 kubelet[3624]: E0306 02:56:07.763239 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991706 kubelet[3624]: W0306 02:56:07.763244 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991706 kubelet[3624]: E0306 02:56:07.763250 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991706 kubelet[3624]: E0306 02:56:07.763384 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991706 kubelet[3624]: W0306 02:56:07.763393 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991706 kubelet[3624]: E0306 02:56:07.763401 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:07.991706 kubelet[3624]: E0306 02:56:07.763597 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991706 kubelet[3624]: W0306 02:56:07.763606 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991706 kubelet[3624]: E0306 02:56:07.763617 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:56:07.991845 kubelet[3624]: E0306 02:56:07.763794 3624 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:56:07.991845 kubelet[3624]: W0306 02:56:07.763803 3624 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:56:07.991845 kubelet[3624]: E0306 02:56:07.763813 3624 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:56:08.928603 kubelet[3624]: E0306 02:56:08.593498 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:09.929005 containerd[1920]: time="2026-03-06T02:56:09.928960700Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:09.990779 containerd[1920]: time="2026-03-06T02:56:09.990725463Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 6 02:56:10.037530 containerd[1920]: time="2026-03-06T02:56:10.037474490Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:10.042446 containerd[1920]: time="2026-03-06T02:56:10.042276692Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:10.042794 containerd[1920]: time="2026-03-06T02:56:10.042767916Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 4.55275603s" Mar 6 02:56:10.042888 containerd[1920]: time="2026-03-06T02:56:10.042875111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 6 02:56:10.088253 containerd[1920]: time="2026-03-06T02:56:10.088211205Z" level=info msg="CreateContainer within sandbox \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 6 02:56:10.232073 containerd[1920]: time="2026-03-06T02:56:10.232004909Z" level=info msg="Container b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:56:10.235428 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount78012337.mount: Deactivated successfully. 
Mar 6 02:56:10.385664 containerd[1920]: time="2026-03-06T02:56:10.385590192Z" level=info msg="CreateContainer within sandbox \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412\"" Mar 6 02:56:10.386366 containerd[1920]: time="2026-03-06T02:56:10.386325280Z" level=info msg="StartContainer for \"b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412\"" Mar 6 02:56:10.388169 containerd[1920]: time="2026-03-06T02:56:10.388101897Z" level=info msg="connecting to shim b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412" address="unix:///run/containerd/s/01380828731830108b2d238646fd09c6fd5ebfcb4416d7c7503f85898838f8e1" protocol=ttrpc version=3 Mar 6 02:56:10.408069 systemd[1]: Started cri-containerd-b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412.scope - libcontainer container b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412. Mar 6 02:56:10.463378 containerd[1920]: time="2026-03-06T02:56:10.463339876Z" level=info msg="StartContainer for \"b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412\" returns successfully" Mar 6 02:56:10.467761 systemd[1]: cri-containerd-b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412.scope: Deactivated successfully. Mar 6 02:56:10.472127 containerd[1920]: time="2026-03-06T02:56:10.472092877Z" level=info msg="received container exit event container_id:\"b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412\" id:\"b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412\" pid:4280 exited_at:{seconds:1772765770 nanos:470861605}" Mar 6 02:56:10.488158 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b512b20b8ce746fb54d60bfcdecbb43c921a7ac1ac97b20e736b1d3b31649412-rootfs.mount: Deactivated successfully. 
Mar 6 02:56:10.594670 kubelet[3624]: E0306 02:56:10.593600 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:12.169858 kubelet[3624]: I0306 02:56:12.169816 3624 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 6 02:56:13.232135 kubelet[3624]: E0306 02:56:12.592956 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:14.594430 kubelet[3624]: E0306 02:56:14.593989 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:16.593846 kubelet[3624]: E0306 02:56:16.593221 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:16.703627 containerd[1920]: time="2026-03-06T02:56:16.703557210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 6 02:56:18.593521 kubelet[3624]: E0306 02:56:18.593118 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:20.593213 kubelet[3624]: E0306 02:56:20.593163 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:22.594153 kubelet[3624]: E0306 02:56:22.594093 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:24.592896 kubelet[3624]: E0306 02:56:24.592850 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:25.503304 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2336318041.mount: Deactivated successfully. 
Mar 6 02:56:26.593672 kubelet[3624]: E0306 02:56:26.593011 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:28.593725 kubelet[3624]: E0306 02:56:28.593331 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:30.594480 kubelet[3624]: E0306 02:56:30.594150 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:31.589810 containerd[1920]: time="2026-03-06T02:56:31.589753553Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:31.635401 containerd[1920]: time="2026-03-06T02:56:31.635351148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 6 02:56:31.638580 containerd[1920]: time="2026-03-06T02:56:31.638542051Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:31.684894 containerd[1920]: time="2026-03-06T02:56:31.684844829Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:31.685706 containerd[1920]: time="2026-03-06T02:56:31.685676664Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 14.982049476s" Mar 6 02:56:31.685749 containerd[1920]: time="2026-03-06T02:56:31.685711649Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 6 02:56:31.732060 containerd[1920]: time="2026-03-06T02:56:31.732020259Z" level=info msg="CreateContainer within sandbox \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 6 02:56:31.895990 containerd[1920]: time="2026-03-06T02:56:31.895212858Z" level=info msg="Container 1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:56:32.049513 containerd[1920]: time="2026-03-06T02:56:32.049470984Z" level=info msg="CreateContainer within sandbox \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f\"" Mar 6 02:56:32.050317 containerd[1920]: time="2026-03-06T02:56:32.050257730Z" level=info msg="StartContainer for \"1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f\"" Mar 6 02:56:32.051436 containerd[1920]: time="2026-03-06T02:56:32.051409431Z" level=info msg="connecting to shim 
1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f" address="unix:///run/containerd/s/01380828731830108b2d238646fd09c6fd5ebfcb4416d7c7503f85898838f8e1" protocol=ttrpc version=3 Mar 6 02:56:32.072045 systemd[1]: Started cri-containerd-1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f.scope - libcontainer container 1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f. Mar 6 02:56:32.135960 containerd[1920]: time="2026-03-06T02:56:32.135914949Z" level=info msg="StartContainer for \"1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f\" returns successfully" Mar 6 02:56:32.161311 systemd[1]: cri-containerd-1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f.scope: Deactivated successfully. Mar 6 02:56:32.163817 containerd[1920]: time="2026-03-06T02:56:32.163720320Z" level=info msg="received container exit event container_id:\"1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f\" id:\"1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f\" pid:4338 exited_at:{seconds:1772765792 nanos:163295978}" Mar 6 02:56:32.180390 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1e7dad604031c7389dd4f3ffb933cdfdffd7a9db33f52ba268fc04d99227ad8f-rootfs.mount: Deactivated successfully. 
Mar 6 02:56:32.594230 kubelet[3624]: E0306 02:56:32.594136 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:37.384996 kubelet[3624]: E0306 02:56:34.593779 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:37.384996 kubelet[3624]: E0306 02:56:36.593169 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:38.593688 kubelet[3624]: E0306 02:56:38.593360 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:38.748764 containerd[1920]: time="2026-03-06T02:56:38.748714742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 6 02:56:40.594126 kubelet[3624]: E0306 02:56:40.593822 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:42.594216 kubelet[3624]: E0306 02:56:42.594169 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:44.594221 kubelet[3624]: E0306 02:56:44.594153 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:44.786268 containerd[1920]: time="2026-03-06T02:56:44.786203954Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:44.789212 containerd[1920]: time="2026-03-06T02:56:44.789176898Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 6 02:56:44.833934 containerd[1920]: time="2026-03-06T02:56:44.833688030Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:44.837663 containerd[1920]: time="2026-03-06T02:56:44.837618085Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:44.838374 containerd[1920]: time="2026-03-06T02:56:44.838259634Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id 
\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 6.089490683s" Mar 6 02:56:44.838374 containerd[1920]: time="2026-03-06T02:56:44.838283339Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 6 02:56:44.884192 containerd[1920]: time="2026-03-06T02:56:44.884050822Z" level=info msg="CreateContainer within sandbox \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 6 02:56:45.046050 containerd[1920]: time="2026-03-06T02:56:45.044607556Z" level=info msg="Container 701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:56:45.047151 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3019921202.mount: Deactivated successfully. 
Mar 6 02:56:45.193795 containerd[1920]: time="2026-03-06T02:56:45.193571657Z" level=info msg="CreateContainer within sandbox \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda\"" Mar 6 02:56:45.194896 containerd[1920]: time="2026-03-06T02:56:45.194212790Z" level=info msg="StartContainer for \"701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda\"" Mar 6 02:56:45.195660 containerd[1920]: time="2026-03-06T02:56:45.195637860Z" level=info msg="connecting to shim 701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda" address="unix:///run/containerd/s/01380828731830108b2d238646fd09c6fd5ebfcb4416d7c7503f85898838f8e1" protocol=ttrpc version=3 Mar 6 02:56:45.217049 systemd[1]: Started cri-containerd-701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda.scope - libcontainer container 701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda. 
Mar 6 02:56:45.274350 containerd[1920]: time="2026-03-06T02:56:45.274314283Z" level=info msg="StartContainer for \"701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda\" returns successfully" Mar 6 02:56:46.593853 kubelet[3624]: E0306 02:56:46.593795 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:48.594518 kubelet[3624]: E0306 02:56:48.594311 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:50.593745 kubelet[3624]: E0306 02:56:50.593551 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:52.593352 kubelet[3624]: E0306 02:56:52.593161 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:54.592769 kubelet[3624]: E0306 02:56:54.592644 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni 
plugin not initialized" pod="calico-system/csi-node-driver-5qfnx" podUID="a5bde228-2966-4175-a103-4afd465c6d9f" Mar 6 02:56:54.705050 containerd[1920]: time="2026-03-06T02:56:54.705001824Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 6 02:56:54.707342 systemd[1]: cri-containerd-701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda.scope: Deactivated successfully. Mar 6 02:56:54.708032 systemd[1]: cri-containerd-701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda.scope: Consumed 329ms CPU time, 190.8M memory peak, 171.3M written to disk. Mar 6 02:56:54.713372 containerd[1920]: time="2026-03-06T02:56:54.712951609Z" level=info msg="received container exit event container_id:\"701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda\" id:\"701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda\" pid:4394 exited_at:{seconds:1772765814 nanos:712721362}" Mar 6 02:56:54.714501 kubelet[3624]: I0306 02:56:54.714094 3624 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Mar 6 02:56:54.736450 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-701e3d8397e1caeb570496bdc1ba9aa92630dd90ba58c43480ee6b086399ecda-rootfs.mount: Deactivated successfully. Mar 6 02:56:55.064394 systemd[1]: Created slice kubepods-besteffort-pod5d0afbfb_0167_41c2_b3c3_176bd7181ce9.slice - libcontainer container kubepods-besteffort-pod5d0afbfb_0167_41c2_b3c3_176bd7181ce9.slice. Mar 6 02:56:55.071013 systemd[1]: Created slice kubepods-besteffort-pod89cc2055_c0e4_4f2d_baa5_a51a5bbbd9f6.slice - libcontainer container kubepods-besteffort-pod89cc2055_c0e4_4f2d_baa5_a51a5bbbd9f6.slice. 
Mar 6 02:56:55.083290 systemd[1]: Created slice kubepods-besteffort-pod9f8be92d_b77f_4be7_b306_4ecace67782f.slice - libcontainer container kubepods-besteffort-pod9f8be92d_b77f_4be7_b306_4ecace67782f.slice. Mar 6 02:56:55.092479 systemd[1]: Created slice kubepods-besteffort-pod9a2bd17b_a387_4762_bdc0_a8863bbccf80.slice - libcontainer container kubepods-besteffort-pod9a2bd17b_a387_4762_bdc0_a8863bbccf80.slice. Mar 6 02:56:55.099205 systemd[1]: Created slice kubepods-besteffort-pod6c90e71f_9e4c_4dc9_b065_b12cb8d9d954.slice - libcontainer container kubepods-besteffort-pod6c90e71f_9e4c_4dc9_b065_b12cb8d9d954.slice. Mar 6 02:56:55.105079 systemd[1]: Created slice kubepods-burstable-pod2e5652d3_fe74_46a9_ae0e_f644c51ba2ba.slice - libcontainer container kubepods-burstable-pod2e5652d3_fe74_46a9_ae0e_f644c51ba2ba.slice. Mar 6 02:56:55.110438 systemd[1]: Created slice kubepods-burstable-pod4523b3f9_467b_41e3_93e3_a2e8bb564f83.slice - libcontainer container kubepods-burstable-pod4523b3f9_467b_41e3_93e3_a2e8bb564f83.slice. 
Mar 6 02:56:55.149532 kubelet[3624]: I0306 02:56:55.149488 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95m2g\" (UniqueName: \"kubernetes.io/projected/9a2bd17b-a387-4762-bdc0-a8863bbccf80-kube-api-access-95m2g\") pod \"calico-apiserver-6856446c58-8x2gl\" (UID: \"9a2bd17b-a387-4762-bdc0-a8863bbccf80\") " pod="calico-system/calico-apiserver-6856446c58-8x2gl" Mar 6 02:56:55.149532 kubelet[3624]: I0306 02:56:55.149533 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6-tigera-ca-bundle\") pod \"calico-kube-controllers-8659d8567b-bwxtm\" (UID: \"89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6\") " pod="calico-system/calico-kube-controllers-8659d8567b-bwxtm" Mar 6 02:56:55.149745 kubelet[3624]: I0306 02:56:55.149548 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c90e71f-9e4c-4dc9-b065-b12cb8d9d954-config\") pod \"goldmane-cccfbd5cf-7p82v\" (UID: \"6c90e71f-9e4c-4dc9-b065-b12cb8d9d954\") " pod="calico-system/goldmane-cccfbd5cf-7p82v" Mar 6 02:56:55.149745 kubelet[3624]: I0306 02:56:55.149563 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/6c90e71f-9e4c-4dc9-b065-b12cb8d9d954-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-7p82v\" (UID: \"6c90e71f-9e4c-4dc9-b065-b12cb8d9d954\") " pod="calico-system/goldmane-cccfbd5cf-7p82v" Mar 6 02:56:55.149745 kubelet[3624]: I0306 02:56:55.149574 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnhv5\" (UniqueName: \"kubernetes.io/projected/89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6-kube-api-access-gnhv5\") pod \"calico-kube-controllers-8659d8567b-bwxtm\" (UID: 
\"89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6\") " pod="calico-system/calico-kube-controllers-8659d8567b-bwxtm" Mar 6 02:56:55.149745 kubelet[3624]: I0306 02:56:55.149591 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-whisker-ca-bundle\") pod \"whisker-5bc8b64dc-wf5ff\" (UID: \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\") " pod="calico-system/whisker-5bc8b64dc-wf5ff" Mar 6 02:56:55.149745 kubelet[3624]: I0306 02:56:55.149602 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88sbn\" (UniqueName: \"kubernetes.io/projected/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-kube-api-access-88sbn\") pod \"whisker-5bc8b64dc-wf5ff\" (UID: \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\") " pod="calico-system/whisker-5bc8b64dc-wf5ff" Mar 6 02:56:55.149837 kubelet[3624]: I0306 02:56:55.149620 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpcm2\" (UniqueName: \"kubernetes.io/projected/2e5652d3-fe74-46a9-ae0e-f644c51ba2ba-kube-api-access-lpcm2\") pod \"coredns-66bc5c9577-mz5wt\" (UID: \"2e5652d3-fe74-46a9-ae0e-f644c51ba2ba\") " pod="kube-system/coredns-66bc5c9577-mz5wt" Mar 6 02:56:55.149837 kubelet[3624]: I0306 02:56:55.149630 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8285\" (UniqueName: \"kubernetes.io/projected/4523b3f9-467b-41e3-93e3-a2e8bb564f83-kube-api-access-f8285\") pod \"coredns-66bc5c9577-gmmsw\" (UID: \"4523b3f9-467b-41e3-93e3-a2e8bb564f83\") " pod="kube-system/coredns-66bc5c9577-gmmsw" Mar 6 02:56:55.149837 kubelet[3624]: I0306 02:56:55.149645 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgbv6\" (UniqueName: 
\"kubernetes.io/projected/9f8be92d-b77f-4be7-b306-4ecace67782f-kube-api-access-wgbv6\") pod \"calico-apiserver-6856446c58-tcg5n\" (UID: \"9f8be92d-b77f-4be7-b306-4ecace67782f\") " pod="calico-system/calico-apiserver-6856446c58-tcg5n" Mar 6 02:56:55.149837 kubelet[3624]: I0306 02:56:55.149657 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-nginx-config\") pod \"whisker-5bc8b64dc-wf5ff\" (UID: \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\") " pod="calico-system/whisker-5bc8b64dc-wf5ff" Mar 6 02:56:55.149837 kubelet[3624]: I0306 02:56:55.149668 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4523b3f9-467b-41e3-93e3-a2e8bb564f83-config-volume\") pod \"coredns-66bc5c9577-gmmsw\" (UID: \"4523b3f9-467b-41e3-93e3-a2e8bb564f83\") " pod="kube-system/coredns-66bc5c9577-gmmsw" Mar 6 02:56:55.149929 kubelet[3624]: I0306 02:56:55.149689 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmv9k\" (UniqueName: \"kubernetes.io/projected/6c90e71f-9e4c-4dc9-b065-b12cb8d9d954-kube-api-access-rmv9k\") pod \"goldmane-cccfbd5cf-7p82v\" (UID: \"6c90e71f-9e4c-4dc9-b065-b12cb8d9d954\") " pod="calico-system/goldmane-cccfbd5cf-7p82v" Mar 6 02:56:55.149929 kubelet[3624]: I0306 02:56:55.149701 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e5652d3-fe74-46a9-ae0e-f644c51ba2ba-config-volume\") pod \"coredns-66bc5c9577-mz5wt\" (UID: \"2e5652d3-fe74-46a9-ae0e-f644c51ba2ba\") " pod="kube-system/coredns-66bc5c9577-mz5wt" Mar 6 02:56:55.149929 kubelet[3624]: I0306 02:56:55.149715 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9a2bd17b-a387-4762-bdc0-a8863bbccf80-calico-apiserver-certs\") pod \"calico-apiserver-6856446c58-8x2gl\" (UID: \"9a2bd17b-a387-4762-bdc0-a8863bbccf80\") " pod="calico-system/calico-apiserver-6856446c58-8x2gl" Mar 6 02:56:55.149929 kubelet[3624]: I0306 02:56:55.149732 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9f8be92d-b77f-4be7-b306-4ecace67782f-calico-apiserver-certs\") pod \"calico-apiserver-6856446c58-tcg5n\" (UID: \"9f8be92d-b77f-4be7-b306-4ecace67782f\") " pod="calico-system/calico-apiserver-6856446c58-tcg5n" Mar 6 02:56:55.149929 kubelet[3624]: I0306 02:56:55.149743 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-whisker-backend-key-pair\") pod \"whisker-5bc8b64dc-wf5ff\" (UID: \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\") " pod="calico-system/whisker-5bc8b64dc-wf5ff" Mar 6 02:56:55.150007 kubelet[3624]: I0306 02:56:55.149754 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c90e71f-9e4c-4dc9-b065-b12cb8d9d954-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-7p82v\" (UID: \"6c90e71f-9e4c-4dc9-b065-b12cb8d9d954\") " pod="calico-system/goldmane-cccfbd5cf-7p82v" Mar 6 02:56:55.382378 containerd[1920]: time="2026-03-06T02:56:55.382269400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bc8b64dc-wf5ff,Uid:5d0afbfb-0167-41c2-b3c3-176bd7181ce9,Namespace:calico-system,Attempt:0,}" Mar 6 02:56:55.387130 containerd[1920]: time="2026-03-06T02:56:55.387085164Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-8659d8567b-bwxtm,Uid:89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6,Namespace:calico-system,Attempt:0,}" Mar 6 02:56:55.394954 containerd[1920]: time="2026-03-06T02:56:55.394825999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6856446c58-tcg5n,Uid:9f8be92d-b77f-4be7-b306-4ecace67782f,Namespace:calico-system,Attempt:0,}" Mar 6 02:56:55.412076 containerd[1920]: time="2026-03-06T02:56:55.412043644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-7p82v,Uid:6c90e71f-9e4c-4dc9-b065-b12cb8d9d954,Namespace:calico-system,Attempt:0,}" Mar 6 02:56:55.419759 containerd[1920]: time="2026-03-06T02:56:55.419729444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6856446c58-8x2gl,Uid:9a2bd17b-a387-4762-bdc0-a8863bbccf80,Namespace:calico-system,Attempt:0,}" Mar 6 02:56:55.424161 containerd[1920]: time="2026-03-06T02:56:55.424104194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-gmmsw,Uid:4523b3f9-467b-41e3-93e3-a2e8bb564f83,Namespace:kube-system,Attempt:0,}" Mar 6 02:56:55.429497 containerd[1920]: time="2026-03-06T02:56:55.429449895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mz5wt,Uid:2e5652d3-fe74-46a9-ae0e-f644c51ba2ba,Namespace:kube-system,Attempt:0,}" Mar 6 02:56:55.482760 containerd[1920]: time="2026-03-06T02:56:55.482708026Z" level=error msg="Failed to destroy network for sandbox \"05fbae65809f6359f589464f1f7be1209d02924e5fd3eb57fa56f8c86e0d8b5c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.488598 containerd[1920]: time="2026-03-06T02:56:55.488544511Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bc8b64dc-wf5ff,Uid:5d0afbfb-0167-41c2-b3c3-176bd7181ce9,Namespace:calico-system,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"05fbae65809f6359f589464f1f7be1209d02924e5fd3eb57fa56f8c86e0d8b5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.491376 kubelet[3624]: E0306 02:56:55.491323 3624 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05fbae65809f6359f589464f1f7be1209d02924e5fd3eb57fa56f8c86e0d8b5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.491540 kubelet[3624]: E0306 02:56:55.491522 3624 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05fbae65809f6359f589464f1f7be1209d02924e5fd3eb57fa56f8c86e0d8b5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bc8b64dc-wf5ff" Mar 6 02:56:55.491606 kubelet[3624]: E0306 02:56:55.491592 3624 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05fbae65809f6359f589464f1f7be1209d02924e5fd3eb57fa56f8c86e0d8b5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bc8b64dc-wf5ff" Mar 6 02:56:55.491720 kubelet[3624]: E0306 02:56:55.491699 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5bc8b64dc-wf5ff_calico-system(5d0afbfb-0167-41c2-b3c3-176bd7181ce9)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5bc8b64dc-wf5ff_calico-system(5d0afbfb-0167-41c2-b3c3-176bd7181ce9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"05fbae65809f6359f589464f1f7be1209d02924e5fd3eb57fa56f8c86e0d8b5c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5bc8b64dc-wf5ff" podUID="5d0afbfb-0167-41c2-b3c3-176bd7181ce9" Mar 6 02:56:55.511334 containerd[1920]: time="2026-03-06T02:56:55.511271734Z" level=error msg="Failed to destroy network for sandbox \"352d5122f60974d2445518d5b292b8f20d74f51e590b1a69bf4e00dbd9936553\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.515407 containerd[1920]: time="2026-03-06T02:56:55.515274424Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8659d8567b-bwxtm,Uid:89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"352d5122f60974d2445518d5b292b8f20d74f51e590b1a69bf4e00dbd9936553\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.516151 kubelet[3624]: E0306 02:56:55.515606 3624 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"352d5122f60974d2445518d5b292b8f20d74f51e590b1a69bf4e00dbd9936553\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 
02:56:55.516151 kubelet[3624]: E0306 02:56:55.515661 3624 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"352d5122f60974d2445518d5b292b8f20d74f51e590b1a69bf4e00dbd9936553\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8659d8567b-bwxtm" Mar 6 02:56:55.516151 kubelet[3624]: E0306 02:56:55.515679 3624 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"352d5122f60974d2445518d5b292b8f20d74f51e590b1a69bf4e00dbd9936553\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8659d8567b-bwxtm" Mar 6 02:56:55.516233 kubelet[3624]: E0306 02:56:55.515823 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8659d8567b-bwxtm_calico-system(89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8659d8567b-bwxtm_calico-system(89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"352d5122f60974d2445518d5b292b8f20d74f51e590b1a69bf4e00dbd9936553\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8659d8567b-bwxtm" podUID="89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6" Mar 6 02:56:55.530191 containerd[1920]: time="2026-03-06T02:56:55.530147849Z" level=error msg="Failed to destroy network for sandbox 
\"3017127af5af31c108ae92a263050c5c732c3c50a7d1da9f7ba15a187e54a455\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.533043 containerd[1920]: time="2026-03-06T02:56:55.532963636Z" level=error msg="Failed to destroy network for sandbox \"5759fd1942ebeef1eddc99095f929ee825f2cce9d3ecb45ba3f05993ee6e0c2d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.534886 containerd[1920]: time="2026-03-06T02:56:55.534851697Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-7p82v,Uid:6c90e71f-9e4c-4dc9-b065-b12cb8d9d954,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3017127af5af31c108ae92a263050c5c732c3c50a7d1da9f7ba15a187e54a455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.535440 kubelet[3624]: E0306 02:56:55.535378 3624 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3017127af5af31c108ae92a263050c5c732c3c50a7d1da9f7ba15a187e54a455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.535522 kubelet[3624]: E0306 02:56:55.535429 3624 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3017127af5af31c108ae92a263050c5c732c3c50a7d1da9f7ba15a187e54a455\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-7p82v" Mar 6 02:56:55.535522 kubelet[3624]: E0306 02:56:55.535471 3624 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3017127af5af31c108ae92a263050c5c732c3c50a7d1da9f7ba15a187e54a455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-7p82v" Mar 6 02:56:55.537099 kubelet[3624]: E0306 02:56:55.535520 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-7p82v_calico-system(6c90e71f-9e4c-4dc9-b065-b12cb8d9d954)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-7p82v_calico-system(6c90e71f-9e4c-4dc9-b065-b12cb8d9d954)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3017127af5af31c108ae92a263050c5c732c3c50a7d1da9f7ba15a187e54a455\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-7p82v" podUID="6c90e71f-9e4c-4dc9-b065-b12cb8d9d954" Mar 6 02:56:55.538177 containerd[1920]: time="2026-03-06T02:56:55.538144636Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6856446c58-tcg5n,Uid:9f8be92d-b77f-4be7-b306-4ecace67782f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5759fd1942ebeef1eddc99095f929ee825f2cce9d3ecb45ba3f05993ee6e0c2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.538349 kubelet[3624]: E0306 02:56:55.538294 3624 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5759fd1942ebeef1eddc99095f929ee825f2cce9d3ecb45ba3f05993ee6e0c2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.538349 kubelet[3624]: E0306 02:56:55.538326 3624 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5759fd1942ebeef1eddc99095f929ee825f2cce9d3ecb45ba3f05993ee6e0c2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6856446c58-tcg5n" Mar 6 02:56:55.538349 kubelet[3624]: E0306 02:56:55.538339 3624 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5759fd1942ebeef1eddc99095f929ee825f2cce9d3ecb45ba3f05993ee6e0c2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6856446c58-tcg5n" Mar 6 02:56:55.538516 kubelet[3624]: E0306 02:56:55.538486 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6856446c58-tcg5n_calico-system(9f8be92d-b77f-4be7-b306-4ecace67782f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6856446c58-tcg5n_calico-system(9f8be92d-b77f-4be7-b306-4ecace67782f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"5759fd1942ebeef1eddc99095f929ee825f2cce9d3ecb45ba3f05993ee6e0c2d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-6856446c58-tcg5n" podUID="9f8be92d-b77f-4be7-b306-4ecace67782f" Mar 6 02:56:55.558912 containerd[1920]: time="2026-03-06T02:56:55.558864242Z" level=error msg="Failed to destroy network for sandbox \"ac39507d355e4303735bef9be47cbd554b527b40f24d23813553d06b09f3a782\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.563541 containerd[1920]: time="2026-03-06T02:56:55.563428326Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mz5wt,Uid:2e5652d3-fe74-46a9-ae0e-f644c51ba2ba,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac39507d355e4303735bef9be47cbd554b527b40f24d23813553d06b09f3a782\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.563928 kubelet[3624]: E0306 02:56:55.563661 3624 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac39507d355e4303735bef9be47cbd554b527b40f24d23813553d06b09f3a782\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.563928 kubelet[3624]: E0306 02:56:55.563708 3624 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ac39507d355e4303735bef9be47cbd554b527b40f24d23813553d06b09f3a782\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-mz5wt" Mar 6 02:56:55.563928 kubelet[3624]: E0306 02:56:55.563724 3624 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac39507d355e4303735bef9be47cbd554b527b40f24d23813553d06b09f3a782\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-mz5wt" Mar 6 02:56:55.565213 kubelet[3624]: E0306 02:56:55.563786 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-mz5wt_kube-system(2e5652d3-fe74-46a9-ae0e-f644c51ba2ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-mz5wt_kube-system(2e5652d3-fe74-46a9-ae0e-f644c51ba2ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac39507d355e4303735bef9be47cbd554b527b40f24d23813553d06b09f3a782\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-mz5wt" podUID="2e5652d3-fe74-46a9-ae0e-f644c51ba2ba" Mar 6 02:56:55.567096 containerd[1920]: time="2026-03-06T02:56:55.567018978Z" level=error msg="Failed to destroy network for sandbox \"05d5d4f01c5a0dff15bd1fa5275193b2e0ff43828f01286e2ec7c70fd75d97ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.570128 
containerd[1920]: time="2026-03-06T02:56:55.570084829Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-gmmsw,Uid:4523b3f9-467b-41e3-93e3-a2e8bb564f83,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d5d4f01c5a0dff15bd1fa5275193b2e0ff43828f01286e2ec7c70fd75d97ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.570329 kubelet[3624]: E0306 02:56:55.570292 3624 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d5d4f01c5a0dff15bd1fa5275193b2e0ff43828f01286e2ec7c70fd75d97ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.570373 kubelet[3624]: E0306 02:56:55.570339 3624 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d5d4f01c5a0dff15bd1fa5275193b2e0ff43828f01286e2ec7c70fd75d97ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-gmmsw" Mar 6 02:56:55.570373 kubelet[3624]: E0306 02:56:55.570358 3624 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d5d4f01c5a0dff15bd1fa5275193b2e0ff43828f01286e2ec7c70fd75d97ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-gmmsw" Mar 6 02:56:55.570434 
kubelet[3624]: E0306 02:56:55.570402 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-gmmsw_kube-system(4523b3f9-467b-41e3-93e3-a2e8bb564f83)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-gmmsw_kube-system(4523b3f9-467b-41e3-93e3-a2e8bb564f83)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"05d5d4f01c5a0dff15bd1fa5275193b2e0ff43828f01286e2ec7c70fd75d97ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-gmmsw" podUID="4523b3f9-467b-41e3-93e3-a2e8bb564f83" Mar 6 02:56:55.571228 containerd[1920]: time="2026-03-06T02:56:55.571191825Z" level=error msg="Failed to destroy network for sandbox \"3075d4801c19fa141220ac02c219391ea1cbe678f18713279f404e4cfa62ec35\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.575014 containerd[1920]: time="2026-03-06T02:56:55.574965363Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6856446c58-8x2gl,Uid:9a2bd17b-a387-4762-bdc0-a8863bbccf80,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3075d4801c19fa141220ac02c219391ea1cbe678f18713279f404e4cfa62ec35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.575320 kubelet[3624]: E0306 02:56:55.575285 3624 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"3075d4801c19fa141220ac02c219391ea1cbe678f18713279f404e4cfa62ec35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 02:56:55.575378 kubelet[3624]: E0306 02:56:55.575329 3624 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3075d4801c19fa141220ac02c219391ea1cbe678f18713279f404e4cfa62ec35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6856446c58-8x2gl" Mar 6 02:56:55.575378 kubelet[3624]: E0306 02:56:55.575344 3624 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3075d4801c19fa141220ac02c219391ea1cbe678f18713279f404e4cfa62ec35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6856446c58-8x2gl" Mar 6 02:56:55.575416 kubelet[3624]: E0306 02:56:55.575383 3624 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6856446c58-8x2gl_calico-system(9a2bd17b-a387-4762-bdc0-a8863bbccf80)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6856446c58-8x2gl_calico-system(9a2bd17b-a387-4762-bdc0-a8863bbccf80)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3075d4801c19fa141220ac02c219391ea1cbe678f18713279f404e4cfa62ec35\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-apiserver-6856446c58-8x2gl" podUID="9a2bd17b-a387-4762-bdc0-a8863bbccf80" Mar 6 02:56:55.797276 containerd[1920]: time="2026-03-06T02:56:55.797229962Z" level=info msg="CreateContainer within sandbox \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 6 02:56:55.816939 containerd[1920]: time="2026-03-06T02:56:55.814746057Z" level=info msg="Container ee9a619e5363b31b28384c4433d373d75f66789b097a4e05de72c63f25956b40: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:56:55.832960 containerd[1920]: time="2026-03-06T02:56:55.832892228Z" level=info msg="CreateContainer within sandbox \"0adedecc4a27799523e82de50375f8bdc6bec0818def2ec23647a5c55392e05c\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ee9a619e5363b31b28384c4433d373d75f66789b097a4e05de72c63f25956b40\"" Mar 6 02:56:55.833934 containerd[1920]: time="2026-03-06T02:56:55.833395972Z" level=info msg="StartContainer for \"ee9a619e5363b31b28384c4433d373d75f66789b097a4e05de72c63f25956b40\"" Mar 6 02:56:55.834713 containerd[1920]: time="2026-03-06T02:56:55.834686990Z" level=info msg="connecting to shim ee9a619e5363b31b28384c4433d373d75f66789b097a4e05de72c63f25956b40" address="unix:///run/containerd/s/01380828731830108b2d238646fd09c6fd5ebfcb4416d7c7503f85898838f8e1" protocol=ttrpc version=3 Mar 6 02:56:55.852067 systemd[1]: Started cri-containerd-ee9a619e5363b31b28384c4433d373d75f66789b097a4e05de72c63f25956b40.scope - libcontainer container ee9a619e5363b31b28384c4433d373d75f66789b097a4e05de72c63f25956b40. 
Mar 6 02:56:55.914690 containerd[1920]: time="2026-03-06T02:56:55.914634425Z" level=info msg="StartContainer for \"ee9a619e5363b31b28384c4433d373d75f66789b097a4e05de72c63f25956b40\" returns successfully" Mar 6 02:56:56.159269 kubelet[3624]: I0306 02:56:56.159067 3624 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88sbn\" (UniqueName: \"kubernetes.io/projected/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-kube-api-access-88sbn\") pod \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\" (UID: \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\") " Mar 6 02:56:56.160109 kubelet[3624]: I0306 02:56:56.159688 3624 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-whisker-ca-bundle\") pod \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\" (UID: \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\") " Mar 6 02:56:56.160109 kubelet[3624]: I0306 02:56:56.159711 3624 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-whisker-backend-key-pair\") pod \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\" (UID: \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\") " Mar 6 02:56:56.160109 kubelet[3624]: I0306 02:56:56.159981 3624 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-nginx-config\") pod \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\" (UID: \"5d0afbfb-0167-41c2-b3c3-176bd7181ce9\") " Mar 6 02:56:56.160456 kubelet[3624]: I0306 02:56:56.160256 3624 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "5d0afbfb-0167-41c2-b3c3-176bd7181ce9" (UID: "5d0afbfb-0167-41c2-b3c3-176bd7181ce9"). 
InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 6 02:56:56.160681 kubelet[3624]: I0306 02:56:56.160603 3624 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "5d0afbfb-0167-41c2-b3c3-176bd7181ce9" (UID: "5d0afbfb-0167-41c2-b3c3-176bd7181ce9"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 6 02:56:56.162774 kubelet[3624]: I0306 02:56:56.162752 3624 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "5d0afbfb-0167-41c2-b3c3-176bd7181ce9" (UID: "5d0afbfb-0167-41c2-b3c3-176bd7181ce9"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 6 02:56:56.162887 kubelet[3624]: I0306 02:56:56.162766 3624 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-kube-api-access-88sbn" (OuterVolumeSpecName: "kube-api-access-88sbn") pod "5d0afbfb-0167-41c2-b3c3-176bd7181ce9" (UID: "5d0afbfb-0167-41c2-b3c3-176bd7181ce9"). InnerVolumeSpecName "kube-api-access-88sbn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 6 02:56:56.260928 kubelet[3624]: I0306 02:56:56.260824 3624 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88sbn\" (UniqueName: \"kubernetes.io/projected/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-kube-api-access-88sbn\") on node \"ci-4459.2.3-n-bf8f1184ca\" DevicePath \"\"" Mar 6 02:56:56.260928 kubelet[3624]: I0306 02:56:56.260859 3624 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-whisker-ca-bundle\") on node \"ci-4459.2.3-n-bf8f1184ca\" DevicePath \"\"" Mar 6 02:56:56.260928 kubelet[3624]: I0306 02:56:56.260866 3624 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-whisker-backend-key-pair\") on node \"ci-4459.2.3-n-bf8f1184ca\" DevicePath \"\"" Mar 6 02:56:56.260928 kubelet[3624]: I0306 02:56:56.260873 3624 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/5d0afbfb-0167-41c2-b3c3-176bd7181ce9-nginx-config\") on node \"ci-4459.2.3-n-bf8f1184ca\" DevicePath \"\"" Mar 6 02:56:56.599361 systemd[1]: Created slice kubepods-besteffort-poda5bde228_2966_4175_a103_4afd465c6d9f.slice - libcontainer container kubepods-besteffort-poda5bde228_2966_4175_a103_4afd465c6d9f.slice. Mar 6 02:56:56.600206 systemd[1]: Removed slice kubepods-besteffort-pod5d0afbfb_0167_41c2_b3c3_176bd7181ce9.slice - libcontainer container kubepods-besteffort-pod5d0afbfb_0167_41c2_b3c3_176bd7181ce9.slice. 
Mar 6 02:56:56.609507 containerd[1920]: time="2026-03-06T02:56:56.609043236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5qfnx,Uid:a5bde228-2966-4175-a103-4afd465c6d9f,Namespace:calico-system,Attempt:0,}" Mar 6 02:56:56.736925 systemd[1]: var-lib-kubelet-pods-5d0afbfb\x2d0167\x2d41c2\x2db3c3\x2d176bd7181ce9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d88sbn.mount: Deactivated successfully. Mar 6 02:56:56.737022 systemd[1]: var-lib-kubelet-pods-5d0afbfb\x2d0167\x2d41c2\x2db3c3\x2d176bd7181ce9-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 6 02:56:56.779338 systemd-networkd[1485]: cali8b84400d75b: Link UP Mar 6 02:56:56.780794 systemd-networkd[1485]: cali8b84400d75b: Gained carrier Mar 6 02:56:56.804260 containerd[1920]: 2026-03-06 02:56:56.629 [ERROR][4677] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 02:56:56.804260 containerd[1920]: 2026-03-06 02:56:56.644 [INFO][4677] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0 csi-node-driver- calico-system a5bde228-2966-4175-a103-4afd465c6d9f 702 0 2026-03-06 02:56:00 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.2.3-n-bf8f1184ca csi-node-driver-5qfnx eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8b84400d75b [] [] }} ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Namespace="calico-system" Pod="csi-node-driver-5qfnx" 
WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-" Mar 6 02:56:56.804260 containerd[1920]: 2026-03-06 02:56:56.644 [INFO][4677] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Namespace="calico-system" Pod="csi-node-driver-5qfnx" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" Mar 6 02:56:56.804260 containerd[1920]: 2026-03-06 02:56:56.663 [INFO][4689] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" HandleID="k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.669 [INFO][4689] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" HandleID="k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-bf8f1184ca", "pod":"csi-node-driver-5qfnx", "timestamp":"2026-03-06 02:56:56.663630602 +0000 UTC"}, Hostname:"ci-4459.2.3-n-bf8f1184ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030cf20)} Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.669 [INFO][4689] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.669 [INFO][4689] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.669 [INFO][4689] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-bf8f1184ca' Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.672 [INFO][4689] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.676 [INFO][4689] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.680 [INFO][4689] ipam/ipam.go 526: Trying affinity for 192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.682 [INFO][4689] ipam/ipam.go 160: Attempting to load block cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804788 containerd[1920]: 2026-03-06 02:56:56.684 [INFO][4689] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804948 containerd[1920]: 2026-03-06 02:56:56.685 [INFO][4689] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.30.192/26 handle="k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804948 containerd[1920]: 2026-03-06 02:56:56.686 [INFO][4689] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61 Mar 6 02:56:56.804948 containerd[1920]: 2026-03-06 02:56:56.691 [INFO][4689] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.30.192/26 handle="k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804948 containerd[1920]: 2026-03-06 02:56:56.704 [INFO][4689] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.30.193/26] block=192.168.30.192/26 handle="k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804948 containerd[1920]: 2026-03-06 02:56:56.704 [INFO][4689] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.30.193/26] handle="k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:56.804948 containerd[1920]: 2026-03-06 02:56:56.704 [INFO][4689] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 02:56:56.804948 containerd[1920]: 2026-03-06 02:56:56.705 [INFO][4689] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.30.193/26] IPv6=[] ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" HandleID="k8s-pod-network.ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" Mar 6 02:56:56.805042 containerd[1920]: 2026-03-06 02:56:56.709 [INFO][4677] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Namespace="calico-system" Pod="csi-node-driver-5qfnx" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a5bde228-2966-4175-a103-4afd465c6d9f", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"", Pod:"csi-node-driver-5qfnx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.30.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8b84400d75b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:56:56.805081 containerd[1920]: 2026-03-06 02:56:56.710 [INFO][4677] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.193/32] ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Namespace="calico-system" Pod="csi-node-driver-5qfnx" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" Mar 6 02:56:56.805081 containerd[1920]: 2026-03-06 02:56:56.710 [INFO][4677] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8b84400d75b ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Namespace="calico-system" Pod="csi-node-driver-5qfnx" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" Mar 6 02:56:56.805081 containerd[1920]: 2026-03-06 02:56:56.781 [INFO][4677] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Namespace="calico-system" Pod="csi-node-driver-5qfnx" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" Mar 6 02:56:56.805125 containerd[1920]: 2026-03-06 02:56:56.782 
[INFO][4677] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Namespace="calico-system" Pod="csi-node-driver-5qfnx" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a5bde228-2966-4175-a103-4afd465c6d9f", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61", Pod:"csi-node-driver-5qfnx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.30.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8b84400d75b", MAC:"0a:62:8e:1f:eb:3b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:56:56.805159 containerd[1920]: 2026-03-06 02:56:56.798 [INFO][4677] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" Namespace="calico-system" Pod="csi-node-driver-5qfnx" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-csi--node--driver--5qfnx-eth0" Mar 6 02:56:56.835240 kubelet[3624]: I0306 02:56:56.835183 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-xqrr7" podStartSLOduration=13.536711726 podStartE2EDuration="56.835167488s" podCreationTimestamp="2026-03-06 02:56:00 +0000 UTC" firstStartedPulling="2026-03-06 02:56:01.540547736 +0000 UTC m=+23.051461459" lastFinishedPulling="2026-03-06 02:56:44.839003498 +0000 UTC m=+66.349917221" observedRunningTime="2026-03-06 02:56:56.818488933 +0000 UTC m=+78.329402680" watchObservedRunningTime="2026-03-06 02:56:56.835167488 +0000 UTC m=+78.346081211" Mar 6 02:56:56.862485 containerd[1920]: time="2026-03-06T02:56:56.861829375Z" level=info msg="connecting to shim ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61" address="unix:///run/containerd/s/781acc40a14bf1015d609471bef022781ba5061837e3d2e7c2c252ec281e314f" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:56:56.894058 systemd[1]: Started cri-containerd-ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61.scope - libcontainer container ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61. Mar 6 02:56:56.921600 systemd[1]: Created slice kubepods-besteffort-podcbf872b6_1f56_4844_ac43_48ca01ece614.slice - libcontainer container kubepods-besteffort-podcbf872b6_1f56_4844_ac43_48ca01ece614.slice. 
Mar 6 02:56:56.959616 containerd[1920]: time="2026-03-06T02:56:56.959575897Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5qfnx,Uid:a5bde228-2966-4175-a103-4afd465c6d9f,Namespace:calico-system,Attempt:0,} returns sandbox id \"ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61\"" Mar 6 02:56:56.961475 containerd[1920]: time="2026-03-06T02:56:56.961445910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 6 02:56:57.067164 kubelet[3624]: I0306 02:56:57.067004 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbf872b6-1f56-4844-ac43-48ca01ece614-whisker-ca-bundle\") pod \"whisker-6f48dd5f8d-p52s4\" (UID: \"cbf872b6-1f56-4844-ac43-48ca01ece614\") " pod="calico-system/whisker-6f48dd5f8d-p52s4" Mar 6 02:56:57.067164 kubelet[3624]: I0306 02:56:57.067115 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cbf872b6-1f56-4844-ac43-48ca01ece614-whisker-backend-key-pair\") pod \"whisker-6f48dd5f8d-p52s4\" (UID: \"cbf872b6-1f56-4844-ac43-48ca01ece614\") " pod="calico-system/whisker-6f48dd5f8d-p52s4" Mar 6 02:56:57.067164 kubelet[3624]: I0306 02:56:57.067172 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/cbf872b6-1f56-4844-ac43-48ca01ece614-nginx-config\") pod \"whisker-6f48dd5f8d-p52s4\" (UID: \"cbf872b6-1f56-4844-ac43-48ca01ece614\") " pod="calico-system/whisker-6f48dd5f8d-p52s4" Mar 6 02:56:57.067458 kubelet[3624]: I0306 02:56:57.067187 3624 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhvzf\" (UniqueName: \"kubernetes.io/projected/cbf872b6-1f56-4844-ac43-48ca01ece614-kube-api-access-vhvzf\") pod 
\"whisker-6f48dd5f8d-p52s4\" (UID: \"cbf872b6-1f56-4844-ac43-48ca01ece614\") " pod="calico-system/whisker-6f48dd5f8d-p52s4" Mar 6 02:56:57.233052 containerd[1920]: time="2026-03-06T02:56:57.233002144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f48dd5f8d-p52s4,Uid:cbf872b6-1f56-4844-ac43-48ca01ece614,Namespace:calico-system,Attempt:0,}" Mar 6 02:56:57.385019 systemd-networkd[1485]: cali21475f8483a: Link UP Mar 6 02:56:57.385176 systemd-networkd[1485]: cali21475f8483a: Gained carrier Mar 6 02:56:57.402438 containerd[1920]: 2026-03-06 02:56:57.277 [ERROR][4832] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 02:56:57.402438 containerd[1920]: 2026-03-06 02:56:57.288 [INFO][4832] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0 whisker-6f48dd5f8d- calico-system cbf872b6-1f56-4844-ac43-48ca01ece614 977 0 2026-03-06 02:56:56 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6f48dd5f8d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.2.3-n-bf8f1184ca whisker-6f48dd5f8d-p52s4 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali21475f8483a [] [] }} ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Namespace="calico-system" Pod="whisker-6f48dd5f8d-p52s4" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-" Mar 6 02:56:57.402438 containerd[1920]: 2026-03-06 02:56:57.289 [INFO][4832] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Namespace="calico-system" Pod="whisker-6f48dd5f8d-p52s4" 
WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" Mar 6 02:56:57.402438 containerd[1920]: 2026-03-06 02:56:57.333 [INFO][4876] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" HandleID="k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.340 [INFO][4876] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" HandleID="k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000380510), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-bf8f1184ca", "pod":"whisker-6f48dd5f8d-p52s4", "timestamp":"2026-03-06 02:56:57.33308963 +0000 UTC"}, Hostname:"ci-4459.2.3-n-bf8f1184ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000554dc0)} Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.340 [INFO][4876] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.340 [INFO][4876] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.341 [INFO][4876] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-bf8f1184ca' Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.345 [INFO][4876] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.350 [INFO][4876] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.354 [INFO][4876] ipam/ipam.go 526: Trying affinity for 192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.356 [INFO][4876] ipam/ipam.go 160: Attempting to load block cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402657 containerd[1920]: 2026-03-06 02:56:57.359 [INFO][4876] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402793 containerd[1920]: 2026-03-06 02:56:57.359 [INFO][4876] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.30.192/26 handle="k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402793 containerd[1920]: 2026-03-06 02:56:57.361 [INFO][4876] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf Mar 6 02:56:57.402793 containerd[1920]: 2026-03-06 02:56:57.366 [INFO][4876] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.30.192/26 handle="k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402793 containerd[1920]: 2026-03-06 02:56:57.374 [INFO][4876] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.30.194/26] block=192.168.30.192/26 handle="k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402793 containerd[1920]: 2026-03-06 02:56:57.374 [INFO][4876] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.30.194/26] handle="k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:56:57.402793 containerd[1920]: 2026-03-06 02:56:57.374 [INFO][4876] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 02:56:57.402793 containerd[1920]: 2026-03-06 02:56:57.374 [INFO][4876] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.30.194/26] IPv6=[] ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" HandleID="k8s-pod-network.b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" Mar 6 02:56:57.402886 containerd[1920]: 2026-03-06 02:56:57.378 [INFO][4832] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Namespace="calico-system" Pod="whisker-6f48dd5f8d-p52s4" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0", GenerateName:"whisker-6f48dd5f8d-", Namespace:"calico-system", SelfLink:"", UID:"cbf872b6-1f56-4844-ac43-48ca01ece614", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 56, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f48dd5f8d", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"", Pod:"whisker-6f48dd5f8d-p52s4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.30.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali21475f8483a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:56:57.402886 containerd[1920]: 2026-03-06 02:56:57.379 [INFO][4832] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.194/32] ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Namespace="calico-system" Pod="whisker-6f48dd5f8d-p52s4" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" Mar 6 02:56:57.404117 containerd[1920]: 2026-03-06 02:56:57.379 [INFO][4832] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali21475f8483a ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Namespace="calico-system" Pod="whisker-6f48dd5f8d-p52s4" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" Mar 6 02:56:57.404117 containerd[1920]: 2026-03-06 02:56:57.382 [INFO][4832] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Namespace="calico-system" Pod="whisker-6f48dd5f8d-p52s4" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" Mar 6 02:56:57.404321 containerd[1920]: 2026-03-06 02:56:57.383 [INFO][4832] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" Namespace="calico-system" Pod="whisker-6f48dd5f8d-p52s4" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0", GenerateName:"whisker-6f48dd5f8d-", Namespace:"calico-system", SelfLink:"", UID:"cbf872b6-1f56-4844-ac43-48ca01ece614", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 56, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f48dd5f8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf", Pod:"whisker-6f48dd5f8d-p52s4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.30.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali21475f8483a", MAC:"32:bf:a6:ed:64:84", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:56:57.404391 containerd[1920]: 2026-03-06 02:56:57.399 [INFO][4832] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" 
Namespace="calico-system" Pod="whisker-6f48dd5f8d-p52s4" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-whisker--6f48dd5f8d--p52s4-eth0" Mar 6 02:56:57.448044 containerd[1920]: time="2026-03-06T02:56:57.447976708Z" level=info msg="connecting to shim b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf" address="unix:///run/containerd/s/f3b1ea7eee82f5cf104818712dcac3b4b171a755ff40d1e354cdf9b933813828" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:56:57.481352 systemd[1]: Started cri-containerd-b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf.scope - libcontainer container b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf. Mar 6 02:56:57.543764 containerd[1920]: time="2026-03-06T02:56:57.543425684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f48dd5f8d-p52s4,Uid:cbf872b6-1f56-4844-ac43-48ca01ece614,Namespace:calico-system,Attempt:0,} returns sandbox id \"b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf\"" Mar 6 02:56:57.936927 systemd-networkd[1485]: vxlan.calico: Link UP Mar 6 02:56:57.936933 systemd-networkd[1485]: vxlan.calico: Gained carrier Mar 6 02:56:58.177881 containerd[1920]: time="2026-03-06T02:56:58.177025728Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:58.181874 containerd[1920]: time="2026-03-06T02:56:58.180885517Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 6 02:56:58.183920 containerd[1920]: time="2026-03-06T02:56:58.183843948Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:58.189082 containerd[1920]: time="2026-03-06T02:56:58.188636215Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:56:58.189082 containerd[1920]: time="2026-03-06T02:56:58.188924185Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.227422057s" Mar 6 02:56:58.189082 containerd[1920]: time="2026-03-06T02:56:58.188944745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 6 02:56:58.191179 containerd[1920]: time="2026-03-06T02:56:58.191100151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 6 02:56:58.199484 containerd[1920]: time="2026-03-06T02:56:58.199451989Z" level=info msg="CreateContainer within sandbox \"ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 6 02:56:58.224561 containerd[1920]: time="2026-03-06T02:56:58.224518344Z" level=info msg="Container 572a11f1a3b5d2ef88395186fc22e0f80657a25def97a0377f02ce40c852999a: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:56:58.245941 containerd[1920]: time="2026-03-06T02:56:58.245886860Z" level=info msg="CreateContainer within sandbox \"ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"572a11f1a3b5d2ef88395186fc22e0f80657a25def97a0377f02ce40c852999a\"" Mar 6 02:56:58.247211 containerd[1920]: time="2026-03-06T02:56:58.246960542Z" level=info msg="StartContainer for \"572a11f1a3b5d2ef88395186fc22e0f80657a25def97a0377f02ce40c852999a\"" Mar 6 
02:56:58.248690 containerd[1920]: time="2026-03-06T02:56:58.248665037Z" level=info msg="connecting to shim 572a11f1a3b5d2ef88395186fc22e0f80657a25def97a0377f02ce40c852999a" address="unix:///run/containerd/s/781acc40a14bf1015d609471bef022781ba5061837e3d2e7c2c252ec281e314f" protocol=ttrpc version=3 Mar 6 02:56:58.268392 systemd[1]: Started cri-containerd-572a11f1a3b5d2ef88395186fc22e0f80657a25def97a0377f02ce40c852999a.scope - libcontainer container 572a11f1a3b5d2ef88395186fc22e0f80657a25def97a0377f02ce40c852999a. Mar 6 02:56:58.335093 containerd[1920]: time="2026-03-06T02:56:58.335048729Z" level=info msg="StartContainer for \"572a11f1a3b5d2ef88395186fc22e0f80657a25def97a0377f02ce40c852999a\" returns successfully" Mar 6 02:56:58.595420 kubelet[3624]: I0306 02:56:58.595264 3624 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0afbfb-0167-41c2-b3c3-176bd7181ce9" path="/var/lib/kubelet/pods/5d0afbfb-0167-41c2-b3c3-176bd7181ce9/volumes" Mar 6 02:56:58.787065 systemd-networkd[1485]: cali8b84400d75b: Gained IPv6LL Mar 6 02:56:58.915053 systemd-networkd[1485]: cali21475f8483a: Gained IPv6LL Mar 6 02:56:59.555036 systemd-networkd[1485]: vxlan.calico: Gained IPv6LL Mar 6 02:57:00.441875 containerd[1920]: time="2026-03-06T02:57:00.441822331Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:00.490178 containerd[1920]: time="2026-03-06T02:57:00.490019137Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 6 02:57:00.538168 containerd[1920]: time="2026-03-06T02:57:00.538097035Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:00.585250 containerd[1920]: time="2026-03-06T02:57:00.585087617Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:00.586692 containerd[1920]: time="2026-03-06T02:57:00.586511815Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 2.395147544s" Mar 6 02:57:00.586692 containerd[1920]: time="2026-03-06T02:57:00.586542176Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 6 02:57:00.588965 containerd[1920]: time="2026-03-06T02:57:00.588860707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 6 02:57:00.635409 containerd[1920]: time="2026-03-06T02:57:00.635357642Z" level=info msg="CreateContainer within sandbox \"b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 6 02:57:00.797141 containerd[1920]: time="2026-03-06T02:57:00.797097767Z" level=info msg="Container 0e17e3322d5d84d3b811df94f6ec4162138d418ffd30cfaec093a4f6132ebd49: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:57:00.845567 update_engine[1869]: I20260306 02:57:00.845510 1869 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 6 02:57:00.845567 update_engine[1869]: I20260306 02:57:00.845559 1869 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 6 02:57:00.845961 update_engine[1869]: I20260306 02:57:00.845754 1869 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 6 02:57:00.846402 
update_engine[1869]: I20260306 02:57:00.846372 1869 omaha_request_params.cc:62] Current group set to stable Mar 6 02:57:00.846489 update_engine[1869]: I20260306 02:57:00.846472 1869 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 6 02:57:00.846489 update_engine[1869]: I20260306 02:57:00.846483 1869 update_attempter.cc:643] Scheduling an action processor start. Mar 6 02:57:00.846530 update_engine[1869]: I20260306 02:57:00.846500 1869 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 6 02:57:00.846544 update_engine[1869]: I20260306 02:57:00.846527 1869 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 6 02:57:00.846594 update_engine[1869]: I20260306 02:57:00.846580 1869 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 6 02:57:00.846594 update_engine[1869]: I20260306 02:57:00.846589 1869 omaha_request_action.cc:272] Request: Mar 6 02:57:00.846594 update_engine[1869]: Mar 6 02:57:00.846724 update_engine[1869]: I20260306 02:57:00.846594 1869 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 6 02:57:00.850201 update_engine[1869]: I20260306 02:57:00.850168 1869 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 6 02:57:00.850973 update_engine[1869]: I20260306 02:57:00.850780 1869 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 6 02:57:00.851137 locksmithd[1953]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 6 02:57:00.888057 update_engine[1869]: E20260306 02:57:00.887885 1869 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 6 02:57:00.888057 update_engine[1869]: I20260306 02:57:00.888024 1869 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 6 02:57:00.990699 containerd[1920]: time="2026-03-06T02:57:00.990649830Z" level=info msg="CreateContainer within sandbox \"b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0e17e3322d5d84d3b811df94f6ec4162138d418ffd30cfaec093a4f6132ebd49\"" Mar 6 02:57:00.991539 containerd[1920]: time="2026-03-06T02:57:00.991377973Z" level=info msg="StartContainer for \"0e17e3322d5d84d3b811df94f6ec4162138d418ffd30cfaec093a4f6132ebd49\"" Mar 6 02:57:00.992813 containerd[1920]: time="2026-03-06T02:57:00.992777043Z" level=info msg="connecting to shim 0e17e3322d5d84d3b811df94f6ec4162138d418ffd30cfaec093a4f6132ebd49" address="unix:///run/containerd/s/f3b1ea7eee82f5cf104818712dcac3b4b171a755ff40d1e354cdf9b933813828" protocol=ttrpc version=3 Mar 6 02:57:01.013035 systemd[1]: Started cri-containerd-0e17e3322d5d84d3b811df94f6ec4162138d418ffd30cfaec093a4f6132ebd49.scope - libcontainer container 0e17e3322d5d84d3b811df94f6ec4162138d418ffd30cfaec093a4f6132ebd49. 
Mar 6 02:57:01.090664 containerd[1920]: time="2026-03-06T02:57:01.090489608Z" level=info msg="StartContainer for \"0e17e3322d5d84d3b811df94f6ec4162138d418ffd30cfaec093a4f6132ebd49\" returns successfully" Mar 6 02:57:03.587045 containerd[1920]: time="2026-03-06T02:57:03.586986617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:03.634523 containerd[1920]: time="2026-03-06T02:57:03.634285041Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 6 02:57:03.639370 containerd[1920]: time="2026-03-06T02:57:03.639130383Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:03.742201 containerd[1920]: time="2026-03-06T02:57:03.742154817Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:03.742766 containerd[1920]: time="2026-03-06T02:57:03.742665090Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 3.153714523s" Mar 6 02:57:03.742766 containerd[1920]: time="2026-03-06T02:57:03.742692603Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 6 02:57:03.744265 containerd[1920]: 
time="2026-03-06T02:57:03.744070007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 6 02:57:03.791186 containerd[1920]: time="2026-03-06T02:57:03.791122263Z" level=info msg="CreateContainer within sandbox \"ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 6 02:57:03.939973 containerd[1920]: time="2026-03-06T02:57:03.938259451Z" level=info msg="Container 46734976cf5442b72cf1566ddb7bce44e1596970f0f60618225902d749b386dd: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:57:04.090888 containerd[1920]: time="2026-03-06T02:57:04.090845583Z" level=info msg="CreateContainer within sandbox \"ee3afd1d2aa0c9d7e909c68aa0d8ae505cd50fafd46522bab0169e095dd2dc61\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"46734976cf5442b72cf1566ddb7bce44e1596970f0f60618225902d749b386dd\"" Mar 6 02:57:04.091782 containerd[1920]: time="2026-03-06T02:57:04.091757485Z" level=info msg="StartContainer for \"46734976cf5442b72cf1566ddb7bce44e1596970f0f60618225902d749b386dd\"" Mar 6 02:57:04.093217 containerd[1920]: time="2026-03-06T02:57:04.093186587Z" level=info msg="connecting to shim 46734976cf5442b72cf1566ddb7bce44e1596970f0f60618225902d749b386dd" address="unix:///run/containerd/s/781acc40a14bf1015d609471bef022781ba5061837e3d2e7c2c252ec281e314f" protocol=ttrpc version=3 Mar 6 02:57:04.114040 systemd[1]: Started cri-containerd-46734976cf5442b72cf1566ddb7bce44e1596970f0f60618225902d749b386dd.scope - libcontainer container 46734976cf5442b72cf1566ddb7bce44e1596970f0f60618225902d749b386dd. 
Mar 6 02:57:04.186315 containerd[1920]: time="2026-03-06T02:57:04.186224473Z" level=info msg="StartContainer for \"46734976cf5442b72cf1566ddb7bce44e1596970f0f60618225902d749b386dd\" returns successfully" Mar 6 02:57:04.692450 kubelet[3624]: I0306 02:57:04.692322 3624 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 6 02:57:04.692450 kubelet[3624]: I0306 02:57:04.692365 3624 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 6 02:57:04.828474 kubelet[3624]: I0306 02:57:04.828394 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5qfnx" podStartSLOduration=58.045766852 podStartE2EDuration="1m4.828378348s" podCreationTimestamp="2026-03-06 02:56:00 +0000 UTC" firstStartedPulling="2026-03-06 02:56:56.961032401 +0000 UTC m=+78.471946124" lastFinishedPulling="2026-03-06 02:57:03.743643881 +0000 UTC m=+85.254557620" observedRunningTime="2026-03-06 02:57:04.827738103 +0000 UTC m=+86.338651826" watchObservedRunningTime="2026-03-06 02:57:04.828378348 +0000 UTC m=+86.339292071" Mar 6 02:57:06.601827 containerd[1920]: time="2026-03-06T02:57:06.601515213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-7p82v,Uid:6c90e71f-9e4c-4dc9-b065-b12cb8d9d954,Namespace:calico-system,Attempt:0,}" Mar 6 02:57:06.608331 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2481698500.mount: Deactivated successfully. 
Mar 6 02:57:06.871956 systemd-networkd[1485]: cali44874af92da: Link UP Mar 6 02:57:06.873351 systemd-networkd[1485]: cali44874af92da: Gained carrier Mar 6 02:57:06.937268 containerd[1920]: 2026-03-06 02:57:06.808 [INFO][5216] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0 goldmane-cccfbd5cf- calico-system 6c90e71f-9e4c-4dc9-b065-b12cb8d9d954 920 0 2026-03-06 02:55:59 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.2.3-n-bf8f1184ca goldmane-cccfbd5cf-7p82v eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali44874af92da [] [] }} ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7p82v" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-" Mar 6 02:57:06.937268 containerd[1920]: 2026-03-06 02:57:06.808 [INFO][5216] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7p82v" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" Mar 6 02:57:06.937268 containerd[1920]: 2026-03-06 02:57:06.826 [INFO][5228] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" HandleID="k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.832 [INFO][5228] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" HandleID="k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000273350), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-bf8f1184ca", "pod":"goldmane-cccfbd5cf-7p82v", "timestamp":"2026-03-06 02:57:06.826449183 +0000 UTC"}, Hostname:"ci-4459.2.3-n-bf8f1184ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400035af20)} Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.832 [INFO][5228] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.832 [INFO][5228] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.832 [INFO][5228] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-bf8f1184ca' Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.834 [INFO][5228] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.838 [INFO][5228] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.843 [INFO][5228] ipam/ipam.go 526: Trying affinity for 192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.845 [INFO][5228] ipam/ipam.go 160: Attempting to load block cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937472 containerd[1920]: 2026-03-06 02:57:06.848 [INFO][5228] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937614 containerd[1920]: 2026-03-06 02:57:06.848 [INFO][5228] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.30.192/26 handle="k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937614 containerd[1920]: 2026-03-06 02:57:06.850 [INFO][5228] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb Mar 6 02:57:06.937614 containerd[1920]: 2026-03-06 02:57:06.855 [INFO][5228] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.30.192/26 handle="k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937614 containerd[1920]: 2026-03-06 02:57:06.864 [INFO][5228] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.30.195/26] block=192.168.30.192/26 handle="k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937614 containerd[1920]: 2026-03-06 02:57:06.864 [INFO][5228] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.30.195/26] handle="k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:06.937614 containerd[1920]: 2026-03-06 02:57:06.864 [INFO][5228] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 02:57:06.937614 containerd[1920]: 2026-03-06 02:57:06.864 [INFO][5228] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.30.195/26] IPv6=[] ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" HandleID="k8s-pod-network.06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" Mar 6 02:57:06.937713 containerd[1920]: 2026-03-06 02:57:06.867 [INFO][5216] cni-plugin/k8s.go 418: Populated endpoint ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7p82v" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"6c90e71f-9e4c-4dc9-b065-b12cb8d9d954", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"", Pod:"goldmane-cccfbd5cf-7p82v", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.30.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali44874af92da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:06.937713 containerd[1920]: 2026-03-06 02:57:06.867 [INFO][5216] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.195/32] ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7p82v" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" Mar 6 02:57:06.937760 containerd[1920]: 2026-03-06 02:57:06.867 [INFO][5216] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali44874af92da ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7p82v" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" Mar 6 02:57:06.937760 containerd[1920]: 2026-03-06 02:57:06.874 [INFO][5216] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7p82v" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" Mar 6 02:57:06.937792 containerd[1920]: 2026-03-06 02:57:06.875 [INFO][5216] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7p82v" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"6c90e71f-9e4c-4dc9-b065-b12cb8d9d954", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb", Pod:"goldmane-cccfbd5cf-7p82v", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.30.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali44874af92da", MAC:"06:32:0f:fa:8c:52", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:06.937825 containerd[1920]: 2026-03-06 02:57:06.894 [INFO][5216] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" 
Namespace="calico-system" Pod="goldmane-cccfbd5cf-7p82v" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-goldmane--cccfbd5cf--7p82v-eth0" Mar 6 02:57:07.437795 containerd[1920]: time="2026-03-06T02:57:07.437742824Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:07.531483 containerd[1920]: time="2026-03-06T02:57:07.531425447Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 6 02:57:07.581502 containerd[1920]: time="2026-03-06T02:57:07.580958705Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:07.656137 containerd[1920]: time="2026-03-06T02:57:07.656089360Z" level=info msg="connecting to shim 06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb" address="unix:///run/containerd/s/563742fd09e20e34db699f9742b7c6bf894d3ce21972b639cb156e01d7ff45d3" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:57:07.673057 systemd[1]: Started cri-containerd-06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb.scope - libcontainer container 06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb. 
Mar 6 02:57:07.692386 containerd[1920]: time="2026-03-06T02:57:07.691895471Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:07.692760 containerd[1920]: time="2026-03-06T02:57:07.692659920Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 3.948562783s" Mar 6 02:57:07.692760 containerd[1920]: time="2026-03-06T02:57:07.692689809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 6 02:57:07.783882 containerd[1920]: time="2026-03-06T02:57:07.783836382Z" level=info msg="CreateContainer within sandbox \"b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 6 02:57:07.785226 containerd[1920]: time="2026-03-06T02:57:07.785201282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-7p82v,Uid:6c90e71f-9e4c-4dc9-b065-b12cb8d9d954,Namespace:calico-system,Attempt:0,} returns sandbox id \"06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb\"" Mar 6 02:57:07.788047 containerd[1920]: time="2026-03-06T02:57:07.788019717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 6 02:57:08.048194 containerd[1920]: time="2026-03-06T02:57:08.048138133Z" level=info msg="Container 856c35eeeae665220a838d48a6642d2709924e449ce2382c9378dba499c66892: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:57:08.051286 
systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount498150123.mount: Deactivated successfully. Mar 6 02:57:08.067068 systemd-networkd[1485]: cali44874af92da: Gained IPv6LL Mar 6 02:57:08.187852 containerd[1920]: time="2026-03-06T02:57:08.187587989Z" level=info msg="CreateContainer within sandbox \"b38b21c9b7a4782142821a2a7aff2041ec42420461670e3ffeee7c7863faebdf\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"856c35eeeae665220a838d48a6642d2709924e449ce2382c9378dba499c66892\"" Mar 6 02:57:08.190320 containerd[1920]: time="2026-03-06T02:57:08.189325254Z" level=info msg="StartContainer for \"856c35eeeae665220a838d48a6642d2709924e449ce2382c9378dba499c66892\"" Mar 6 02:57:08.190320 containerd[1920]: time="2026-03-06T02:57:08.190198234Z" level=info msg="connecting to shim 856c35eeeae665220a838d48a6642d2709924e449ce2382c9378dba499c66892" address="unix:///run/containerd/s/f3b1ea7eee82f5cf104818712dcac3b4b171a755ff40d1e354cdf9b933813828" protocol=ttrpc version=3 Mar 6 02:57:08.211061 systemd[1]: Started cri-containerd-856c35eeeae665220a838d48a6642d2709924e449ce2382c9378dba499c66892.scope - libcontainer container 856c35eeeae665220a838d48a6642d2709924e449ce2382c9378dba499c66892. 
Mar 6 02:57:08.291315 containerd[1920]: time="2026-03-06T02:57:08.291258184Z" level=info msg="StartContainer for \"856c35eeeae665220a838d48a6642d2709924e449ce2382c9378dba499c66892\" returns successfully" Mar 6 02:57:08.639437 containerd[1920]: time="2026-03-06T02:57:08.639398272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8659d8567b-bwxtm,Uid:89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6,Namespace:calico-system,Attempt:0,}" Mar 6 02:57:08.729266 containerd[1920]: time="2026-03-06T02:57:08.729055269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mz5wt,Uid:2e5652d3-fe74-46a9-ae0e-f644c51ba2ba,Namespace:kube-system,Attempt:0,}" Mar 6 02:57:08.840076 kubelet[3624]: I0306 02:57:08.839982 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6f48dd5f8d-p52s4" podStartSLOduration=2.69163069 podStartE2EDuration="12.839963665s" podCreationTimestamp="2026-03-06 02:56:56 +0000 UTC" firstStartedPulling="2026-03-06 02:56:57.545188989 +0000 UTC m=+79.056102712" lastFinishedPulling="2026-03-06 02:57:07.693521964 +0000 UTC m=+89.204435687" observedRunningTime="2026-03-06 02:57:08.838994178 +0000 UTC m=+90.349907901" watchObservedRunningTime="2026-03-06 02:57:08.839963665 +0000 UTC m=+90.350877388" Mar 6 02:57:09.067815 systemd-networkd[1485]: calib8482cd0ddd: Link UP Mar 6 02:57:09.068875 systemd-networkd[1485]: calib8482cd0ddd: Gained carrier Mar 6 02:57:09.087577 containerd[1920]: 2026-03-06 02:57:09.008 [INFO][5359] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0 calico-kube-controllers-8659d8567b- calico-system 89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6 918 0 2026-03-06 02:56:00 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8659d8567b 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.2.3-n-bf8f1184ca calico-kube-controllers-8659d8567b-bwxtm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib8482cd0ddd [] [] }} ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Namespace="calico-system" Pod="calico-kube-controllers-8659d8567b-bwxtm" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-" Mar 6 02:57:09.087577 containerd[1920]: 2026-03-06 02:57:09.008 [INFO][5359] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Namespace="calico-system" Pod="calico-kube-controllers-8659d8567b-bwxtm" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" Mar 6 02:57:09.087577 containerd[1920]: 2026-03-06 02:57:09.026 [INFO][5371] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" HandleID="k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.033 [INFO][5371] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" HandleID="k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-bf8f1184ca", "pod":"calico-kube-controllers-8659d8567b-bwxtm", 
"timestamp":"2026-03-06 02:57:09.02672882 +0000 UTC"}, Hostname:"ci-4459.2.3-n-bf8f1184ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000333080)} Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.033 [INFO][5371] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.033 [INFO][5371] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.033 [INFO][5371] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-bf8f1184ca' Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.036 [INFO][5371] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.040 [INFO][5371] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.044 [INFO][5371] ipam/ipam.go 526: Trying affinity for 192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.046 [INFO][5371] ipam/ipam.go 160: Attempting to load block cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.087946 containerd[1920]: 2026-03-06 02:57:09.048 [INFO][5371] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.088323 containerd[1920]: 2026-03-06 02:57:09.048 [INFO][5371] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.30.192/26 handle="k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" 
host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.088323 containerd[1920]: 2026-03-06 02:57:09.049 [INFO][5371] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a Mar 6 02:57:09.088323 containerd[1920]: 2026-03-06 02:57:09.054 [INFO][5371] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.30.192/26 handle="k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.088323 containerd[1920]: 2026-03-06 02:57:09.062 [INFO][5371] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.30.196/26] block=192.168.30.192/26 handle="k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.088323 containerd[1920]: 2026-03-06 02:57:09.062 [INFO][5371] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.30.196/26] handle="k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.088323 containerd[1920]: 2026-03-06 02:57:09.062 [INFO][5371] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 02:57:09.088323 containerd[1920]: 2026-03-06 02:57:09.062 [INFO][5371] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.30.196/26] IPv6=[] ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" HandleID="k8s-pod-network.21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" Mar 6 02:57:09.088466 containerd[1920]: 2026-03-06 02:57:09.064 [INFO][5359] cni-plugin/k8s.go 418: Populated endpoint ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Namespace="calico-system" Pod="calico-kube-controllers-8659d8567b-bwxtm" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0", GenerateName:"calico-kube-controllers-8659d8567b-", Namespace:"calico-system", SelfLink:"", UID:"89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8659d8567b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"", Pod:"calico-kube-controllers-8659d8567b-bwxtm", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.30.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib8482cd0ddd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:09.088524 containerd[1920]: 2026-03-06 02:57:09.064 [INFO][5359] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.196/32] ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Namespace="calico-system" Pod="calico-kube-controllers-8659d8567b-bwxtm" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" Mar 6 02:57:09.088524 containerd[1920]: 2026-03-06 02:57:09.064 [INFO][5359] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib8482cd0ddd ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Namespace="calico-system" Pod="calico-kube-controllers-8659d8567b-bwxtm" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" Mar 6 02:57:09.088524 containerd[1920]: 2026-03-06 02:57:09.069 [INFO][5359] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Namespace="calico-system" Pod="calico-kube-controllers-8659d8567b-bwxtm" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" Mar 6 02:57:09.088593 containerd[1920]: 2026-03-06 02:57:09.071 [INFO][5359] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Namespace="calico-system" Pod="calico-kube-controllers-8659d8567b-bwxtm" 
WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0", GenerateName:"calico-kube-controllers-8659d8567b-", Namespace:"calico-system", SelfLink:"", UID:"89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8659d8567b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a", Pod:"calico-kube-controllers-8659d8567b-bwxtm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.30.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib8482cd0ddd", MAC:"7e:9d:6c:0c:f1:de", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:09.088636 containerd[1920]: 2026-03-06 02:57:09.083 [INFO][5359] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" Namespace="calico-system" 
Pod="calico-kube-controllers-8659d8567b-bwxtm" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--kube--controllers--8659d8567b--bwxtm-eth0" Mar 6 02:57:09.239111 systemd-networkd[1485]: cali19d81106063: Link UP Mar 6 02:57:09.240181 systemd-networkd[1485]: cali19d81106063: Gained carrier Mar 6 02:57:09.264070 containerd[1920]: 2026-03-06 02:57:09.166 [INFO][5392] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0 coredns-66bc5c9577- kube-system 2e5652d3-fe74-46a9-ae0e-f644c51ba2ba 922 0 2026-03-06 02:55:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.3-n-bf8f1184ca coredns-66bc5c9577-mz5wt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali19d81106063 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Namespace="kube-system" Pod="coredns-66bc5c9577-mz5wt" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-" Mar 6 02:57:09.264070 containerd[1920]: 2026-03-06 02:57:09.166 [INFO][5392] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Namespace="kube-system" Pod="coredns-66bc5c9577-mz5wt" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" Mar 6 02:57:09.264070 containerd[1920]: 2026-03-06 02:57:09.194 [INFO][5405] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" HandleID="k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" 
Workload="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.201 [INFO][5405] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" HandleID="k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fb330), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.3-n-bf8f1184ca", "pod":"coredns-66bc5c9577-mz5wt", "timestamp":"2026-03-06 02:57:09.194859404 +0000 UTC"}, Hostname:"ci-4459.2.3-n-bf8f1184ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030ef20)} Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.202 [INFO][5405] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.202 [INFO][5405] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.202 [INFO][5405] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-bf8f1184ca' Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.205 [INFO][5405] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.209 [INFO][5405] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.214 [INFO][5405] ipam/ipam.go 526: Trying affinity for 192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.215 [INFO][5405] ipam/ipam.go 160: Attempting to load block cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264343 containerd[1920]: 2026-03-06 02:57:09.218 [INFO][5405] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264484 containerd[1920]: 2026-03-06 02:57:09.218 [INFO][5405] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.30.192/26 handle="k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264484 containerd[1920]: 2026-03-06 02:57:09.220 [INFO][5405] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645 Mar 6 02:57:09.264484 containerd[1920]: 2026-03-06 02:57:09.224 [INFO][5405] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.30.192/26 handle="k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264484 containerd[1920]: 2026-03-06 02:57:09.232 [INFO][5405] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.30.197/26] block=192.168.30.192/26 handle="k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264484 containerd[1920]: 2026-03-06 02:57:09.233 [INFO][5405] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.30.197/26] handle="k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:09.264484 containerd[1920]: 2026-03-06 02:57:09.233 [INFO][5405] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 02:57:09.264484 containerd[1920]: 2026-03-06 02:57:09.233 [INFO][5405] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.30.197/26] IPv6=[] ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" HandleID="k8s-pod-network.0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" Mar 6 02:57:09.264583 containerd[1920]: 2026-03-06 02:57:09.236 [INFO][5392] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Namespace="kube-system" Pod="coredns-66bc5c9577-mz5wt" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2e5652d3-fe74-46a9-ae0e-f644c51ba2ba", ResourceVersion:"922", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"", Pod:"coredns-66bc5c9577-mz5wt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.30.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali19d81106063", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:09.264583 containerd[1920]: 2026-03-06 02:57:09.236 [INFO][5392] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.197/32] ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Namespace="kube-system" Pod="coredns-66bc5c9577-mz5wt" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" Mar 6 02:57:09.264583 containerd[1920]: 2026-03-06 02:57:09.236 [INFO][5392] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali19d81106063 
ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Namespace="kube-system" Pod="coredns-66bc5c9577-mz5wt" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" Mar 6 02:57:09.264583 containerd[1920]: 2026-03-06 02:57:09.240 [INFO][5392] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Namespace="kube-system" Pod="coredns-66bc5c9577-mz5wt" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" Mar 6 02:57:09.264583 containerd[1920]: 2026-03-06 02:57:09.243 [INFO][5392] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Namespace="kube-system" Pod="coredns-66bc5c9577-mz5wt" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2e5652d3-fe74-46a9-ae0e-f644c51ba2ba", ResourceVersion:"922", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645", 
Pod:"coredns-66bc5c9577-mz5wt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.30.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali19d81106063", MAC:"9e:35:31:9a:6c:22", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:09.264715 containerd[1920]: 2026-03-06 02:57:09.262 [INFO][5392] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" Namespace="kube-system" Pod="coredns-66bc5c9577-mz5wt" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--mz5wt-eth0" Mar 6 02:57:09.691238 containerd[1920]: time="2026-03-06T02:57:09.691184366Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-gmmsw,Uid:4523b3f9-467b-41e3-93e3-a2e8bb564f83,Namespace:kube-system,Attempt:0,}" Mar 6 02:57:09.740507 containerd[1920]: time="2026-03-06T02:57:09.740326348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6856446c58-8x2gl,Uid:9a2bd17b-a387-4762-bdc0-a8863bbccf80,Namespace:calico-system,Attempt:0,}" Mar 6 
02:57:09.947634 containerd[1920]: time="2026-03-06T02:57:09.947283020Z" level=info msg="connecting to shim 21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a" address="unix:///run/containerd/s/e1aa92c7b3490c964bbef40406862fcf3a1818bc567b3afd845cc79bc65645e6" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:57:09.973063 systemd[1]: Started cri-containerd-21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a.scope - libcontainer container 21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a. Mar 6 02:57:10.285451 containerd[1920]: time="2026-03-06T02:57:10.285346127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8659d8567b-bwxtm,Uid:89cc2055-c0e4-4f2d-baa5-a51a5bbbd9f6,Namespace:calico-system,Attempt:0,} returns sandbox id \"21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a\"" Mar 6 02:57:10.434369 systemd-networkd[1485]: cali461ad646f9a: Link UP Mar 6 02:57:10.435486 systemd-networkd[1485]: cali461ad646f9a: Gained carrier Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.357 [INFO][5489] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0 coredns-66bc5c9577- kube-system 4523b3f9-467b-41e3-93e3-a2e8bb564f83 923 0 2026-03-06 02:55:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.3-n-bf8f1184ca coredns-66bc5c9577-gmmsw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali461ad646f9a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Namespace="kube-system" Pod="coredns-66bc5c9577-gmmsw" 
WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.357 [INFO][5489] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Namespace="kube-system" Pod="coredns-66bc5c9577-gmmsw" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.376 [INFO][5501] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" HandleID="k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.383 [INFO][5501] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" HandleID="k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.3-n-bf8f1184ca", "pod":"coredns-66bc5c9577-gmmsw", "timestamp":"2026-03-06 02:57:10.37642889 +0000 UTC"}, Hostname:"ci-4459.2.3-n-bf8f1184ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003aaf20)} Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.384 [INFO][5501] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.384 [INFO][5501] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.384 [INFO][5501] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-bf8f1184ca' Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.397 [INFO][5501] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.401 [INFO][5501] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.406 [INFO][5501] ipam/ipam.go 526: Trying affinity for 192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.409 [INFO][5501] ipam/ipam.go 160: Attempting to load block cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.411 [INFO][5501] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.411 [INFO][5501] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.30.192/26 handle="k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.413 [INFO][5501] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05 Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.419 [INFO][5501] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.30.192/26 handle="k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.428 [INFO][5501] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.30.198/26] block=192.168.30.192/26 handle="k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.428 [INFO][5501] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.30.198/26] handle="k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.429 [INFO][5501] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 02:57:10.459829 containerd[1920]: 2026-03-06 02:57:10.429 [INFO][5501] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.30.198/26] IPv6=[] ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" HandleID="k8s-pod-network.0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" Mar 6 02:57:10.460331 containerd[1920]: 2026-03-06 02:57:10.432 [INFO][5489] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Namespace="kube-system" Pod="coredns-66bc5c9577-gmmsw" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4523b3f9-467b-41e3-93e3-a2e8bb564f83", ResourceVersion:"923", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"", Pod:"coredns-66bc5c9577-gmmsw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.30.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali461ad646f9a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:10.460331 containerd[1920]: 2026-03-06 02:57:10.432 [INFO][5489] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.198/32] ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Namespace="kube-system" Pod="coredns-66bc5c9577-gmmsw" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" Mar 6 02:57:10.460331 containerd[1920]: 2026-03-06 02:57:10.432 [INFO][5489] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali461ad646f9a 
ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Namespace="kube-system" Pod="coredns-66bc5c9577-gmmsw" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" Mar 6 02:57:10.460331 containerd[1920]: 2026-03-06 02:57:10.435 [INFO][5489] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Namespace="kube-system" Pod="coredns-66bc5c9577-gmmsw" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" Mar 6 02:57:10.460331 containerd[1920]: 2026-03-06 02:57:10.439 [INFO][5489] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Namespace="kube-system" Pod="coredns-66bc5c9577-gmmsw" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4523b3f9-467b-41e3-93e3-a2e8bb564f83", ResourceVersion:"923", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05", 
Pod:"coredns-66bc5c9577-gmmsw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.30.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali461ad646f9a", MAC:"e6:1f:4b:96:dc:29", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:10.461273 containerd[1920]: 2026-03-06 02:57:10.454 [INFO][5489] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" Namespace="kube-system" Pod="coredns-66bc5c9577-gmmsw" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-coredns--66bc5c9577--gmmsw-eth0" Mar 6 02:57:10.536947 systemd-networkd[1485]: calibf23d880622: Link UP Mar 6 02:57:10.538522 systemd-networkd[1485]: calibf23d880622: Gained carrier Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.411 [INFO][5507] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0 calico-apiserver-6856446c58- calico-system 9a2bd17b-a387-4762-bdc0-a8863bbccf80 921 0 2026-03-06 
02:55:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6856446c58 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.3-n-bf8f1184ca calico-apiserver-6856446c58-8x2gl eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calibf23d880622 [] [] }} ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Namespace="calico-system" Pod="calico-apiserver-6856446c58-8x2gl" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.412 [INFO][5507] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Namespace="calico-system" Pod="calico-apiserver-6856446c58-8x2gl" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.440 [INFO][5520] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" HandleID="k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.454 [INFO][5520] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" HandleID="k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000273440), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4459.2.3-n-bf8f1184ca", "pod":"calico-apiserver-6856446c58-8x2gl", "timestamp":"2026-03-06 02:57:10.440750971 +0000 UTC"}, Hostname:"ci-4459.2.3-n-bf8f1184ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002e4f20)} Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.456 [INFO][5520] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.457 [INFO][5520] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.457 [INFO][5520] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-bf8f1184ca' Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.498 [INFO][5520] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.503 [INFO][5520] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.507 [INFO][5520] ipam/ipam.go 526: Trying affinity for 192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.509 [INFO][5520] ipam/ipam.go 160: Attempting to load block cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.511 [INFO][5520] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.511 [INFO][5520] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.30.192/26 
handle="k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.512 [INFO][5520] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6 Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.519 [INFO][5520] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.30.192/26 handle="k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.530 [INFO][5520] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.30.199/26] block=192.168.30.192/26 handle="k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.530 [INFO][5520] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.30.199/26] handle="k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.530 [INFO][5520] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 02:57:10.556017 containerd[1920]: 2026-03-06 02:57:10.530 [INFO][5520] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.30.199/26] IPv6=[] ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" HandleID="k8s-pod-network.742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" Mar 6 02:57:10.556836 containerd[1920]: 2026-03-06 02:57:10.532 [INFO][5507] cni-plugin/k8s.go 418: Populated endpoint ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Namespace="calico-system" Pod="calico-apiserver-6856446c58-8x2gl" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0", GenerateName:"calico-apiserver-6856446c58-", Namespace:"calico-system", SelfLink:"", UID:"9a2bd17b-a387-4762-bdc0-a8863bbccf80", ResourceVersion:"921", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6856446c58", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"", Pod:"calico-apiserver-6856446c58-8x2gl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.30.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calibf23d880622", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:10.556836 containerd[1920]: 2026-03-06 02:57:10.532 [INFO][5507] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.199/32] ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Namespace="calico-system" Pod="calico-apiserver-6856446c58-8x2gl" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" Mar 6 02:57:10.556836 containerd[1920]: 2026-03-06 02:57:10.533 [INFO][5507] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibf23d880622 ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Namespace="calico-system" Pod="calico-apiserver-6856446c58-8x2gl" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" Mar 6 02:57:10.556836 containerd[1920]: 2026-03-06 02:57:10.538 [INFO][5507] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Namespace="calico-system" Pod="calico-apiserver-6856446c58-8x2gl" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" Mar 6 02:57:10.556836 containerd[1920]: 2026-03-06 02:57:10.538 [INFO][5507] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Namespace="calico-system" Pod="calico-apiserver-6856446c58-8x2gl" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0", GenerateName:"calico-apiserver-6856446c58-", Namespace:"calico-system", SelfLink:"", UID:"9a2bd17b-a387-4762-bdc0-a8863bbccf80", ResourceVersion:"921", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6856446c58", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6", Pod:"calico-apiserver-6856446c58-8x2gl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calibf23d880622", MAC:"da:59:dd:bc:5c:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:10.556836 containerd[1920]: 2026-03-06 02:57:10.554 [INFO][5507] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" Namespace="calico-system" Pod="calico-apiserver-6856446c58-8x2gl" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--8x2gl-eth0" Mar 6 02:57:10.691109 systemd-networkd[1485]: calib8482cd0ddd: Gained IPv6LL Mar 6 02:57:10.844285 
update_engine[1869]: I20260306 02:57:10.843779 1869 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 6 02:57:10.845074 update_engine[1869]: I20260306 02:57:10.844626 1869 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 6 02:57:10.845306 update_engine[1869]: I20260306 02:57:10.845273 1869 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 6 02:57:10.948073 update_engine[1869]: E20260306 02:57:10.947919 1869 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 6 02:57:10.948073 update_engine[1869]: I20260306 02:57:10.948035 1869 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 6 02:57:11.011084 systemd-networkd[1485]: cali19d81106063: Gained IPv6LL Mar 6 02:57:11.715132 systemd-networkd[1485]: cali461ad646f9a: Gained IPv6LL Mar 6 02:57:12.355066 systemd-networkd[1485]: calibf23d880622: Gained IPv6LL Mar 6 02:57:13.838994 containerd[1920]: time="2026-03-06T02:57:13.838874543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6856446c58-tcg5n,Uid:9f8be92d-b77f-4be7-b306-4ecace67782f,Namespace:calico-system,Attempt:0,}" Mar 6 02:57:13.993322 containerd[1920]: time="2026-03-06T02:57:13.993278267Z" level=info msg="connecting to shim 0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645" address="unix:///run/containerd/s/326d3589315b5ead980cfb39e3987b5591bb2343fe757780426be1c2694dd141" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:57:14.014048 systemd[1]: Started cri-containerd-0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645.scope - libcontainer container 0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645. 
Mar 6 02:57:14.240888 containerd[1920]: time="2026-03-06T02:57:14.240790275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mz5wt,Uid:2e5652d3-fe74-46a9-ae0e-f644c51ba2ba,Namespace:kube-system,Attempt:0,} returns sandbox id \"0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645\"" Mar 6 02:57:14.336087 containerd[1920]: time="2026-03-06T02:57:14.335816126Z" level=info msg="CreateContainer within sandbox \"0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 6 02:57:14.336759 systemd-networkd[1485]: calif29696072b3: Link UP Mar 6 02:57:14.338974 systemd-networkd[1485]: calif29696072b3: Gained carrier Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.269 [INFO][5627] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0 calico-apiserver-6856446c58- calico-system 9f8be92d-b77f-4be7-b306-4ecace67782f 919 0 2026-03-06 02:55:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6856446c58 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.3-n-bf8f1184ca calico-apiserver-6856446c58-tcg5n eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calif29696072b3 [] [] }} ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Namespace="calico-system" Pod="calico-apiserver-6856446c58-tcg5n" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.269 [INFO][5627] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" 
Namespace="calico-system" Pod="calico-apiserver-6856446c58-tcg5n" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.290 [INFO][5640] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" HandleID="k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.296 [INFO][5640] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" HandleID="k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-bf8f1184ca", "pod":"calico-apiserver-6856446c58-tcg5n", "timestamp":"2026-03-06 02:57:14.290672649 +0000 UTC"}, Hostname:"ci-4459.2.3-n-bf8f1184ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000309080)} Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.296 [INFO][5640] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.296 [INFO][5640] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.296 [INFO][5640] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-bf8f1184ca' Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.298 [INFO][5640] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.303 [INFO][5640] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.307 [INFO][5640] ipam/ipam.go 526: Trying affinity for 192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.309 [INFO][5640] ipam/ipam.go 160: Attempting to load block cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.311 [INFO][5640] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.30.192/26 host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.311 [INFO][5640] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.30.192/26 handle="k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.312 [INFO][5640] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.319 [INFO][5640] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.30.192/26 handle="k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.327 [INFO][5640] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.30.200/26] block=192.168.30.192/26 handle="k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.328 [INFO][5640] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.30.200/26] handle="k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" host="ci-4459.2.3-n-bf8f1184ca" Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.328 [INFO][5640] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 02:57:14.356337 containerd[1920]: 2026-03-06 02:57:14.328 [INFO][5640] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.30.200/26] IPv6=[] ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" HandleID="k8s-pod-network.bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Workload="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" Mar 6 02:57:14.356852 containerd[1920]: 2026-03-06 02:57:14.330 [INFO][5627] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Namespace="calico-system" Pod="calico-apiserver-6856446c58-tcg5n" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0", GenerateName:"calico-apiserver-6856446c58-", Namespace:"calico-system", SelfLink:"", UID:"9f8be92d-b77f-4be7-b306-4ecace67782f", ResourceVersion:"919", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6856446c58", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"", Pod:"calico-apiserver-6856446c58-tcg5n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif29696072b3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:14.356852 containerd[1920]: 2026-03-06 02:57:14.330 [INFO][5627] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.200/32] ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Namespace="calico-system" Pod="calico-apiserver-6856446c58-tcg5n" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" Mar 6 02:57:14.356852 containerd[1920]: 2026-03-06 02:57:14.330 [INFO][5627] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif29696072b3 ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Namespace="calico-system" Pod="calico-apiserver-6856446c58-tcg5n" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" Mar 6 02:57:14.356852 containerd[1920]: 2026-03-06 02:57:14.339 [INFO][5627] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Namespace="calico-system" Pod="calico-apiserver-6856446c58-tcg5n" 
WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" Mar 6 02:57:14.356852 containerd[1920]: 2026-03-06 02:57:14.339 [INFO][5627] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Namespace="calico-system" Pod="calico-apiserver-6856446c58-tcg5n" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0", GenerateName:"calico-apiserver-6856446c58-", Namespace:"calico-system", SelfLink:"", UID:"9f8be92d-b77f-4be7-b306-4ecace67782f", ResourceVersion:"919", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 55, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6856446c58", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-bf8f1184ca", ContainerID:"bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a", Pod:"calico-apiserver-6856446c58-tcg5n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif29696072b3", MAC:"6e:f1:17:9e:ea:78", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 02:57:14.356852 containerd[1920]: 2026-03-06 02:57:14.353 [INFO][5627] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" Namespace="calico-system" Pod="calico-apiserver-6856446c58-tcg5n" WorkloadEndpoint="ci--4459.2.3--n--bf8f1184ca-k8s-calico--apiserver--6856446c58--tcg5n-eth0" Mar 6 02:57:16.003039 systemd-networkd[1485]: calif29696072b3: Gained IPv6LL Mar 6 02:57:16.401676 containerd[1920]: time="2026-03-06T02:57:16.400446973Z" level=info msg="Container 548663af0a6060bf5922e3fd0ed088b9eebec57fe2f678c2a307490a00c835f6: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:57:16.402208 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3138026994.mount: Deactivated successfully. Mar 6 02:57:16.550260 containerd[1920]: time="2026-03-06T02:57:16.550085690Z" level=info msg="connecting to shim 0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05" address="unix:///run/containerd/s/9b4a5a73d22eec1abb014dc863d83bd07e80236f68904bd27bf4daeea8105777" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:57:16.715061 systemd[1]: Started cri-containerd-0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05.scope - libcontainer container 0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05. 
Mar 6 02:57:16.730420 containerd[1920]: time="2026-03-06T02:57:16.729754956Z" level=info msg="connecting to shim 742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6" address="unix:///run/containerd/s/53bf4091f591b34bf648352efe3316095f4473f27abcdc0fd3cec4af16cce417" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:57:16.732661 containerd[1920]: time="2026-03-06T02:57:16.732625746Z" level=info msg="CreateContainer within sandbox \"0d442282bcb1086039b1cc6804896bca365d2580fa6bd01b1f7015f98727b645\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"548663af0a6060bf5922e3fd0ed088b9eebec57fe2f678c2a307490a00c835f6\"" Mar 6 02:57:16.733684 containerd[1920]: time="2026-03-06T02:57:16.733648515Z" level=info msg="StartContainer for \"548663af0a6060bf5922e3fd0ed088b9eebec57fe2f678c2a307490a00c835f6\"" Mar 6 02:57:16.737887 containerd[1920]: time="2026-03-06T02:57:16.737843188Z" level=info msg="connecting to shim 548663af0a6060bf5922e3fd0ed088b9eebec57fe2f678c2a307490a00c835f6" address="unix:///run/containerd/s/326d3589315b5ead980cfb39e3987b5591bb2343fe757780426be1c2694dd141" protocol=ttrpc version=3 Mar 6 02:57:16.747962 containerd[1920]: time="2026-03-06T02:57:16.747931573Z" level=info msg="connecting to shim bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a" address="unix:///run/containerd/s/793adc4ec435bd56963d14ae1eb1fbf50be8ebe9179d2503a5449e7af964bdc8" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:57:16.775073 systemd[1]: Started cri-containerd-742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6.scope - libcontainer container 742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6. Mar 6 02:57:16.787158 systemd[1]: Started cri-containerd-548663af0a6060bf5922e3fd0ed088b9eebec57fe2f678c2a307490a00c835f6.scope - libcontainer container 548663af0a6060bf5922e3fd0ed088b9eebec57fe2f678c2a307490a00c835f6. 
Mar 6 02:57:16.797066 containerd[1920]: time="2026-03-06T02:57:16.796979071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-gmmsw,Uid:4523b3f9-467b-41e3-93e3-a2e8bb564f83,Namespace:kube-system,Attempt:0,} returns sandbox id \"0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05\"" Mar 6 02:57:16.809799 systemd[1]: Started cri-containerd-bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a.scope - libcontainer container bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a. Mar 6 02:57:16.810224 containerd[1920]: time="2026-03-06T02:57:16.810123948Z" level=info msg="CreateContainer within sandbox \"0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 6 02:57:16.855584 containerd[1920]: time="2026-03-06T02:57:16.855518102Z" level=info msg="Container adedf3174ecfe58622632c508d86aba77c1f5c51261699a07faee115f9cf3d04: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:57:16.874965 containerd[1920]: time="2026-03-06T02:57:16.874940272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6856446c58-8x2gl,Uid:9a2bd17b-a387-4762-bdc0-a8863bbccf80,Namespace:calico-system,Attempt:0,} returns sandbox id \"742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6\"" Mar 6 02:57:16.882022 containerd[1920]: time="2026-03-06T02:57:16.881686636Z" level=info msg="StartContainer for \"548663af0a6060bf5922e3fd0ed088b9eebec57fe2f678c2a307490a00c835f6\" returns successfully" Mar 6 02:57:16.894911 containerd[1920]: time="2026-03-06T02:57:16.894832449Z" level=info msg="CreateContainer within sandbox \"0204cd5cc48d5bacce7bdc0804f600461185896eeb110408cfb3747c7c068b05\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"adedf3174ecfe58622632c508d86aba77c1f5c51261699a07faee115f9cf3d04\"" Mar 6 02:57:16.896497 containerd[1920]: time="2026-03-06T02:57:16.896017560Z" level=info msg="StartContainer for 
\"adedf3174ecfe58622632c508d86aba77c1f5c51261699a07faee115f9cf3d04\"" Mar 6 02:57:16.896977 containerd[1920]: time="2026-03-06T02:57:16.896954287Z" level=info msg="connecting to shim adedf3174ecfe58622632c508d86aba77c1f5c51261699a07faee115f9cf3d04" address="unix:///run/containerd/s/9b4a5a73d22eec1abb014dc863d83bd07e80236f68904bd27bf4daeea8105777" protocol=ttrpc version=3 Mar 6 02:57:16.917122 systemd[1]: Started cri-containerd-adedf3174ecfe58622632c508d86aba77c1f5c51261699a07faee115f9cf3d04.scope - libcontainer container adedf3174ecfe58622632c508d86aba77c1f5c51261699a07faee115f9cf3d04. Mar 6 02:57:16.924519 containerd[1920]: time="2026-03-06T02:57:16.924350989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6856446c58-tcg5n,Uid:9f8be92d-b77f-4be7-b306-4ecace67782f,Namespace:calico-system,Attempt:0,} returns sandbox id \"bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a\"" Mar 6 02:57:16.964968 containerd[1920]: time="2026-03-06T02:57:16.964912585Z" level=info msg="StartContainer for \"adedf3174ecfe58622632c508d86aba77c1f5c51261699a07faee115f9cf3d04\" returns successfully" Mar 6 02:57:17.397144 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1359415413.mount: Deactivated successfully. 
Mar 6 02:57:17.417949 containerd[1920]: time="2026-03-06T02:57:17.417880789Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:17.420823 containerd[1920]: time="2026-03-06T02:57:17.420793964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 6 02:57:17.425738 containerd[1920]: time="2026-03-06T02:57:17.425092361Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:17.429829 containerd[1920]: time="2026-03-06T02:57:17.429802842Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:17.430868 containerd[1920]: time="2026-03-06T02:57:17.430841860Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 9.642701475s" Mar 6 02:57:17.430977 containerd[1920]: time="2026-03-06T02:57:17.430962920Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 6 02:57:17.432655 containerd[1920]: time="2026-03-06T02:57:17.432610366Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 6 02:57:17.438437 containerd[1920]: time="2026-03-06T02:57:17.438404699Z" level=info msg="CreateContainer within sandbox 
\"06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 6 02:57:17.460225 containerd[1920]: time="2026-03-06T02:57:17.460187579Z" level=info msg="Container 7dc2abd43945aa0c25c1ac83d3cdac91178b02653fa145502164bc068fe6960e: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:57:17.480700 containerd[1920]: time="2026-03-06T02:57:17.480624622Z" level=info msg="CreateContainer within sandbox \"06bfe4ec94194962bdec94496140f4417bf935d5bd06d751649c7ba4e25fd3bb\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"7dc2abd43945aa0c25c1ac83d3cdac91178b02653fa145502164bc068fe6960e\"" Mar 6 02:57:17.481949 containerd[1920]: time="2026-03-06T02:57:17.481368854Z" level=info msg="StartContainer for \"7dc2abd43945aa0c25c1ac83d3cdac91178b02653fa145502164bc068fe6960e\"" Mar 6 02:57:17.483676 containerd[1920]: time="2026-03-06T02:57:17.483558814Z" level=info msg="connecting to shim 7dc2abd43945aa0c25c1ac83d3cdac91178b02653fa145502164bc068fe6960e" address="unix:///run/containerd/s/563742fd09e20e34db699f9742b7c6bf894d3ce21972b639cb156e01d7ff45d3" protocol=ttrpc version=3 Mar 6 02:57:17.504056 systemd[1]: Started cri-containerd-7dc2abd43945aa0c25c1ac83d3cdac91178b02653fa145502164bc068fe6960e.scope - libcontainer container 7dc2abd43945aa0c25c1ac83d3cdac91178b02653fa145502164bc068fe6960e. 
Mar 6 02:57:17.540924 containerd[1920]: time="2026-03-06T02:57:17.540571939Z" level=info msg="StartContainer for \"7dc2abd43945aa0c25c1ac83d3cdac91178b02653fa145502164bc068fe6960e\" returns successfully" Mar 6 02:57:17.915996 kubelet[3624]: I0306 02:57:17.915332 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-7p82v" podStartSLOduration=69.270648921 podStartE2EDuration="1m18.915316269s" podCreationTimestamp="2026-03-06 02:55:59 +0000 UTC" firstStartedPulling="2026-03-06 02:57:07.787043053 +0000 UTC m=+89.297956776" lastFinishedPulling="2026-03-06 02:57:17.431710401 +0000 UTC m=+98.942624124" observedRunningTime="2026-03-06 02:57:17.914524163 +0000 UTC m=+99.425437966" watchObservedRunningTime="2026-03-06 02:57:17.915316269 +0000 UTC m=+99.426229992" Mar 6 02:57:17.962011 kubelet[3624]: I0306 02:57:17.961802 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-mz5wt" podStartSLOduration=93.961785194 podStartE2EDuration="1m33.961785194s" podCreationTimestamp="2026-03-06 02:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:57:17.943720773 +0000 UTC m=+99.454634528" watchObservedRunningTime="2026-03-06 02:57:17.961785194 +0000 UTC m=+99.472698941" Mar 6 02:57:20.845943 update_engine[1869]: I20260306 02:57:20.845553 1869 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 6 02:57:20.845943 update_engine[1869]: I20260306 02:57:20.845651 1869 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 6 02:57:20.846892 update_engine[1869]: I20260306 02:57:20.846839 1869 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 6 02:57:20.884050 update_engine[1869]: E20260306 02:57:20.883874 1869 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 6 02:57:20.884050 update_engine[1869]: I20260306 02:57:20.884018 1869 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 6 02:57:25.740923 containerd[1920]: time="2026-03-06T02:57:25.740826759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:25.744322 containerd[1920]: time="2026-03-06T02:57:25.743919723Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 6 02:57:25.786688 containerd[1920]: time="2026-03-06T02:57:25.786637498Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:25.833704 containerd[1920]: time="2026-03-06T02:57:25.833649773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:57:25.834475 containerd[1920]: time="2026-03-06T02:57:25.834439110Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 8.40163089s" Mar 6 02:57:25.834530 containerd[1920]: time="2026-03-06T02:57:25.834477199Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference 
\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 6 02:57:25.835970 containerd[1920]: time="2026-03-06T02:57:25.835746993Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 6 02:57:25.989709 containerd[1920]: time="2026-03-06T02:57:25.989662448Z" level=info msg="CreateContainer within sandbox \"21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 6 02:57:26.149986 containerd[1920]: time="2026-03-06T02:57:26.146356610Z" level=info msg="Container 9ff0d257ee6f45aec349de6bc2218a52c1ac32de964d12b5737453001bef1c2a: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:57:26.285104 containerd[1920]: time="2026-03-06T02:57:26.285061485Z" level=info msg="CreateContainer within sandbox \"21cf6a99338f2e6545e085d5790b47658d927d92602001ff28458fb48eb0e94a\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9ff0d257ee6f45aec349de6bc2218a52c1ac32de964d12b5737453001bef1c2a\"" Mar 6 02:57:26.285975 containerd[1920]: time="2026-03-06T02:57:26.285943202Z" level=info msg="StartContainer for \"9ff0d257ee6f45aec349de6bc2218a52c1ac32de964d12b5737453001bef1c2a\"" Mar 6 02:57:26.287068 containerd[1920]: time="2026-03-06T02:57:26.287032701Z" level=info msg="connecting to shim 9ff0d257ee6f45aec349de6bc2218a52c1ac32de964d12b5737453001bef1c2a" address="unix:///run/containerd/s/e1aa92c7b3490c964bbef40406862fcf3a1818bc567b3afd845cc79bc65645e6" protocol=ttrpc version=3 Mar 6 02:57:26.304021 systemd[1]: Started cri-containerd-9ff0d257ee6f45aec349de6bc2218a52c1ac32de964d12b5737453001bef1c2a.scope - libcontainer container 9ff0d257ee6f45aec349de6bc2218a52c1ac32de964d12b5737453001bef1c2a. 
Mar 6 02:57:26.397871 containerd[1920]: time="2026-03-06T02:57:26.397833977Z" level=info msg="StartContainer for \"9ff0d257ee6f45aec349de6bc2218a52c1ac32de964d12b5737453001bef1c2a\" returns successfully" Mar 6 02:57:26.937366 kubelet[3624]: I0306 02:57:26.936875 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8659d8567b-bwxtm" podStartSLOduration=71.388180237 podStartE2EDuration="1m26.936863383s" podCreationTimestamp="2026-03-06 02:56:00 +0000 UTC" firstStartedPulling="2026-03-06 02:57:10.286780285 +0000 UTC m=+91.797694008" lastFinishedPulling="2026-03-06 02:57:25.835463431 +0000 UTC m=+107.346377154" observedRunningTime="2026-03-06 02:57:26.936663777 +0000 UTC m=+108.447577508" watchObservedRunningTime="2026-03-06 02:57:26.936863383 +0000 UTC m=+108.447777106" Mar 6 02:57:26.938646 kubelet[3624]: I0306 02:57:26.938339 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-gmmsw" podStartSLOduration=102.938328895 podStartE2EDuration="1m42.938328895s" podCreationTimestamp="2026-03-06 02:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:57:17.961982209 +0000 UTC m=+99.472895940" watchObservedRunningTime="2026-03-06 02:57:26.938328895 +0000 UTC m=+108.449242634" Mar 6 02:57:30.751557 systemd[1]: Started sshd@7-10.200.20.34:22-10.200.16.10:56648.service - OpenSSH per-connection server daemon (10.200.16.10:56648). Mar 6 02:57:30.845300 update_engine[1869]: I20260306 02:57:30.845232 1869 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 6 02:57:30.845651 update_engine[1869]: I20260306 02:57:30.845322 1869 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 6 02:57:30.845671 update_engine[1869]: I20260306 02:57:30.845649 1869 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 6 02:57:30.882307 update_engine[1869]: E20260306 02:57:30.882246 1869 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 6 02:57:30.882457 update_engine[1869]: I20260306 02:57:30.882352 1869 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 6 02:57:30.882457 update_engine[1869]: I20260306 02:57:30.882361 1869 omaha_request_action.cc:617] Omaha request response: Mar 6 02:57:30.882457 update_engine[1869]: E20260306 02:57:30.882452 1869 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 6 02:57:30.882505 update_engine[1869]: I20260306 02:57:30.882467 1869 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 6 02:57:30.882505 update_engine[1869]: I20260306 02:57:30.882470 1869 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 6 02:57:30.882505 update_engine[1869]: I20260306 02:57:30.882475 1869 update_attempter.cc:306] Processing Done. Mar 6 02:57:30.882505 update_engine[1869]: E20260306 02:57:30.882486 1869 update_attempter.cc:619] Update failed. Mar 6 02:57:30.882505 update_engine[1869]: I20260306 02:57:30.882490 1869 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 6 02:57:30.882505 update_engine[1869]: I20260306 02:57:30.882493 1869 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 6 02:57:30.882505 update_engine[1869]: I20260306 02:57:30.882496 1869 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Mar 6 02:57:30.882600 update_engine[1869]: I20260306 02:57:30.882562 1869 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 6 02:57:30.882600 update_engine[1869]: I20260306 02:57:30.882580 1869 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 6 02:57:30.882600 update_engine[1869]: I20260306 02:57:30.882583 1869 omaha_request_action.cc:272] Request:
Mar 6 02:57:30.882600 update_engine[1869]:
Mar 6 02:57:30.882600 update_engine[1869]:
Mar 6 02:57:30.882600 update_engine[1869]:
Mar 6 02:57:30.882600 update_engine[1869]:
Mar 6 02:57:30.882600 update_engine[1869]:
Mar 6 02:57:30.882600 update_engine[1869]:
Mar 6 02:57:30.882600 update_engine[1869]: I20260306 02:57:30.882588 1869 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 6 02:57:30.882721 update_engine[1869]: I20260306 02:57:30.882606 1869 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 6 02:57:30.883188 locksmithd[1953]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Mar 6 02:57:30.883417 update_engine[1869]: I20260306 02:57:30.883208 1869 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 6 02:57:30.888806 update_engine[1869]: E20260306 02:57:30.888585 1869 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 6 02:57:30.888806 update_engine[1869]: I20260306 02:57:30.888654 1869 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 6 02:57:30.888806 update_engine[1869]: I20260306 02:57:30.888660 1869 omaha_request_action.cc:617] Omaha request response:
Mar 6 02:57:30.888806 update_engine[1869]: I20260306 02:57:30.888666 1869 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 6 02:57:30.888806 update_engine[1869]: I20260306 02:57:30.888670 1869 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 6 02:57:30.888806 update_engine[1869]: I20260306 02:57:30.888674 1869 update_attempter.cc:306] Processing Done.
Mar 6 02:57:30.888806 update_engine[1869]: I20260306 02:57:30.888679 1869 update_attempter.cc:310] Error event sent.
Mar 6 02:57:30.888806 update_engine[1869]: I20260306 02:57:30.888686 1869 update_check_scheduler.cc:74] Next update check in 47m0s
Mar 6 02:57:30.889297 locksmithd[1953]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Mar 6 02:57:31.179092 sshd[6096]: Accepted publickey for core from 10.200.16.10 port 56648 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:57:31.182193 sshd-session[6096]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:57:31.186060 systemd-logind[1868]: New session 10 of user core.
Mar 6 02:57:31.191064 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 6 02:57:32.875980 sshd[6109]: Connection closed by 10.200.16.10 port 56648
Mar 6 02:57:32.876514 sshd-session[6096]: pam_unix(sshd:session): session closed for user core
Mar 6 02:57:32.880154 systemd[1]: sshd@7-10.200.20.34:22-10.200.16.10:56648.service: Deactivated successfully.
Mar 6 02:57:32.883277 systemd[1]: session-10.scope: Deactivated successfully.
Mar 6 02:57:32.884102 systemd-logind[1868]: Session 10 logged out. Waiting for processes to exit.
Mar 6 02:57:32.885895 systemd-logind[1868]: Removed session 10.
Mar 6 02:57:35.781031 containerd[1920]: time="2026-03-06T02:57:35.780497579Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:57:35.784434 containerd[1920]: time="2026-03-06T02:57:35.784408122Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315"
Mar 6 02:57:35.787477 containerd[1920]: time="2026-03-06T02:57:35.787441692Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:57:35.792939 containerd[1920]: time="2026-03-06T02:57:35.792278520Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:57:35.792939 containerd[1920]: time="2026-03-06T02:57:35.792732399Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 9.956958926s"
Mar 6 02:57:35.792939 containerd[1920]: time="2026-03-06T02:57:35.792752375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\""
Mar 6 02:57:35.794535 containerd[1920]: time="2026-03-06T02:57:35.793658173Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 6 02:57:35.801820 containerd[1920]: time="2026-03-06T02:57:35.801797852Z" level=info msg="CreateContainer within sandbox \"742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 6 02:57:35.823526 containerd[1920]: time="2026-03-06T02:57:35.823500586Z" level=info msg="Container 6d51d04f5634a6533925ee5aefea7727a6c1a9c3e3f233852b854488eaec0b46: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:57:35.851162 containerd[1920]: time="2026-03-06T02:57:35.851119327Z" level=info msg="CreateContainer within sandbox \"742f6223076b97fc0ef31084828979ce5f84728f1f1d09a3d85d7815dda2b4c6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6d51d04f5634a6533925ee5aefea7727a6c1a9c3e3f233852b854488eaec0b46\""
Mar 6 02:57:35.854496 containerd[1920]: time="2026-03-06T02:57:35.854464548Z" level=info msg="StartContainer for \"6d51d04f5634a6533925ee5aefea7727a6c1a9c3e3f233852b854488eaec0b46\""
Mar 6 02:57:35.855839 containerd[1920]: time="2026-03-06T02:57:35.855788606Z" level=info msg="connecting to shim 6d51d04f5634a6533925ee5aefea7727a6c1a9c3e3f233852b854488eaec0b46" address="unix:///run/containerd/s/53bf4091f591b34bf648352efe3316095f4473f27abcdc0fd3cec4af16cce417" protocol=ttrpc version=3
Mar 6 02:57:35.891049 systemd[1]: Started cri-containerd-6d51d04f5634a6533925ee5aefea7727a6c1a9c3e3f233852b854488eaec0b46.scope - libcontainer container 6d51d04f5634a6533925ee5aefea7727a6c1a9c3e3f233852b854488eaec0b46.
Mar 6 02:57:35.925183 containerd[1920]: time="2026-03-06T02:57:35.925111513Z" level=info msg="StartContainer for \"6d51d04f5634a6533925ee5aefea7727a6c1a9c3e3f233852b854488eaec0b46\" returns successfully"
Mar 6 02:57:35.956776 kubelet[3624]: I0306 02:57:35.955834 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-6856446c58-8x2gl" podStartSLOduration=78.042381386 podStartE2EDuration="1m36.955821258s" podCreationTimestamp="2026-03-06 02:55:59 +0000 UTC" firstStartedPulling="2026-03-06 02:57:16.880115009 +0000 UTC m=+98.391028732" lastFinishedPulling="2026-03-06 02:57:35.793554857 +0000 UTC m=+117.304468604" observedRunningTime="2026-03-06 02:57:35.954848395 +0000 UTC m=+117.465762126" watchObservedRunningTime="2026-03-06 02:57:35.955821258 +0000 UTC m=+117.466734981"
Mar 6 02:57:37.188296 containerd[1920]: time="2026-03-06T02:57:37.188244691Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:57:37.191930 containerd[1920]: time="2026-03-06T02:57:37.191635073Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77"
Mar 6 02:57:37.192816 containerd[1920]: time="2026-03-06T02:57:37.192793046Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 1.398478028s"
Mar 6 02:57:37.192955 containerd[1920]: time="2026-03-06T02:57:37.192939843Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\""
Mar 6 02:57:37.201070 containerd[1920]: time="2026-03-06T02:57:37.201047985Z" level=info msg="CreateContainer within sandbox \"bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 6 02:57:37.222081 containerd[1920]: time="2026-03-06T02:57:37.222049280Z" level=info msg="Container 6b5e3f248284c5eb994f5d6711886d21c06a1eacb1fc4c4fca71f4968aca6343: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:57:37.225225 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1633261718.mount: Deactivated successfully.
Mar 6 02:57:37.239910 containerd[1920]: time="2026-03-06T02:57:37.239862209Z" level=info msg="CreateContainer within sandbox \"bf0f9b32e4d9b21cd942c832bd2fe6b2abbdc07ecacc32e593b1e83155f2be1a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6b5e3f248284c5eb994f5d6711886d21c06a1eacb1fc4c4fca71f4968aca6343\""
Mar 6 02:57:37.241436 containerd[1920]: time="2026-03-06T02:57:37.240864033Z" level=info msg="StartContainer for \"6b5e3f248284c5eb994f5d6711886d21c06a1eacb1fc4c4fca71f4968aca6343\""
Mar 6 02:57:37.243225 containerd[1920]: time="2026-03-06T02:57:37.243190484Z" level=info msg="connecting to shim 6b5e3f248284c5eb994f5d6711886d21c06a1eacb1fc4c4fca71f4968aca6343" address="unix:///run/containerd/s/793adc4ec435bd56963d14ae1eb1fbf50be8ebe9179d2503a5449e7af964bdc8" protocol=ttrpc version=3
Mar 6 02:57:37.267031 systemd[1]: Started cri-containerd-6b5e3f248284c5eb994f5d6711886d21c06a1eacb1fc4c4fca71f4968aca6343.scope - libcontainer container 6b5e3f248284c5eb994f5d6711886d21c06a1eacb1fc4c4fca71f4968aca6343.
Mar 6 02:57:37.299351 containerd[1920]: time="2026-03-06T02:57:37.299306947Z" level=info msg="StartContainer for \"6b5e3f248284c5eb994f5d6711886d21c06a1eacb1fc4c4fca71f4968aca6343\" returns successfully"
Mar 6 02:57:37.958473 systemd[1]: Started sshd@8-10.200.20.34:22-10.200.16.10:56656.service - OpenSSH per-connection server daemon (10.200.16.10:56656).
Mar 6 02:57:38.385362 sshd[6212]: Accepted publickey for core from 10.200.16.10 port 56656 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:57:38.387944 sshd-session[6212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:57:38.393733 systemd-logind[1868]: New session 11 of user core.
Mar 6 02:57:38.398028 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 6 02:57:38.637716 kubelet[3624]: I0306 02:57:38.637247 3624 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-6856446c58-tcg5n" podStartSLOduration=79.370910973 podStartE2EDuration="1m39.637233825s" podCreationTimestamp="2026-03-06 02:55:59 +0000 UTC" firstStartedPulling="2026-03-06 02:57:16.927322102 +0000 UTC m=+98.438235833" lastFinishedPulling="2026-03-06 02:57:37.193644962 +0000 UTC m=+118.704558685" observedRunningTime="2026-03-06 02:57:37.967372525 +0000 UTC m=+119.478286248" watchObservedRunningTime="2026-03-06 02:57:38.637233825 +0000 UTC m=+120.148147548"
Mar 6 02:57:38.678420 sshd[6231]: Connection closed by 10.200.16.10 port 56656
Mar 6 02:57:38.678771 sshd-session[6212]: pam_unix(sshd:session): session closed for user core
Mar 6 02:57:38.681925 systemd[1]: sshd@8-10.200.20.34:22-10.200.16.10:56656.service: Deactivated successfully.
Mar 6 02:57:38.685563 systemd[1]: session-11.scope: Deactivated successfully.
Mar 6 02:57:38.686277 systemd-logind[1868]: Session 11 logged out. Waiting for processes to exit.
Mar 6 02:57:38.687653 systemd-logind[1868]: Removed session 11.
Mar 6 02:57:43.769172 systemd[1]: Started sshd@9-10.200.20.34:22-10.200.16.10:49446.service - OpenSSH per-connection server daemon (10.200.16.10:49446).
Mar 6 02:57:44.173088 sshd[6251]: Accepted publickey for core from 10.200.16.10 port 49446 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:57:44.174223 sshd-session[6251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:57:44.177692 systemd-logind[1868]: New session 12 of user core.
Mar 6 02:57:44.186077 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 6 02:57:44.438530 sshd[6254]: Connection closed by 10.200.16.10 port 49446
Mar 6 02:57:44.437540 sshd-session[6251]: pam_unix(sshd:session): session closed for user core
Mar 6 02:57:44.440660 systemd[1]: sshd@9-10.200.20.34:22-10.200.16.10:49446.service: Deactivated successfully.
Mar 6 02:57:44.442694 systemd[1]: session-12.scope: Deactivated successfully.
Mar 6 02:57:44.444822 systemd-logind[1868]: Session 12 logged out. Waiting for processes to exit.
Mar 6 02:57:44.446323 systemd-logind[1868]: Removed session 12.
Mar 6 02:57:49.510514 systemd[1]: Started sshd@10-10.200.20.34:22-10.200.16.10:49452.service - OpenSSH per-connection server daemon (10.200.16.10:49452).
Mar 6 02:57:49.871732 sshd[6296]: Accepted publickey for core from 10.200.16.10 port 49452 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:57:49.873153 sshd-session[6296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:57:49.877407 systemd-logind[1868]: New session 13 of user core.
Mar 6 02:57:49.883054 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 6 02:57:50.142765 sshd[6299]: Connection closed by 10.200.16.10 port 49452
Mar 6 02:57:50.143470 sshd-session[6296]: pam_unix(sshd:session): session closed for user core
Mar 6 02:57:50.147924 systemd[1]: sshd@10-10.200.20.34:22-10.200.16.10:49452.service: Deactivated successfully.
Mar 6 02:57:50.149550 systemd[1]: session-13.scope: Deactivated successfully.
Mar 6 02:57:50.150972 systemd-logind[1868]: Session 13 logged out. Waiting for processes to exit.
Mar 6 02:57:50.152273 systemd-logind[1868]: Removed session 13.
Mar 6 02:57:55.241790 systemd[1]: Started sshd@11-10.200.20.34:22-10.200.16.10:58056.service - OpenSSH per-connection server daemon (10.200.16.10:58056).
Mar 6 02:57:55.685800 sshd[6319]: Accepted publickey for core from 10.200.16.10 port 58056 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:57:55.687202 sshd-session[6319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:57:55.691958 systemd-logind[1868]: New session 14 of user core.
Mar 6 02:57:55.698043 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 6 02:57:55.976886 sshd[6367]: Connection closed by 10.200.16.10 port 58056
Mar 6 02:57:55.977508 sshd-session[6319]: pam_unix(sshd:session): session closed for user core
Mar 6 02:57:55.980785 systemd[1]: sshd@11-10.200.20.34:22-10.200.16.10:58056.service: Deactivated successfully.
Mar 6 02:57:55.983016 systemd[1]: session-14.scope: Deactivated successfully.
Mar 6 02:57:55.984330 systemd-logind[1868]: Session 14 logged out. Waiting for processes to exit.
Mar 6 02:57:55.987801 systemd-logind[1868]: Removed session 14.
Mar 6 02:58:01.064811 systemd[1]: Started sshd@12-10.200.20.34:22-10.200.16.10:48024.service - OpenSSH per-connection server daemon (10.200.16.10:48024).
Mar 6 02:58:01.491190 sshd[6423]: Accepted publickey for core from 10.200.16.10 port 48024 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:01.493684 sshd-session[6423]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:01.499492 systemd-logind[1868]: New session 15 of user core.
Mar 6 02:58:01.504041 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 6 02:58:01.968330 sshd[6426]: Connection closed by 10.200.16.10 port 48024
Mar 6 02:58:01.969497 sshd-session[6423]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:01.975304 systemd[1]: sshd@12-10.200.20.34:22-10.200.16.10:48024.service: Deactivated successfully.
Mar 6 02:58:01.975509 systemd-logind[1868]: Session 15 logged out. Waiting for processes to exit.
Mar 6 02:58:01.978950 systemd[1]: session-15.scope: Deactivated successfully.
Mar 6 02:58:01.981017 systemd-logind[1868]: Removed session 15.
Mar 6 02:58:07.058101 systemd[1]: Started sshd@13-10.200.20.34:22-10.200.16.10:48032.service - OpenSSH per-connection server daemon (10.200.16.10:48032).
Mar 6 02:58:07.476933 sshd[6438]: Accepted publickey for core from 10.200.16.10 port 48032 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:07.477869 sshd-session[6438]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:07.482230 systemd-logind[1868]: New session 16 of user core.
Mar 6 02:58:07.488050 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 6 02:58:07.762020 sshd[6441]: Connection closed by 10.200.16.10 port 48032
Mar 6 02:58:07.762698 sshd-session[6438]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:07.765893 systemd[1]: sshd@13-10.200.20.34:22-10.200.16.10:48032.service: Deactivated successfully.
Mar 6 02:58:07.767831 systemd[1]: session-16.scope: Deactivated successfully.
Mar 6 02:58:07.768622 systemd-logind[1868]: Session 16 logged out. Waiting for processes to exit.
Mar 6 02:58:07.769915 systemd-logind[1868]: Removed session 16.
Mar 6 02:58:12.834811 systemd[1]: Started sshd@14-10.200.20.34:22-10.200.16.10:37322.service - OpenSSH per-connection server daemon (10.200.16.10:37322).
Mar 6 02:58:13.210647 sshd[6454]: Accepted publickey for core from 10.200.16.10 port 37322 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:13.211778 sshd-session[6454]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:13.216086 systemd-logind[1868]: New session 17 of user core.
Mar 6 02:58:13.220038 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 6 02:58:13.461340 sshd[6457]: Connection closed by 10.200.16.10 port 37322
Mar 6 02:58:13.460545 sshd-session[6454]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:13.463664 systemd[1]: sshd@14-10.200.20.34:22-10.200.16.10:37322.service: Deactivated successfully.
Mar 6 02:58:13.465604 systemd[1]: session-17.scope: Deactivated successfully.
Mar 6 02:58:13.467603 systemd-logind[1868]: Session 17 logged out. Waiting for processes to exit.
Mar 6 02:58:13.469020 systemd-logind[1868]: Removed session 17.
Mar 6 02:58:18.539057 systemd[1]: Started sshd@15-10.200.20.34:22-10.200.16.10:37326.service - OpenSSH per-connection server daemon (10.200.16.10:37326).
Mar 6 02:58:18.909171 sshd[6488]: Accepted publickey for core from 10.200.16.10 port 37326 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:18.910296 sshd-session[6488]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:18.918769 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 6 02:58:18.919228 systemd-logind[1868]: New session 18 of user core.
Mar 6 02:58:19.153764 sshd[6513]: Connection closed by 10.200.16.10 port 37326
Mar 6 02:58:19.154377 sshd-session[6488]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:19.157591 systemd[1]: sshd@15-10.200.20.34:22-10.200.16.10:37326.service: Deactivated successfully.
Mar 6 02:58:19.160280 systemd[1]: session-18.scope: Deactivated successfully.
Mar 6 02:58:19.161170 systemd-logind[1868]: Session 18 logged out. Waiting for processes to exit.
Mar 6 02:58:19.162430 systemd-logind[1868]: Removed session 18.
Mar 6 02:58:19.242672 systemd[1]: Started sshd@16-10.200.20.34:22-10.200.16.10:37334.service - OpenSSH per-connection server daemon (10.200.16.10:37334).
Mar 6 02:58:19.642482 sshd[6528]: Accepted publickey for core from 10.200.16.10 port 37334 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:19.643545 sshd-session[6528]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:19.647251 systemd-logind[1868]: New session 19 of user core.
Mar 6 02:58:19.653194 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 6 02:58:19.930359 sshd[6531]: Connection closed by 10.200.16.10 port 37334
Mar 6 02:58:19.930414 sshd-session[6528]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:19.935104 systemd-logind[1868]: Session 19 logged out. Waiting for processes to exit.
Mar 6 02:58:19.935619 systemd[1]: sshd@16-10.200.20.34:22-10.200.16.10:37334.service: Deactivated successfully.
Mar 6 02:58:19.937339 systemd[1]: session-19.scope: Deactivated successfully.
Mar 6 02:58:19.939062 systemd-logind[1868]: Removed session 19.
Mar 6 02:58:20.018099 systemd[1]: Started sshd@17-10.200.20.34:22-10.200.16.10:41342.service - OpenSSH per-connection server daemon (10.200.16.10:41342).
Mar 6 02:58:20.430265 sshd[6541]: Accepted publickey for core from 10.200.16.10 port 41342 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:20.431472 sshd-session[6541]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:20.434952 systemd-logind[1868]: New session 20 of user core.
Mar 6 02:58:20.442017 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 6 02:58:20.707524 sshd[6544]: Connection closed by 10.200.16.10 port 41342
Mar 6 02:58:20.708068 sshd-session[6541]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:20.711336 systemd[1]: sshd@17-10.200.20.34:22-10.200.16.10:41342.service: Deactivated successfully.
Mar 6 02:58:20.714260 systemd[1]: session-20.scope: Deactivated successfully.
Mar 6 02:58:20.714873 systemd-logind[1868]: Session 20 logged out. Waiting for processes to exit.
Mar 6 02:58:20.716500 systemd-logind[1868]: Removed session 20.
Mar 6 02:58:25.805461 systemd[1]: Started sshd@18-10.200.20.34:22-10.200.16.10:41356.service - OpenSSH per-connection server daemon (10.200.16.10:41356).
Mar 6 02:58:26.254938 sshd[6577]: Accepted publickey for core from 10.200.16.10 port 41356 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:26.255757 sshd-session[6577]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:26.259616 systemd-logind[1868]: New session 21 of user core.
Mar 6 02:58:26.264033 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 6 02:58:26.548325 sshd[6580]: Connection closed by 10.200.16.10 port 41356
Mar 6 02:58:26.548865 sshd-session[6577]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:26.552452 systemd[1]: sshd@18-10.200.20.34:22-10.200.16.10:41356.service: Deactivated successfully.
Mar 6 02:58:26.554378 systemd[1]: session-21.scope: Deactivated successfully.
Mar 6 02:58:26.555805 systemd-logind[1868]: Session 21 logged out. Waiting for processes to exit.
Mar 6 02:58:26.558452 systemd-logind[1868]: Removed session 21.
Mar 6 02:58:31.625127 systemd[1]: Started sshd@19-10.200.20.34:22-10.200.16.10:49780.service - OpenSSH per-connection server daemon (10.200.16.10:49780).
Mar 6 02:58:32.008134 sshd[6640]: Accepted publickey for core from 10.200.16.10 port 49780 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:32.009292 sshd-session[6640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:32.015207 systemd-logind[1868]: New session 22 of user core.
Mar 6 02:58:32.020034 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 6 02:58:32.266588 sshd[6643]: Connection closed by 10.200.16.10 port 49780
Mar 6 02:58:32.266413 sshd-session[6640]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:32.272551 systemd[1]: sshd@19-10.200.20.34:22-10.200.16.10:49780.service: Deactivated successfully.
Mar 6 02:58:32.275460 systemd[1]: session-22.scope: Deactivated successfully.
Mar 6 02:58:32.278353 systemd-logind[1868]: Session 22 logged out. Waiting for processes to exit.
Mar 6 02:58:32.279971 systemd-logind[1868]: Removed session 22.
Mar 6 02:58:37.344137 systemd[1]: Started sshd@20-10.200.20.34:22-10.200.16.10:49790.service - OpenSSH per-connection server daemon (10.200.16.10:49790).
Mar 6 02:58:37.703847 sshd[6694]: Accepted publickey for core from 10.200.16.10 port 49790 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:37.704875 sshd-session[6694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:37.708542 systemd-logind[1868]: New session 23 of user core.
Mar 6 02:58:37.713026 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 6 02:58:37.939286 sshd[6697]: Connection closed by 10.200.16.10 port 49790
Mar 6 02:58:37.939878 sshd-session[6694]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:37.943111 systemd[1]: sshd@20-10.200.20.34:22-10.200.16.10:49790.service: Deactivated successfully.
Mar 6 02:58:37.944662 systemd[1]: session-23.scope: Deactivated successfully.
Mar 6 02:58:37.945425 systemd-logind[1868]: Session 23 logged out. Waiting for processes to exit.
Mar 6 02:58:37.946851 systemd-logind[1868]: Removed session 23.
Mar 6 02:58:43.022927 systemd[1]: Started sshd@21-10.200.20.34:22-10.200.16.10:37112.service - OpenSSH per-connection server daemon (10.200.16.10:37112).
Mar 6 02:58:43.417068 sshd[6715]: Accepted publickey for core from 10.200.16.10 port 37112 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:43.418087 sshd-session[6715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:43.421955 systemd-logind[1868]: New session 24 of user core.
Mar 6 02:58:43.426056 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 6 02:58:43.681431 sshd[6718]: Connection closed by 10.200.16.10 port 37112
Mar 6 02:58:43.681001 sshd-session[6715]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:43.684149 systemd-logind[1868]: Session 24 logged out. Waiting for processes to exit.
Mar 6 02:58:43.684542 systemd[1]: sshd@21-10.200.20.34:22-10.200.16.10:37112.service: Deactivated successfully.
Mar 6 02:58:43.686554 systemd[1]: session-24.scope: Deactivated successfully.
Mar 6 02:58:43.688875 systemd-logind[1868]: Removed session 24.
Mar 6 02:58:43.762523 systemd[1]: Started sshd@22-10.200.20.34:22-10.200.16.10:37114.service - OpenSSH per-connection server daemon (10.200.16.10:37114).
Mar 6 02:58:44.132322 sshd[6731]: Accepted publickey for core from 10.200.16.10 port 37114 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:44.133390 sshd-session[6731]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:44.137460 systemd-logind[1868]: New session 25 of user core.
Mar 6 02:58:44.141050 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 6 02:58:44.514513 sshd[6734]: Connection closed by 10.200.16.10 port 37114
Mar 6 02:58:44.591071 sshd-session[6731]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:44.595634 systemd[1]: sshd@22-10.200.20.34:22-10.200.16.10:37114.service: Deactivated successfully.
Mar 6 02:58:44.597585 systemd[1]: session-25.scope: Deactivated successfully.
Mar 6 02:58:44.598677 systemd-logind[1868]: Session 25 logged out. Waiting for processes to exit.
Mar 6 02:58:44.599775 systemd-logind[1868]: Removed session 25.
Mar 6 02:58:44.612130 systemd[1]: Started sshd@23-10.200.20.34:22-10.200.16.10:37124.service - OpenSSH per-connection server daemon (10.200.16.10:37124).
Mar 6 02:58:45.058847 sshd[6745]: Accepted publickey for core from 10.200.16.10 port 37124 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:45.060706 sshd-session[6745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:45.064418 systemd-logind[1868]: New session 26 of user core.
Mar 6 02:58:45.072485 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 6 02:58:45.865994 sshd[6749]: Connection closed by 10.200.16.10 port 37124
Mar 6 02:58:45.867136 sshd-session[6745]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:45.870963 systemd[1]: sshd@23-10.200.20.34:22-10.200.16.10:37124.service: Deactivated successfully.
Mar 6 02:58:45.873793 systemd[1]: session-26.scope: Deactivated successfully.
Mar 6 02:58:45.874722 systemd-logind[1868]: Session 26 logged out. Waiting for processes to exit.
Mar 6 02:58:45.876526 systemd-logind[1868]: Removed session 26.
Mar 6 02:58:45.944638 systemd[1]: Started sshd@24-10.200.20.34:22-10.200.16.10:37136.service - OpenSSH per-connection server daemon (10.200.16.10:37136).
Mar 6 02:58:46.361568 sshd[6780]: Accepted publickey for core from 10.200.16.10 port 37136 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:46.362626 sshd-session[6780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:46.366561 systemd-logind[1868]: New session 27 of user core.
Mar 6 02:58:46.375075 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 6 02:58:46.721933 sshd[6783]: Connection closed by 10.200.16.10 port 37136
Mar 6 02:58:46.722286 sshd-session[6780]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:46.726227 systemd-logind[1868]: Session 27 logged out. Waiting for processes to exit.
Mar 6 02:58:46.726930 systemd[1]: sshd@24-10.200.20.34:22-10.200.16.10:37136.service: Deactivated successfully.
Mar 6 02:58:46.731689 systemd[1]: session-27.scope: Deactivated successfully.
Mar 6 02:58:46.733414 systemd-logind[1868]: Removed session 27.
Mar 6 02:58:46.816058 systemd[1]: Started sshd@25-10.200.20.34:22-10.200.16.10:37146.service - OpenSSH per-connection server daemon (10.200.16.10:37146).
Mar 6 02:58:47.224046 sshd[6795]: Accepted publickey for core from 10.200.16.10 port 37146 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:47.225115 sshd-session[6795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:47.228896 systemd-logind[1868]: New session 28 of user core.
Mar 6 02:58:47.237053 systemd[1]: Started session-28.scope - Session 28 of User core.
Mar 6 02:58:47.484498 sshd[6798]: Connection closed by 10.200.16.10 port 37146
Mar 6 02:58:47.485042 sshd-session[6795]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:47.488668 systemd[1]: sshd@25-10.200.20.34:22-10.200.16.10:37146.service: Deactivated successfully.
Mar 6 02:58:47.490559 systemd[1]: session-28.scope: Deactivated successfully.
Mar 6 02:58:47.491848 systemd-logind[1868]: Session 28 logged out. Waiting for processes to exit.
Mar 6 02:58:47.492893 systemd-logind[1868]: Removed session 28.
Mar 6 02:58:52.564256 systemd[1]: Started sshd@26-10.200.20.34:22-10.200.16.10:34904.service - OpenSSH per-connection server daemon (10.200.16.10:34904).
Mar 6 02:58:52.924320 sshd[6829]: Accepted publickey for core from 10.200.16.10 port 34904 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:52.925372 sshd-session[6829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:52.930257 systemd-logind[1868]: New session 29 of user core.
Mar 6 02:58:52.937058 systemd[1]: Started session-29.scope - Session 29 of User core.
Mar 6 02:58:53.162981 sshd[6832]: Connection closed by 10.200.16.10 port 34904
Mar 6 02:58:53.163573 sshd-session[6829]: pam_unix(sshd:session): session closed for user core
Mar 6 02:58:53.167012 systemd-logind[1868]: Session 29 logged out. Waiting for processes to exit.
Mar 6 02:58:53.168021 systemd[1]: sshd@26-10.200.20.34:22-10.200.16.10:34904.service: Deactivated successfully.
Mar 6 02:58:53.170471 systemd[1]: session-29.scope: Deactivated successfully.
Mar 6 02:58:53.172730 systemd-logind[1868]: Removed session 29.
Mar 6 02:58:58.258077 systemd[1]: Started sshd@27-10.200.20.34:22-10.200.16.10:34918.service - OpenSSH per-connection server daemon (10.200.16.10:34918).
Mar 6 02:58:58.672274 sshd[6930]: Accepted publickey for core from 10.200.16.10 port 34918 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:58:58.671638 sshd-session[6930]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:58.678054 systemd-logind[1868]: New session 30 of user core.
Mar 6 02:58:58.683036 systemd[1]: Started session-30.scope - Session 30 of User core.
Mar 6 02:58:58.939000 sshd[6933]: Connection closed by 10.200.16.10 port 34918 Mar 6 02:58:58.938051 sshd-session[6930]: pam_unix(sshd:session): session closed for user core Mar 6 02:58:58.941622 systemd[1]: sshd@27-10.200.20.34:22-10.200.16.10:34918.service: Deactivated successfully. Mar 6 02:58:58.944312 systemd[1]: session-30.scope: Deactivated successfully. Mar 6 02:58:58.945538 systemd-logind[1868]: Session 30 logged out. Waiting for processes to exit. Mar 6 02:58:58.946974 systemd-logind[1868]: Removed session 30. Mar 6 02:59:04.028114 systemd[1]: Started sshd@28-10.200.20.34:22-10.200.16.10:50324.service - OpenSSH per-connection server daemon (10.200.16.10:50324). Mar 6 02:59:04.445992 sshd[6947]: Accepted publickey for core from 10.200.16.10 port 50324 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:04.446735 sshd-session[6947]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:04.450327 systemd-logind[1868]: New session 31 of user core. Mar 6 02:59:04.457067 systemd[1]: Started session-31.scope - Session 31 of User core. Mar 6 02:59:04.715946 sshd[6950]: Connection closed by 10.200.16.10 port 50324 Mar 6 02:59:04.716472 sshd-session[6947]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:04.719599 systemd[1]: sshd@28-10.200.20.34:22-10.200.16.10:50324.service: Deactivated successfully. Mar 6 02:59:04.721748 systemd[1]: session-31.scope: Deactivated successfully. Mar 6 02:59:04.722558 systemd-logind[1868]: Session 31 logged out. Waiting for processes to exit. Mar 6 02:59:04.724400 systemd-logind[1868]: Removed session 31. Mar 6 02:59:09.784196 systemd[1]: Started sshd@29-10.200.20.34:22-10.200.16.10:50326.service - OpenSSH per-connection server daemon (10.200.16.10:50326). 
Mar 6 02:59:10.147766 sshd[6962]: Accepted publickey for core from 10.200.16.10 port 50326 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:10.148547 sshd-session[6962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:10.153693 systemd-logind[1868]: New session 32 of user core. Mar 6 02:59:10.157137 systemd[1]: Started session-32.scope - Session 32 of User core. Mar 6 02:59:10.384081 sshd[6965]: Connection closed by 10.200.16.10 port 50326 Mar 6 02:59:10.383990 sshd-session[6962]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:10.388298 systemd[1]: sshd@29-10.200.20.34:22-10.200.16.10:50326.service: Deactivated successfully. Mar 6 02:59:10.390921 systemd[1]: session-32.scope: Deactivated successfully. Mar 6 02:59:10.391595 systemd-logind[1868]: Session 32 logged out. Waiting for processes to exit. Mar 6 02:59:10.392941 systemd-logind[1868]: Removed session 32. Mar 6 02:59:15.465104 systemd[1]: Started sshd@30-10.200.20.34:22-10.200.16.10:44286.service - OpenSSH per-connection server daemon (10.200.16.10:44286). Mar 6 02:59:15.832014 sshd[6977]: Accepted publickey for core from 10.200.16.10 port 44286 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:15.833027 sshd-session[6977]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:15.836657 systemd-logind[1868]: New session 33 of user core. Mar 6 02:59:15.841027 systemd[1]: Started session-33.scope - Session 33 of User core. Mar 6 02:59:16.076792 sshd[6982]: Connection closed by 10.200.16.10 port 44286 Mar 6 02:59:16.077413 sshd-session[6977]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:16.080428 systemd[1]: sshd@30-10.200.20.34:22-10.200.16.10:44286.service: Deactivated successfully. Mar 6 02:59:16.082586 systemd[1]: session-33.scope: Deactivated successfully. Mar 6 02:59:16.083628 systemd-logind[1868]: Session 33 logged out. 
Waiting for processes to exit. Mar 6 02:59:16.085841 systemd-logind[1868]: Removed session 33. Mar 6 02:59:21.165317 systemd[1]: Started sshd@31-10.200.20.34:22-10.200.16.10:36240.service - OpenSSH per-connection server daemon (10.200.16.10:36240). Mar 6 02:59:21.538004 sshd[7016]: Accepted publickey for core from 10.200.16.10 port 36240 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:21.539159 sshd-session[7016]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:21.543317 systemd-logind[1868]: New session 34 of user core. Mar 6 02:59:21.548048 systemd[1]: Started session-34.scope - Session 34 of User core. Mar 6 02:59:21.784256 sshd[7019]: Connection closed by 10.200.16.10 port 36240 Mar 6 02:59:21.784835 sshd-session[7016]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:21.788284 systemd[1]: sshd@31-10.200.20.34:22-10.200.16.10:36240.service: Deactivated successfully. Mar 6 02:59:21.790665 systemd[1]: session-34.scope: Deactivated successfully. Mar 6 02:59:21.792162 systemd-logind[1868]: Session 34 logged out. Waiting for processes to exit. Mar 6 02:59:21.793932 systemd-logind[1868]: Removed session 34. Mar 6 02:59:26.862731 systemd[1]: Started sshd@32-10.200.20.34:22-10.200.16.10:36246.service - OpenSSH per-connection server daemon (10.200.16.10:36246). Mar 6 02:59:27.231795 sshd[7031]: Accepted publickey for core from 10.200.16.10 port 36246 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:27.233167 sshd-session[7031]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:27.238285 systemd-logind[1868]: New session 35 of user core. Mar 6 02:59:27.244067 systemd[1]: Started session-35.scope - Session 35 of User core. 
Mar 6 02:59:27.475799 sshd[7057]: Connection closed by 10.200.16.10 port 36246 Mar 6 02:59:27.476382 sshd-session[7031]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:27.479711 systemd[1]: sshd@32-10.200.20.34:22-10.200.16.10:36246.service: Deactivated successfully. Mar 6 02:59:27.482224 systemd[1]: session-35.scope: Deactivated successfully. Mar 6 02:59:27.483389 systemd-logind[1868]: Session 35 logged out. Waiting for processes to exit. Mar 6 02:59:27.484709 systemd-logind[1868]: Removed session 35.