Mar 17 17:24:55.310827 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Mar 17 17:24:55.310850 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Mon Mar 17 16:05:23 -00 2025 Mar 17 17:24:55.310858 kernel: KASLR enabled Mar 17 17:24:55.310864 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '') Mar 17 17:24:55.310871 kernel: printk: bootconsole [pl11] enabled Mar 17 17:24:55.310877 kernel: efi: EFI v2.7 by EDK II Mar 17 17:24:55.310884 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3ead8b98 RNG=0x3fd5f998 MEMRESERVE=0x3e423d98 Mar 17 17:24:55.310890 kernel: random: crng init done Mar 17 17:24:55.310896 kernel: secureboot: Secure boot disabled Mar 17 17:24:55.310902 kernel: ACPI: Early table checksum verification disabled Mar 17 17:24:55.310908 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL) Mar 17 17:24:55.310914 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310920 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310927 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628) Mar 17 17:24:55.310935 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310941 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310947 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310955 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310961 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310967 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 
VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310973 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000) Mar 17 17:24:55.310979 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310986 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200 Mar 17 17:24:55.310992 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] Mar 17 17:24:55.310998 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] Mar 17 17:24:55.311004 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] Mar 17 17:24:55.311010 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] Mar 17 17:24:55.311016 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] Mar 17 17:24:55.311024 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] Mar 17 17:24:55.311030 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] Mar 17 17:24:55.311036 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] Mar 17 17:24:55.311042 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] Mar 17 17:24:55.311049 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] Mar 17 17:24:55.311055 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] Mar 17 17:24:55.311061 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] Mar 17 17:24:55.311067 kernel: NUMA: NODE_DATA [mem 0x1bf7ee800-0x1bf7f3fff] Mar 17 17:24:55.311073 kernel: Zone ranges: Mar 17 17:24:55.311079 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff] Mar 17 17:24:55.311085 kernel: DMA32 empty Mar 17 17:24:55.311091 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff] Mar 17 17:24:55.311101 kernel: Movable zone start for each node Mar 17 17:24:55.311108 kernel: Early memory node ranges Mar 17 17:24:55.311114 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff] Mar 17 17:24:55.311121 kernel: node 0: [mem 
0x0000000000824000-0x000000003e54ffff] Mar 17 17:24:55.311128 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff] Mar 17 17:24:55.311135 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff] Mar 17 17:24:55.311142 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff] Mar 17 17:24:55.311148 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff] Mar 17 17:24:55.311155 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff] Mar 17 17:24:55.311162 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff] Mar 17 17:24:55.311168 kernel: On node 0, zone DMA: 36 pages in unavailable ranges Mar 17 17:24:55.311175 kernel: psci: probing for conduit method from ACPI. Mar 17 17:24:55.311182 kernel: psci: PSCIv1.1 detected in firmware. Mar 17 17:24:55.311189 kernel: psci: Using standard PSCI v0.2 function IDs Mar 17 17:24:55.311195 kernel: psci: MIGRATE_INFO_TYPE not supported. Mar 17 17:24:55.311202 kernel: psci: SMC Calling Convention v1.4 Mar 17 17:24:55.311209 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0 Mar 17 17:24:55.311217 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0 Mar 17 17:24:55.311235 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 17 17:24:55.311242 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 17 17:24:55.311248 kernel: pcpu-alloc: [0] 0 [0] 1 Mar 17 17:24:55.311255 kernel: Detected PIPT I-cache on CPU0 Mar 17 17:24:55.311262 kernel: CPU features: detected: GIC system register CPU interface Mar 17 17:24:55.311269 kernel: CPU features: detected: Hardware dirty bit management Mar 17 17:24:55.311275 kernel: CPU features: detected: Spectre-BHB Mar 17 17:24:55.311282 kernel: CPU features: kernel page table isolation forced ON by KASLR Mar 17 17:24:55.311289 kernel: CPU features: detected: Kernel page table isolation (KPTI) Mar 17 17:24:55.313336 kernel: CPU features: detected: ARM erratum 1418040 Mar 17 17:24:55.313357 kernel: CPU features: detected: ARM erratum 
1542419 (kernel portion) Mar 17 17:24:55.313364 kernel: CPU features: detected: SSBS not fully self-synchronizing Mar 17 17:24:55.313371 kernel: alternatives: applying boot alternatives Mar 17 17:24:55.313379 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405 Mar 17 17:24:55.313387 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 17 17:24:55.313394 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 17:24:55.313400 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 17:24:55.313407 kernel: Fallback order for Node 0: 0 Mar 17 17:24:55.313413 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1032156 Mar 17 17:24:55.313420 kernel: Policy zone: Normal Mar 17 17:24:55.313427 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 17:24:55.313435 kernel: software IO TLB: area num 2. Mar 17 17:24:55.313442 kernel: software IO TLB: mapped [mem 0x0000000036620000-0x000000003a620000] (64MB) Mar 17 17:24:55.313449 kernel: Memory: 3982372K/4194160K available (10240K kernel code, 2186K rwdata, 8100K rodata, 39744K init, 897K bss, 211788K reserved, 0K cma-reserved) Mar 17 17:24:55.313455 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 17 17:24:55.313462 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 17 17:24:55.313469 kernel: rcu: RCU event tracing is enabled. Mar 17 17:24:55.313476 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. 
Mar 17 17:24:55.313483 kernel: Trampoline variant of Tasks RCU enabled. Mar 17 17:24:55.313490 kernel: Tracing variant of Tasks RCU enabled. Mar 17 17:24:55.313496 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Mar 17 17:24:55.313503 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 17 17:24:55.313511 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 17 17:24:55.313518 kernel: GICv3: 960 SPIs implemented Mar 17 17:24:55.313524 kernel: GICv3: 0 Extended SPIs implemented Mar 17 17:24:55.313531 kernel: Root IRQ handler: gic_handle_irq Mar 17 17:24:55.313537 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Mar 17 17:24:55.313544 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000 Mar 17 17:24:55.313551 kernel: ITS: No ITS available, not enabling LPIs Mar 17 17:24:55.313558 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 17 17:24:55.313564 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:24:55.313571 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Mar 17 17:24:55.313578 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Mar 17 17:24:55.313585 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Mar 17 17:24:55.313593 kernel: Console: colour dummy device 80x25 Mar 17 17:24:55.313600 kernel: printk: console [tty1] enabled Mar 17 17:24:55.313607 kernel: ACPI: Core revision 20230628 Mar 17 17:24:55.313614 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Mar 17 17:24:55.313621 kernel: pid_max: default: 32768 minimum: 301 Mar 17 17:24:55.313628 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 17 17:24:55.313635 kernel: landlock: Up and running. Mar 17 17:24:55.313642 kernel: SELinux: Initializing. 
Mar 17 17:24:55.313649 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:24:55.313657 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:24:55.313664 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 17 17:24:55.313671 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 17 17:24:55.313678 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3a8030, hints 0xe, misc 0x31e1 Mar 17 17:24:55.313685 kernel: Hyper-V: Host Build 10.0.22477.1619-1-0 Mar 17 17:24:55.313692 kernel: Hyper-V: enabling crash_kexec_post_notifiers Mar 17 17:24:55.313699 kernel: rcu: Hierarchical SRCU implementation. Mar 17 17:24:55.313712 kernel: rcu: Max phase no-delay instances is 400. Mar 17 17:24:55.313719 kernel: Remapping and enabling EFI services. Mar 17 17:24:55.313727 kernel: smp: Bringing up secondary CPUs ... Mar 17 17:24:55.313734 kernel: Detected PIPT I-cache on CPU1 Mar 17 17:24:55.313741 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000 Mar 17 17:24:55.313750 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:24:55.313757 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Mar 17 17:24:55.313764 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 17:24:55.313771 kernel: SMP: Total of 2 processors activated. 
Mar 17 17:24:55.313779 kernel: CPU features: detected: 32-bit EL0 Support Mar 17 17:24:55.313787 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence Mar 17 17:24:55.313795 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Mar 17 17:24:55.313802 kernel: CPU features: detected: CRC32 instructions Mar 17 17:24:55.313809 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Mar 17 17:24:55.313816 kernel: CPU features: detected: LSE atomic instructions Mar 17 17:24:55.313824 kernel: CPU features: detected: Privileged Access Never Mar 17 17:24:55.313831 kernel: CPU: All CPU(s) started at EL1 Mar 17 17:24:55.313838 kernel: alternatives: applying system-wide alternatives Mar 17 17:24:55.313845 kernel: devtmpfs: initialized Mar 17 17:24:55.313854 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 17:24:55.313862 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 17 17:24:55.313869 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 17:24:55.313876 kernel: SMBIOS 3.1.0 present. 
Mar 17 17:24:55.313883 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024 Mar 17 17:24:55.313890 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 17:24:55.313898 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 17 17:24:55.313905 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 17 17:24:55.313915 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 17 17:24:55.313922 kernel: audit: initializing netlink subsys (disabled) Mar 17 17:24:55.313929 kernel: audit: type=2000 audit(0.047:1): state=initialized audit_enabled=0 res=1 Mar 17 17:24:55.313937 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 17:24:55.313944 kernel: cpuidle: using governor menu Mar 17 17:24:55.313951 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 17 17:24:55.313958 kernel: ASID allocator initialised with 32768 entries Mar 17 17:24:55.313965 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 17:24:55.313973 kernel: Serial: AMBA PL011 UART driver Mar 17 17:24:55.313981 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Mar 17 17:24:55.313988 kernel: Modules: 0 pages in range for non-PLT usage Mar 17 17:24:55.313995 kernel: Modules: 508944 pages in range for PLT usage Mar 17 17:24:55.314003 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 17:24:55.314010 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 17 17:24:55.314017 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 17 17:24:55.314024 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 17 17:24:55.314032 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 17:24:55.314039 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 17 17:24:55.314048 kernel: HugeTLB: 
registered 64.0 KiB page size, pre-allocated 0 pages Mar 17 17:24:55.314055 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 17 17:24:55.314062 kernel: ACPI: Added _OSI(Module Device) Mar 17 17:24:55.314069 kernel: ACPI: Added _OSI(Processor Device) Mar 17 17:24:55.314076 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 17:24:55.314084 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 17:24:55.314091 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 17:24:55.314098 kernel: ACPI: Interpreter enabled Mar 17 17:24:55.314106 kernel: ACPI: Using GIC for interrupt routing Mar 17 17:24:55.314113 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA Mar 17 17:24:55.314122 kernel: printk: console [ttyAMA0] enabled Mar 17 17:24:55.314129 kernel: printk: bootconsole [pl11] disabled Mar 17 17:24:55.314136 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA Mar 17 17:24:55.314143 kernel: iommu: Default domain type: Translated Mar 17 17:24:55.314150 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 17 17:24:55.314157 kernel: efivars: Registered efivars operations Mar 17 17:24:55.314164 kernel: vgaarb: loaded Mar 17 17:24:55.314172 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 17 17:24:55.314179 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 17:24:55.314188 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 17:24:55.314195 kernel: pnp: PnP ACPI init Mar 17 17:24:55.314202 kernel: pnp: PnP ACPI: found 0 devices Mar 17 17:24:55.314210 kernel: NET: Registered PF_INET protocol family Mar 17 17:24:55.314217 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 17 17:24:55.314224 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 17 17:24:55.314232 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 
17:24:55.314239 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 17 17:24:55.314247 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 17 17:24:55.314255 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 17 17:24:55.314262 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 17:24:55.314269 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 17:24:55.314277 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 17:24:55.314284 kernel: PCI: CLS 0 bytes, default 64 Mar 17 17:24:55.314291 kernel: kvm [1]: HYP mode not available Mar 17 17:24:55.314309 kernel: Initialise system trusted keyrings Mar 17 17:24:55.314316 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 17 17:24:55.314326 kernel: Key type asymmetric registered Mar 17 17:24:55.314333 kernel: Asymmetric key parser 'x509' registered Mar 17 17:24:55.314340 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 17 17:24:55.314347 kernel: io scheduler mq-deadline registered Mar 17 17:24:55.314354 kernel: io scheduler kyber registered Mar 17 17:24:55.314361 kernel: io scheduler bfq registered Mar 17 17:24:55.314369 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 17:24:55.314376 kernel: thunder_xcv, ver 1.0 Mar 17 17:24:55.314383 kernel: thunder_bgx, ver 1.0 Mar 17 17:24:55.314390 kernel: nicpf, ver 1.0 Mar 17 17:24:55.314399 kernel: nicvf, ver 1.0 Mar 17 17:24:55.314546 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 17 17:24:55.314617 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-17T17:24:54 UTC (1742232294) Mar 17 17:24:55.314628 kernel: efifb: probing for efifb Mar 17 17:24:55.314635 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k Mar 17 17:24:55.314642 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 Mar 17 17:24:55.314650 kernel: efifb: scrolling: 
redraw Mar 17 17:24:55.314659 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 17 17:24:55.314666 kernel: Console: switching to colour frame buffer device 128x48 Mar 17 17:24:55.314673 kernel: fb0: EFI VGA frame buffer device Mar 17 17:24:55.314680 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping .... Mar 17 17:24:55.314688 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 17 17:24:55.314695 kernel: No ACPI PMU IRQ for CPU0 Mar 17 17:24:55.314702 kernel: No ACPI PMU IRQ for CPU1 Mar 17 17:24:55.314709 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 1 counters available Mar 17 17:24:55.314717 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 17 17:24:55.314725 kernel: watchdog: Hard watchdog permanently disabled Mar 17 17:24:55.314733 kernel: NET: Registered PF_INET6 protocol family Mar 17 17:24:55.314740 kernel: Segment Routing with IPv6 Mar 17 17:24:55.314747 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 17:24:55.314754 kernel: NET: Registered PF_PACKET protocol family Mar 17 17:24:55.314761 kernel: Key type dns_resolver registered Mar 17 17:24:55.314768 kernel: registered taskstats version 1 Mar 17 17:24:55.314775 kernel: Loading compiled-in X.509 certificates Mar 17 17:24:55.314783 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 74c9b4f5dfad711856d7363c976664fc02c1e24c' Mar 17 17:24:55.314790 kernel: Key type .fscrypt registered Mar 17 17:24:55.314798 kernel: Key type fscrypt-provisioning registered Mar 17 17:24:55.314805 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 17:24:55.314813 kernel: ima: Allocated hash algorithm: sha1 Mar 17 17:24:55.314820 kernel: ima: No architecture policies found Mar 17 17:24:55.314827 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 17 17:24:55.314834 kernel: clk: Disabling unused clocks Mar 17 17:24:55.314841 kernel: Freeing unused kernel memory: 39744K Mar 17 17:24:55.314849 kernel: Run /init as init process Mar 17 17:24:55.314857 kernel: with arguments: Mar 17 17:24:55.314865 kernel: /init Mar 17 17:24:55.314871 kernel: with environment: Mar 17 17:24:55.314878 kernel: HOME=/ Mar 17 17:24:55.314885 kernel: TERM=linux Mar 17 17:24:55.314892 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 17:24:55.314902 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 17 17:24:55.314911 systemd[1]: Detected virtualization microsoft. Mar 17 17:24:55.314920 systemd[1]: Detected architecture arm64. Mar 17 17:24:55.314928 systemd[1]: Running in initrd. Mar 17 17:24:55.314935 systemd[1]: No hostname configured, using default hostname. Mar 17 17:24:55.314943 systemd[1]: Hostname set to . Mar 17 17:24:55.314951 systemd[1]: Initializing machine ID from random generator. Mar 17 17:24:55.314958 systemd[1]: Queued start job for default target initrd.target. Mar 17 17:24:55.314966 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:24:55.314974 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:24:55.314983 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Mar 17 17:24:55.314991 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 17:24:55.314999 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 17 17:24:55.315007 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 17 17:24:55.315016 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 17 17:24:55.315024 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 17 17:24:55.315032 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:24:55.315041 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:24:55.315048 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:24:55.315056 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:24:55.315064 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:24:55.315071 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:24:55.315079 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:24:55.315086 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:24:55.315094 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 17 17:24:55.315102 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 17 17:24:55.315111 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:24:55.315119 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:24:55.315127 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:24:55.315134 systemd[1]: Reached target sockets.target - Socket Units. 
Mar 17 17:24:55.315142 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 17 17:24:55.315150 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:24:55.315158 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 17 17:24:55.315165 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 17:24:55.315175 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:24:55.315182 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:24:55.315209 systemd-journald[218]: Collecting audit messages is disabled. Mar 17 17:24:55.315229 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:24:55.315239 systemd-journald[218]: Journal started Mar 17 17:24:55.315257 systemd-journald[218]: Runtime Journal (/run/log/journal/512df49dcb784633a071ef45a5cc2063) is 8.0M, max 78.5M, 70.5M free. Mar 17 17:24:55.328371 systemd-modules-load[219]: Inserted module 'overlay' Mar 17 17:24:55.353394 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:24:55.353425 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 17:24:55.363422 systemd-modules-load[219]: Inserted module 'br_netfilter' Mar 17 17:24:55.368700 kernel: Bridge firewalling registered Mar 17 17:24:55.367910 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 17 17:24:55.375251 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:24:55.394314 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 17:24:55.404916 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:24:55.414750 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 17 17:24:55.436707 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:24:55.445476 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:24:55.467461 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:24:55.490468 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:24:55.497986 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:24:55.506725 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:24:55.518071 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:24:55.531328 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:24:55.559626 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 17 17:24:55.574175 dracut-cmdline[252]: dracut-dracut-053 Mar 17 17:24:55.576933 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 17 17:24:55.586183 dracut-cmdline[252]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405 Mar 17 17:24:55.624576 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:24:55.646272 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Mar 17 17:24:55.663257 systemd-resolved[260]: Positive Trust Anchors: Mar 17 17:24:55.663271 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:24:55.663392 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:24:55.666543 systemd-resolved[260]: Defaulting to hostname 'linux'. Mar 17 17:24:55.667449 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:24:55.683147 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:24:55.774310 kernel: SCSI subsystem initialized Mar 17 17:24:55.781315 kernel: Loading iSCSI transport class v2.0-870. Mar 17 17:24:55.792328 kernel: iscsi: registered transport (tcp) Mar 17 17:24:55.809599 kernel: iscsi: registered transport (qla4xxx) Mar 17 17:24:55.809660 kernel: QLogic iSCSI HBA Driver Mar 17 17:24:55.843194 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 17 17:24:55.859605 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 17 17:24:55.892440 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 17 17:24:55.892507 kernel: device-mapper: uevent: version 1.0.3 Mar 17 17:24:55.898943 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 17 17:24:55.948325 kernel: raid6: neonx8 gen() 15747 MB/s Mar 17 17:24:55.968315 kernel: raid6: neonx4 gen() 15653 MB/s Mar 17 17:24:55.988305 kernel: raid6: neonx2 gen() 13196 MB/s Mar 17 17:24:56.009307 kernel: raid6: neonx1 gen() 10483 MB/s Mar 17 17:24:56.029306 kernel: raid6: int64x8 gen() 6956 MB/s Mar 17 17:24:56.050306 kernel: raid6: int64x4 gen() 7337 MB/s Mar 17 17:24:56.071306 kernel: raid6: int64x2 gen() 6125 MB/s Mar 17 17:24:56.094783 kernel: raid6: int64x1 gen() 5059 MB/s Mar 17 17:24:56.094806 kernel: raid6: using algorithm neonx8 gen() 15747 MB/s Mar 17 17:24:56.118299 kernel: raid6: .... xor() 11919 MB/s, rmw enabled Mar 17 17:24:56.118314 kernel: raid6: using neon recovery algorithm Mar 17 17:24:56.127308 kernel: xor: measuring software checksum speed Mar 17 17:24:56.133786 kernel: 8regs : 18637 MB/sec Mar 17 17:24:56.133798 kernel: 32regs : 19679 MB/sec Mar 17 17:24:56.137134 kernel: arm64_neon : 26874 MB/sec Mar 17 17:24:56.141697 kernel: xor: using function: arm64_neon (26874 MB/sec) Mar 17 17:24:56.192310 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 17 17:24:56.203580 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:24:56.226463 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:24:56.248262 systemd-udevd[438]: Using default interface naming scheme 'v255'. Mar 17 17:24:56.253608 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:24:56.274519 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 17 17:24:56.286607 dracut-pre-trigger[449]: rd.md=0: removing MD RAID activation Mar 17 17:24:56.314734 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Mar 17 17:24:56.333911 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 17:24:56.372784 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:24:56.391507 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 17 17:24:56.417615 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 17 17:24:56.432526 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 17 17:24:56.448747 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:24:56.456506 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 17:24:56.486589 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 17 17:24:56.504610 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 17 17:24:56.516797 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:24:56.563972 kernel: hv_vmbus: Vmbus version:5.3 Mar 17 17:24:56.563997 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 17 17:24:56.564007 kernel: hv_vmbus: registering driver hid_hyperv Mar 17 17:24:56.564017 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 17 17:24:56.516967 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:24:56.586873 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0 Mar 17 17:24:56.531772 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Mar 17 17:24:56.619085 kernel: hv_vmbus: registering driver hyperv_keyboard
Mar 17 17:24:56.619113 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Mar 17 17:24:56.619331 kernel: hv_vmbus: registering driver hv_netvsc
Mar 17 17:24:56.619348 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1
Mar 17 17:24:56.540399 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:24:56.651556 kernel: hv_vmbus: registering driver hv_storvsc
Mar 17 17:24:56.540599 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:24:56.699290 kernel: PTP clock support registered
Mar 17 17:24:56.699333 kernel: hv_utils: Registering HyperV Utility Driver
Mar 17 17:24:56.699343 kernel: hv_vmbus: registering driver hv_utils
Mar 17 17:24:56.699352 kernel: scsi host0: storvsc_host_t
Mar 17 17:24:56.988943 kernel: scsi host1: storvsc_host_t
Mar 17 17:24:56.989101 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Mar 17 17:24:56.989199 kernel: hv_utils: Heartbeat IC version 3.0
Mar 17 17:24:56.989209 kernel: hv_utils: TimeSync IC version 4.0
Mar 17 17:24:56.989218 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 0
Mar 17 17:24:56.989313 kernel: hv_utils: Shutdown IC version 3.2
Mar 17 17:24:56.570275 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:24:56.662576 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:24:56.711153 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:24:56.711322 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:24:56.988539 systemd-resolved[260]: Clock change detected. Flushing caches.
Mar 17 17:24:57.052836 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Mar 17 17:24:57.059081 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: VF slot 1 added
Mar 17 17:24:57.059205 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 17 17:24:57.059216 kernel: hv_vmbus: registering driver hv_pci
Mar 17 17:24:57.059226 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Mar 17 17:24:57.023527 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:24:57.073582 kernel: hv_pci a59db578-9ebc-47e7-8fc5-4c7bcb135f35: PCI VMBus probing: Using version 0x10004
Mar 17 17:24:57.173092 kernel: hv_pci a59db578-9ebc-47e7-8fc5-4c7bcb135f35: PCI host bridge to bus 9ebc:00
Mar 17 17:24:57.173203 kernel: pci_bus 9ebc:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Mar 17 17:24:57.173296 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Mar 17 17:24:57.173393 kernel: pci_bus 9ebc:00: No busn resource found for root bus, will use [bus 00-ff]
Mar 17 17:24:57.173474 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 17 17:24:57.173555 kernel: pci 9ebc:00:02.0: [15b3:1018] type 00 class 0x020000
Mar 17 17:24:57.173648 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 17 17:24:57.173730 kernel: pci 9ebc:00:02.0: reg 0x10: [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 17 17:24:57.173829 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Mar 17 17:24:57.173915 kernel: pci 9ebc:00:02.0: enabling Extended Tags
Mar 17 17:24:57.173996 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Mar 17 17:24:57.174081 kernel: pci 9ebc:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 9ebc:00:02.0 (capable of 126.016 Gb/s with 8.0 GT/s PCIe x16 link)
Mar 17 17:24:57.174161 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:24:57.174171 kernel: pci_bus 9ebc:00: busn_res: [bus 00-ff] end is updated to 00
Mar 17 17:24:57.174245 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 17 17:24:57.174325 kernel: pci 9ebc:00:02.0: BAR 0: assigned [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 17 17:24:57.146964 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:24:57.172003 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:24:57.216754 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:24:57.241374 kernel: mlx5_core 9ebc:00:02.0: enabling device (0000 -> 0002)
Mar 17 17:24:57.450688 kernel: mlx5_core 9ebc:00:02.0: firmware version: 16.30.1284
Mar 17 17:24:57.450854 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: VF registering: eth1
Mar 17 17:24:57.450953 kernel: mlx5_core 9ebc:00:02.0 eth1: joined to eth0
Mar 17 17:24:57.451046 kernel: mlx5_core 9ebc:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Mar 17 17:24:57.459842 kernel: mlx5_core 9ebc:00:02.0 enP40636s1: renamed from eth1
Mar 17 17:24:57.744405 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Mar 17 17:24:57.856838 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (482)
Mar 17 17:24:57.872409 kernel: BTRFS: device fsid c0c482e3-6885-4a4e-b31c-6bc8f8c403e7 devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (496)
Mar 17 17:24:57.873779 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Mar 17 17:24:57.893636 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 17 17:24:57.910172 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Mar 17 17:24:57.917890 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Mar 17 17:24:57.947053 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 17 17:24:57.973220 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:24:57.980828 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:24:58.989941 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:24:58.990002 disk-uuid[605]: The operation has completed successfully.
Mar 17 17:24:59.048488 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 17 17:24:59.049827 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 17 17:24:59.078008 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 17 17:24:59.091391 sh[691]: Success
Mar 17 17:24:59.136929 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 17 17:24:59.335167 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 17 17:24:59.357949 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 17 17:24:59.367461 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 17 17:24:59.400493 kernel: BTRFS info (device dm-0): first mount of filesystem c0c482e3-6885-4a4e-b31c-6bc8f8c403e7
Mar 17 17:24:59.400552 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:24:59.400563 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 17 17:24:59.411647 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 17 17:24:59.415626 kernel: BTRFS info (device dm-0): using free space tree
Mar 17 17:24:59.806324 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 17 17:24:59.811454 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 17 17:24:59.832120 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 17 17:24:59.840988 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 17 17:24:59.875125 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:24:59.875177 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:24:59.879541 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:24:59.902834 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:24:59.911438 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 17 17:24:59.922508 kernel: BTRFS info (device sda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:24:59.927494 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 17 17:24:59.943976 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 17 17:24:59.952466 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:24:59.970348 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:25:00.011004 systemd-networkd[875]: lo: Link UP
Mar 17 17:25:00.011014 systemd-networkd[875]: lo: Gained carrier
Mar 17 17:25:00.012564 systemd-networkd[875]: Enumeration completed
Mar 17 17:25:00.015909 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:25:00.016628 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:25:00.016631 systemd-networkd[875]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:25:00.022390 systemd[1]: Reached target network.target - Network.
Mar 17 17:25:00.107820 kernel: mlx5_core 9ebc:00:02.0 enP40636s1: Link up
Mar 17 17:25:00.144828 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: Data path switched to VF: enP40636s1
Mar 17 17:25:00.145321 systemd-networkd[875]: enP40636s1: Link UP
Mar 17 17:25:00.145413 systemd-networkd[875]: eth0: Link UP
Mar 17 17:25:00.145538 systemd-networkd[875]: eth0: Gained carrier
Mar 17 17:25:00.145547 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:25:00.168282 systemd-networkd[875]: enP40636s1: Gained carrier
Mar 17 17:25:00.182863 systemd-networkd[875]: eth0: DHCPv4 address 10.200.20.36/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 17 17:25:00.889612 ignition[873]: Ignition 2.20.0
Mar 17 17:25:00.889623 ignition[873]: Stage: fetch-offline
Mar 17 17:25:00.893269 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:25:00.889657 ignition[873]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:00.889664 ignition[873]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:00.889764 ignition[873]: parsed url from cmdline: ""
Mar 17 17:25:00.889768 ignition[873]: no config URL provided
Mar 17 17:25:00.889772 ignition[873]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 17:25:00.889779 ignition[873]: no config at "/usr/lib/ignition/user.ign"
Mar 17 17:25:00.925179 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 17 17:25:00.889784 ignition[873]: failed to fetch config: resource requires networking
Mar 17 17:25:00.889973 ignition[873]: Ignition finished successfully
Mar 17 17:25:00.945675 ignition[885]: Ignition 2.20.0
Mar 17 17:25:00.945681 ignition[885]: Stage: fetch
Mar 17 17:25:00.946039 ignition[885]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:00.946049 ignition[885]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:00.946162 ignition[885]: parsed url from cmdline: ""
Mar 17 17:25:00.946174 ignition[885]: no config URL provided
Mar 17 17:25:00.946179 ignition[885]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 17:25:00.946186 ignition[885]: no config at "/usr/lib/ignition/user.ign"
Mar 17 17:25:00.946211 ignition[885]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Mar 17 17:25:01.046877 ignition[885]: GET result: OK
Mar 17 17:25:01.047006 ignition[885]: config has been read from IMDS userdata
Mar 17 17:25:01.047082 ignition[885]: parsing config with SHA512: 113cefce37b638a5822866ff76285cc272a1115be0031e6af3ae3213bdd9948559c4af48b465f163a121afcf37ba30c599de216d6eb4b8c3b1826db2575835ba
Mar 17 17:25:01.051962 unknown[885]: fetched base config from "system"
Mar 17 17:25:01.052428 ignition[885]: fetch: fetch complete
Mar 17 17:25:01.051989 unknown[885]: fetched base config from "system"
Mar 17 17:25:01.052447 ignition[885]: fetch: fetch passed
Mar 17 17:25:01.051996 unknown[885]: fetched user config from "azure"
Mar 17 17:25:01.052496 ignition[885]: Ignition finished successfully
Mar 17 17:25:01.057127 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 17 17:25:01.095673 ignition[891]: Ignition 2.20.0
Mar 17 17:25:01.074972 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 17 17:25:01.095680 ignition[891]: Stage: kargs
Mar 17 17:25:01.103444 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 17 17:25:01.095897 ignition[891]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:01.095907 ignition[891]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:01.127110 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 17 17:25:01.096978 ignition[891]: kargs: kargs passed
Mar 17 17:25:01.097035 ignition[891]: Ignition finished successfully
Mar 17 17:25:01.153889 ignition[897]: Ignition 2.20.0
Mar 17 17:25:01.156947 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 17 17:25:01.153896 ignition[897]: Stage: disks
Mar 17 17:25:01.163162 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 17 17:25:01.154076 ignition[897]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:01.172694 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 17 17:25:01.154086 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:01.184768 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:25:01.154982 ignition[897]: disks: disks passed
Mar 17 17:25:01.193252 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 17:25:01.155028 ignition[897]: Ignition finished successfully
Mar 17 17:25:01.204372 systemd[1]: Reached target basic.target - Basic System.
Mar 17 17:25:01.231045 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 17 17:25:01.315755 systemd-fsck[907]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Mar 17 17:25:01.325355 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 17 17:25:01.341036 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 17 17:25:01.362962 systemd-networkd[875]: eth0: Gained IPv6LL
Mar 17 17:25:01.405878 kernel: EXT4-fs (sda9): mounted filesystem 6b579bf2-7716-4d59-98eb-b92ea668693e r/w with ordered data mode. Quota mode: none.
Mar 17 17:25:01.406420 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 17 17:25:01.411130 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:25:01.465886 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:25:01.475424 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 17 17:25:01.483007 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 17 17:25:01.520211 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (918)
Mar 17 17:25:01.520236 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:25:01.500909 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 17 17:25:01.537916 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:25:01.500951 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:25:01.553339 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:25:01.534685 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 17 17:25:01.560080 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 17 17:25:01.575167 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:25:01.575065 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:25:02.066988 systemd-networkd[875]: enP40636s1: Gained IPv6LL
Mar 17 17:25:02.167698 coreos-metadata[920]: Mar 17 17:25:02.167 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 17 17:25:02.177306 coreos-metadata[920]: Mar 17 17:25:02.177 INFO Fetch successful
Mar 17 17:25:02.182671 coreos-metadata[920]: Mar 17 17:25:02.182 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Mar 17 17:25:02.204481 coreos-metadata[920]: Mar 17 17:25:02.204 INFO Fetch successful
Mar 17 17:25:02.218649 coreos-metadata[920]: Mar 17 17:25:02.218 INFO wrote hostname ci-4152.2.2-a-f9f073f8c6 to /sysroot/etc/hostname
Mar 17 17:25:02.227901 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 17 17:25:02.493901 initrd-setup-root[948]: cut: /sysroot/etc/passwd: No such file or directory
Mar 17 17:25:02.531847 initrd-setup-root[955]: cut: /sysroot/etc/group: No such file or directory
Mar 17 17:25:02.551611 initrd-setup-root[962]: cut: /sysroot/etc/shadow: No such file or directory
Mar 17 17:25:02.576142 initrd-setup-root[969]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 17 17:25:03.503953 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 17 17:25:03.521109 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 17 17:25:03.529982 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 17 17:25:03.547399 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 17 17:25:03.555818 kernel: BTRFS info (device sda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:25:03.572714 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 17 17:25:03.585500 ignition[1037]: INFO : Ignition 2.20.0
Mar 17 17:25:03.585500 ignition[1037]: INFO : Stage: mount
Mar 17 17:25:03.594338 ignition[1037]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:03.594338 ignition[1037]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:03.594338 ignition[1037]: INFO : mount: mount passed
Mar 17 17:25:03.594338 ignition[1037]: INFO : Ignition finished successfully
Mar 17 17:25:03.590893 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 17 17:25:03.615025 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 17 17:25:03.635171 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:25:03.656837 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1049)
Mar 17 17:25:03.669402 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:25:03.669422 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:25:03.673495 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:25:03.680829 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:25:03.681441 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:25:03.709851 ignition[1066]: INFO : Ignition 2.20.0
Mar 17 17:25:03.709851 ignition[1066]: INFO : Stage: files
Mar 17 17:25:03.719090 ignition[1066]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:03.719090 ignition[1066]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:03.719090 ignition[1066]: DEBUG : files: compiled without relabeling support, skipping
Mar 17 17:25:03.719090 ignition[1066]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 17 17:25:03.719090 ignition[1066]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 17 17:25:03.803163 ignition[1066]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 17 17:25:03.810703 ignition[1066]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 17 17:25:03.810703 ignition[1066]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 17 17:25:03.803559 unknown[1066]: wrote ssh authorized keys file for user: core
Mar 17 17:25:03.861348 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:25:03.872133 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Mar 17 17:25:03.898497 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 17 17:25:04.022136 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-arm64.raw: attempt #1
Mar 17 17:25:04.471288 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 17 17:25:04.714998 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:25:04.714998 ignition[1066]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 17 17:25:04.748092 ignition[1066]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: files passed
Mar 17 17:25:04.760243 ignition[1066]: INFO : Ignition finished successfully
Mar 17 17:25:04.760122 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 17 17:25:04.818159 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 17 17:25:04.831020 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 17 17:25:04.841342 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 17 17:25:04.847763 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 17 17:25:04.879845 initrd-setup-root-after-ignition[1099]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:25:04.887746 initrd-setup-root-after-ignition[1095]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:25:04.887746 initrd-setup-root-after-ignition[1095]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:25:04.879950 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:25:04.894369 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 17 17:25:04.929087 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 17 17:25:04.963919 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 17 17:25:04.964037 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 17 17:25:04.977337 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 17 17:25:04.989734 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 17 17:25:05.000371 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 17 17:25:05.015089 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 17 17:25:05.035879 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:25:05.059144 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 17 17:25:05.076511 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:25:05.083679 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:25:05.096224 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:25:05.107343 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:25:05.107517 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:25:05.123704 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:25:05.135994 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:25:05.147087 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:25:05.157922 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:25:05.170662 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 17 17:25:05.183318 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 17 17:25:05.195513 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:25:05.208381 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 17 17:25:05.221763 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 17 17:25:05.233047 systemd[1]: Stopped target swap.target - Swaps.
Mar 17 17:25:05.243118 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 17 17:25:05.243289 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:25:05.258945 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:25:05.271003 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:25:05.283821 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 17 17:25:05.283934 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:25:05.298728 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 17 17:25:05.298922 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:25:05.317769 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 17 17:25:05.317954 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:25:05.330483 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 17 17:25:05.330633 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 17 17:25:05.342168 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 17 17:25:05.342322 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 17 17:25:05.375952 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 17 17:25:05.406902 ignition[1119]: INFO : Ignition 2.20.0
Mar 17 17:25:05.406902 ignition[1119]: INFO : Stage: umount
Mar 17 17:25:05.406902 ignition[1119]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:05.406902 ignition[1119]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:05.406902 ignition[1119]: INFO : umount: umount passed
Mar 17 17:25:05.406902 ignition[1119]: INFO : Ignition finished successfully
Mar 17 17:25:05.393339 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 17 17:25:05.393527 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:25:05.404043 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 17 17:25:05.413478 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 17 17:25:05.413641 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:25:05.423975 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 17 17:25:05.424084 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:25:05.440861 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 17 17:25:05.440952 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 17 17:25:05.459918 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 17 17:25:05.460057 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 17 17:25:05.476652 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 17 17:25:05.476714 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 17 17:25:05.488794 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 17 17:25:05.488861 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 17 17:25:05.500288 systemd[1]: Stopped target network.target - Network.
Mar 17 17:25:05.505895 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 17 17:25:05.505974 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:25:05.519726 systemd[1]: Stopped target paths.target - Path Units.
Mar 17 17:25:05.536536 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 17 17:25:05.542241 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:25:05.550501 systemd[1]: Stopped target slices.target - Slice Units.
Mar 17 17:25:05.560813 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 17 17:25:05.570873 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 17 17:25:05.570945 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:25:05.581569 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 17 17:25:05.581615 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:25:05.592698 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 17 17:25:05.592757 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 17 17:25:05.598288 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 17 17:25:05.598332 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 17 17:25:05.609285 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 17 17:25:05.620346 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 17 17:25:05.636070 systemd-networkd[875]: eth0: DHCPv6 lease lost Mar 17 17:25:05.639179 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 17:25:05.639849 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 17 17:25:05.834974 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: Data path switched from VF: enP40636s1 Mar 17 17:25:05.639942 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 17 17:25:05.648569 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 17:25:05.650745 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 17 17:25:05.661397 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 17:25:05.661599 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 17 17:25:05.676326 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 17:25:05.676385 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:25:05.704021 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 17 17:25:05.712950 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 17:25:05.713030 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:25:05.725186 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 17 17:25:05.725246 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:25:05.736513 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 17:25:05.736591 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 17 17:25:05.747074 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Mar 17 17:25:05.747122 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:25:05.758891 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:25:05.810647 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 17:25:05.810795 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:25:05.831464 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 17:25:05.831523 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 17 17:25:05.840816 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 17:25:05.840865 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:25:05.853025 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 17:25:05.853082 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:25:05.869365 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 17:25:05.869425 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 17 17:25:05.886233 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:25:05.886300 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:25:05.931988 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 17 17:25:05.944884 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 17 17:25:05.944956 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:25:05.958639 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:25:05.958699 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:25:05.972223 systemd[1]: network-cleanup.service: Deactivated successfully. 
Mar 17 17:25:05.972320 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 17 17:25:05.984396 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 17:25:05.984487 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 17 17:25:06.080640 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 17:25:06.080758 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 17 17:25:06.089690 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 17 17:25:06.100041 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 17 17:25:06.100104 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 17 17:25:06.127056 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 17 17:25:06.139167 systemd[1]: Switching root. Mar 17 17:25:06.243542 systemd-journald[218]: Journal stopped Mar 17 17:24:55.310827 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Mar 17 17:24:55.310850 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Mon Mar 17 16:05:23 -00 2025 Mar 17 17:24:55.310858 kernel: KASLR enabled Mar 17 17:24:55.310864 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '') Mar 17 17:24:55.310871 kernel: printk: bootconsole [pl11] enabled Mar 17 17:24:55.310877 kernel: efi: EFI v2.7 by EDK II Mar 17 17:24:55.310884 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3ead8b98 RNG=0x3fd5f998 MEMRESERVE=0x3e423d98 Mar 17 17:24:55.310890 kernel: random: crng init done Mar 17 17:24:55.310896 kernel: secureboot: Secure boot disabled Mar 17 17:24:55.310902 kernel: ACPI: Early table checksum verification disabled Mar 17 17:24:55.310908 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL) Mar 17 17:24:55.310914 kernel: ACPI: XSDT 
0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310920 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310927 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628) Mar 17 17:24:55.310935 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310941 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310947 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310955 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310961 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310967 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310973 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000) Mar 17 17:24:55.310979 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:24:55.310986 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200 Mar 17 17:24:55.310992 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] Mar 17 17:24:55.310998 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] Mar 17 17:24:55.311004 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] Mar 17 17:24:55.311010 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] Mar 17 17:24:55.311016 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] Mar 17 17:24:55.311024 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] Mar 17 17:24:55.311030 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] Mar 17 17:24:55.311036 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] Mar 17 
17:24:55.311042 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] Mar 17 17:24:55.311049 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] Mar 17 17:24:55.311055 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] Mar 17 17:24:55.311061 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] Mar 17 17:24:55.311067 kernel: NUMA: NODE_DATA [mem 0x1bf7ee800-0x1bf7f3fff] Mar 17 17:24:55.311073 kernel: Zone ranges: Mar 17 17:24:55.311079 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff] Mar 17 17:24:55.311085 kernel: DMA32 empty Mar 17 17:24:55.311091 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff] Mar 17 17:24:55.311101 kernel: Movable zone start for each node Mar 17 17:24:55.311108 kernel: Early memory node ranges Mar 17 17:24:55.311114 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff] Mar 17 17:24:55.311121 kernel: node 0: [mem 0x0000000000824000-0x000000003e54ffff] Mar 17 17:24:55.311128 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff] Mar 17 17:24:55.311135 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff] Mar 17 17:24:55.311142 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff] Mar 17 17:24:55.311148 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff] Mar 17 17:24:55.311155 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff] Mar 17 17:24:55.311162 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff] Mar 17 17:24:55.311168 kernel: On node 0, zone DMA: 36 pages in unavailable ranges Mar 17 17:24:55.311175 kernel: psci: probing for conduit method from ACPI. Mar 17 17:24:55.311182 kernel: psci: PSCIv1.1 detected in firmware. Mar 17 17:24:55.311189 kernel: psci: Using standard PSCI v0.2 function IDs Mar 17 17:24:55.311195 kernel: psci: MIGRATE_INFO_TYPE not supported. 
Mar 17 17:24:55.311202 kernel: psci: SMC Calling Convention v1.4 Mar 17 17:24:55.311209 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0 Mar 17 17:24:55.311217 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0 Mar 17 17:24:55.311235 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 17 17:24:55.311242 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 17 17:24:55.311248 kernel: pcpu-alloc: [0] 0 [0] 1 Mar 17 17:24:55.311255 kernel: Detected PIPT I-cache on CPU0 Mar 17 17:24:55.311262 kernel: CPU features: detected: GIC system register CPU interface Mar 17 17:24:55.311269 kernel: CPU features: detected: Hardware dirty bit management Mar 17 17:24:55.311275 kernel: CPU features: detected: Spectre-BHB Mar 17 17:24:55.311282 kernel: CPU features: kernel page table isolation forced ON by KASLR Mar 17 17:24:55.311289 kernel: CPU features: detected: Kernel page table isolation (KPTI) Mar 17 17:24:55.313336 kernel: CPU features: detected: ARM erratum 1418040 Mar 17 17:24:55.313357 kernel: CPU features: detected: ARM erratum 1542419 (kernel portion) Mar 17 17:24:55.313364 kernel: CPU features: detected: SSBS not fully self-synchronizing Mar 17 17:24:55.313371 kernel: alternatives: applying boot alternatives Mar 17 17:24:55.313379 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405 Mar 17 17:24:55.313387 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Mar 17 17:24:55.313394 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 17:24:55.313400 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 17:24:55.313407 kernel: Fallback order for Node 0: 0 Mar 17 17:24:55.313413 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1032156 Mar 17 17:24:55.313420 kernel: Policy zone: Normal Mar 17 17:24:55.313427 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 17:24:55.313435 kernel: software IO TLB: area num 2. Mar 17 17:24:55.313442 kernel: software IO TLB: mapped [mem 0x0000000036620000-0x000000003a620000] (64MB) Mar 17 17:24:55.313449 kernel: Memory: 3982372K/4194160K available (10240K kernel code, 2186K rwdata, 8100K rodata, 39744K init, 897K bss, 211788K reserved, 0K cma-reserved) Mar 17 17:24:55.313455 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 17 17:24:55.313462 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 17 17:24:55.313469 kernel: rcu: RCU event tracing is enabled. Mar 17 17:24:55.313476 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 17 17:24:55.313483 kernel: Trampoline variant of Tasks RCU enabled. Mar 17 17:24:55.313490 kernel: Tracing variant of Tasks RCU enabled. Mar 17 17:24:55.313496 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 17 17:24:55.313503 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 17 17:24:55.313511 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 17 17:24:55.313518 kernel: GICv3: 960 SPIs implemented Mar 17 17:24:55.313524 kernel: GICv3: 0 Extended SPIs implemented Mar 17 17:24:55.313531 kernel: Root IRQ handler: gic_handle_irq Mar 17 17:24:55.313537 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Mar 17 17:24:55.313544 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000 Mar 17 17:24:55.313551 kernel: ITS: No ITS available, not enabling LPIs Mar 17 17:24:55.313558 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 17 17:24:55.313564 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:24:55.313571 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Mar 17 17:24:55.313578 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Mar 17 17:24:55.313585 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Mar 17 17:24:55.313593 kernel: Console: colour dummy device 80x25 Mar 17 17:24:55.313600 kernel: printk: console [tty1] enabled Mar 17 17:24:55.313607 kernel: ACPI: Core revision 20230628 Mar 17 17:24:55.313614 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Mar 17 17:24:55.313621 kernel: pid_max: default: 32768 minimum: 301 Mar 17 17:24:55.313628 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 17 17:24:55.313635 kernel: landlock: Up and running. Mar 17 17:24:55.313642 kernel: SELinux: Initializing. 
Mar 17 17:24:55.313649 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:24:55.313657 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:24:55.313664 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 17 17:24:55.313671 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 17 17:24:55.313678 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3a8030, hints 0xe, misc 0x31e1 Mar 17 17:24:55.313685 kernel: Hyper-V: Host Build 10.0.22477.1619-1-0 Mar 17 17:24:55.313692 kernel: Hyper-V: enabling crash_kexec_post_notifiers Mar 17 17:24:55.313699 kernel: rcu: Hierarchical SRCU implementation. Mar 17 17:24:55.313712 kernel: rcu: Max phase no-delay instances is 400. Mar 17 17:24:55.313719 kernel: Remapping and enabling EFI services. Mar 17 17:24:55.313727 kernel: smp: Bringing up secondary CPUs ... Mar 17 17:24:55.313734 kernel: Detected PIPT I-cache on CPU1 Mar 17 17:24:55.313741 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000 Mar 17 17:24:55.313750 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:24:55.313757 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Mar 17 17:24:55.313764 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 17:24:55.313771 kernel: SMP: Total of 2 processors activated. 
Mar 17 17:24:55.313779 kernel: CPU features: detected: 32-bit EL0 Support Mar 17 17:24:55.313787 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence Mar 17 17:24:55.313795 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Mar 17 17:24:55.313802 kernel: CPU features: detected: CRC32 instructions Mar 17 17:24:55.313809 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Mar 17 17:24:55.313816 kernel: CPU features: detected: LSE atomic instructions Mar 17 17:24:55.313824 kernel: CPU features: detected: Privileged Access Never Mar 17 17:24:55.313831 kernel: CPU: All CPU(s) started at EL1 Mar 17 17:24:55.313838 kernel: alternatives: applying system-wide alternatives Mar 17 17:24:55.313845 kernel: devtmpfs: initialized Mar 17 17:24:55.313854 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 17:24:55.313862 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 17 17:24:55.313869 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 17:24:55.313876 kernel: SMBIOS 3.1.0 present. 
Mar 17 17:24:55.313883 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024 Mar 17 17:24:55.313890 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 17:24:55.313898 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 17 17:24:55.313905 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 17 17:24:55.313915 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 17 17:24:55.313922 kernel: audit: initializing netlink subsys (disabled) Mar 17 17:24:55.313929 kernel: audit: type=2000 audit(0.047:1): state=initialized audit_enabled=0 res=1 Mar 17 17:24:55.313937 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 17:24:55.313944 kernel: cpuidle: using governor menu Mar 17 17:24:55.313951 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 17 17:24:55.313958 kernel: ASID allocator initialised with 32768 entries Mar 17 17:24:55.313965 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 17:24:55.313973 kernel: Serial: AMBA PL011 UART driver Mar 17 17:24:55.313981 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Mar 17 17:24:55.313988 kernel: Modules: 0 pages in range for non-PLT usage Mar 17 17:24:55.313995 kernel: Modules: 508944 pages in range for PLT usage Mar 17 17:24:55.314003 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 17:24:55.314010 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 17 17:24:55.314017 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 17 17:24:55.314024 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 17 17:24:55.314032 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 17:24:55.314039 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 17 17:24:55.314048 kernel: HugeTLB: 
registered 64.0 KiB page size, pre-allocated 0 pages Mar 17 17:24:55.314055 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 17 17:24:55.314062 kernel: ACPI: Added _OSI(Module Device) Mar 17 17:24:55.314069 kernel: ACPI: Added _OSI(Processor Device) Mar 17 17:24:55.314076 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 17:24:55.314084 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 17:24:55.314091 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 17:24:55.314098 kernel: ACPI: Interpreter enabled Mar 17 17:24:55.314106 kernel: ACPI: Using GIC for interrupt routing Mar 17 17:24:55.314113 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA Mar 17 17:24:55.314122 kernel: printk: console [ttyAMA0] enabled Mar 17 17:24:55.314129 kernel: printk: bootconsole [pl11] disabled Mar 17 17:24:55.314136 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA Mar 17 17:24:55.314143 kernel: iommu: Default domain type: Translated Mar 17 17:24:55.314150 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 17 17:24:55.314157 kernel: efivars: Registered efivars operations Mar 17 17:24:55.314164 kernel: vgaarb: loaded Mar 17 17:24:55.314172 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 17 17:24:55.314179 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 17:24:55.314188 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 17:24:55.314195 kernel: pnp: PnP ACPI init Mar 17 17:24:55.314202 kernel: pnp: PnP ACPI: found 0 devices Mar 17 17:24:55.314210 kernel: NET: Registered PF_INET protocol family Mar 17 17:24:55.314217 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 17 17:24:55.314224 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 17 17:24:55.314232 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 
17:24:55.314239 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 17 17:24:55.314247 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 17 17:24:55.314255 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 17 17:24:55.314262 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 17:24:55.314269 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 17:24:55.314277 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 17:24:55.314284 kernel: PCI: CLS 0 bytes, default 64 Mar 17 17:24:55.314291 kernel: kvm [1]: HYP mode not available Mar 17 17:24:55.314309 kernel: Initialise system trusted keyrings Mar 17 17:24:55.314316 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 17 17:24:55.314326 kernel: Key type asymmetric registered Mar 17 17:24:55.314333 kernel: Asymmetric key parser 'x509' registered Mar 17 17:24:55.314340 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 17 17:24:55.314347 kernel: io scheduler mq-deadline registered Mar 17 17:24:55.314354 kernel: io scheduler kyber registered Mar 17 17:24:55.314361 kernel: io scheduler bfq registered Mar 17 17:24:55.314369 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 17:24:55.314376 kernel: thunder_xcv, ver 1.0 Mar 17 17:24:55.314383 kernel: thunder_bgx, ver 1.0 Mar 17 17:24:55.314390 kernel: nicpf, ver 1.0 Mar 17 17:24:55.314399 kernel: nicvf, ver 1.0 Mar 17 17:24:55.314546 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 17 17:24:55.314617 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-17T17:24:54 UTC (1742232294) Mar 17 17:24:55.314628 kernel: efifb: probing for efifb Mar 17 17:24:55.314635 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k Mar 17 17:24:55.314642 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 Mar 17 17:24:55.314650 kernel: efifb: scrolling: 
redraw Mar 17 17:24:55.314659 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 17 17:24:55.314666 kernel: Console: switching to colour frame buffer device 128x48 Mar 17 17:24:55.314673 kernel: fb0: EFI VGA frame buffer device Mar 17 17:24:55.314680 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping .... Mar 17 17:24:55.314688 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 17 17:24:55.314695 kernel: No ACPI PMU IRQ for CPU0 Mar 17 17:24:55.314702 kernel: No ACPI PMU IRQ for CPU1 Mar 17 17:24:55.314709 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 1 counters available Mar 17 17:24:55.314717 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 17 17:24:55.314725 kernel: watchdog: Hard watchdog permanently disabled Mar 17 17:24:55.314733 kernel: NET: Registered PF_INET6 protocol family Mar 17 17:24:55.314740 kernel: Segment Routing with IPv6 Mar 17 17:24:55.314747 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 17:24:55.314754 kernel: NET: Registered PF_PACKET protocol family Mar 17 17:24:55.314761 kernel: Key type dns_resolver registered Mar 17 17:24:55.314768 kernel: registered taskstats version 1 Mar 17 17:24:55.314775 kernel: Loading compiled-in X.509 certificates Mar 17 17:24:55.314783 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 74c9b4f5dfad711856d7363c976664fc02c1e24c' Mar 17 17:24:55.314790 kernel: Key type .fscrypt registered Mar 17 17:24:55.314798 kernel: Key type fscrypt-provisioning registered Mar 17 17:24:55.314805 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 17:24:55.314813 kernel: ima: Allocated hash algorithm: sha1 Mar 17 17:24:55.314820 kernel: ima: No architecture policies found Mar 17 17:24:55.314827 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 17 17:24:55.314834 kernel: clk: Disabling unused clocks Mar 17 17:24:55.314841 kernel: Freeing unused kernel memory: 39744K Mar 17 17:24:55.314849 kernel: Run /init as init process Mar 17 17:24:55.314857 kernel: with arguments: Mar 17 17:24:55.314865 kernel: /init Mar 17 17:24:55.314871 kernel: with environment: Mar 17 17:24:55.314878 kernel: HOME=/ Mar 17 17:24:55.314885 kernel: TERM=linux Mar 17 17:24:55.314892 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 17:24:55.314902 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 17 17:24:55.314911 systemd[1]: Detected virtualization microsoft. Mar 17 17:24:55.314920 systemd[1]: Detected architecture arm64. Mar 17 17:24:55.314928 systemd[1]: Running in initrd. Mar 17 17:24:55.314935 systemd[1]: No hostname configured, using default hostname. Mar 17 17:24:55.314943 systemd[1]: Hostname set to . Mar 17 17:24:55.314951 systemd[1]: Initializing machine ID from random generator. Mar 17 17:24:55.314958 systemd[1]: Queued start job for default target initrd.target. Mar 17 17:24:55.314966 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:24:55.314974 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:24:55.314983 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Mar 17 17:24:55.314991 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 17:24:55.314999 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 17 17:24:55.315007 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 17 17:24:55.315016 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 17 17:24:55.315024 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 17 17:24:55.315032 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:24:55.315041 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:24:55.315048 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:24:55.315056 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:24:55.315064 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:24:55.315071 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:24:55.315079 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:24:55.315086 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:24:55.315094 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 17 17:24:55.315102 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 17 17:24:55.315111 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:24:55.315119 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:24:55.315127 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:24:55.315134 systemd[1]: Reached target sockets.target - Socket Units. 
Mar 17 17:24:55.315142 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 17 17:24:55.315150 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:24:55.315158 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 17 17:24:55.315165 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 17:24:55.315175 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:24:55.315182 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:24:55.315209 systemd-journald[218]: Collecting audit messages is disabled. Mar 17 17:24:55.315229 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:24:55.315239 systemd-journald[218]: Journal started Mar 17 17:24:55.315257 systemd-journald[218]: Runtime Journal (/run/log/journal/512df49dcb784633a071ef45a5cc2063) is 8.0M, max 78.5M, 70.5M free. Mar 17 17:24:55.328371 systemd-modules-load[219]: Inserted module 'overlay' Mar 17 17:24:55.353394 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:24:55.353425 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 17:24:55.363422 systemd-modules-load[219]: Inserted module 'br_netfilter' Mar 17 17:24:55.368700 kernel: Bridge firewalling registered Mar 17 17:24:55.367910 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 17 17:24:55.375251 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:24:55.394314 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 17:24:55.404916 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:24:55.414750 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 17 17:24:55.436707 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:24:55.445476 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:24:55.467461 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:24:55.490468 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:24:55.497986 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:24:55.506725 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:24:55.518071 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:24:55.531328 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:24:55.559626 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 17 17:24:55.574175 dracut-cmdline[252]: dracut-dracut-053 Mar 17 17:24:55.576933 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 17 17:24:55.586183 dracut-cmdline[252]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405 Mar 17 17:24:55.624576 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:24:55.646272 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Mar 17 17:24:55.663257 systemd-resolved[260]: Positive Trust Anchors:
Mar 17 17:24:55.663271 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 17:24:55.663392 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 17:24:55.666543 systemd-resolved[260]: Defaulting to hostname 'linux'.
Mar 17 17:24:55.667449 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 17:24:55.683147 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:24:55.774310 kernel: SCSI subsystem initialized
Mar 17 17:24:55.781315 kernel: Loading iSCSI transport class v2.0-870.
Mar 17 17:24:55.792328 kernel: iscsi: registered transport (tcp)
Mar 17 17:24:55.809599 kernel: iscsi: registered transport (qla4xxx)
Mar 17 17:24:55.809660 kernel: QLogic iSCSI HBA Driver
Mar 17 17:24:55.843194 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:24:55.859605 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 17 17:24:55.892440 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 17 17:24:55.892507 kernel: device-mapper: uevent: version 1.0.3
Mar 17 17:24:55.898943 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 17 17:24:55.948325 kernel: raid6: neonx8 gen() 15747 MB/s
Mar 17 17:24:55.968315 kernel: raid6: neonx4 gen() 15653 MB/s
Mar 17 17:24:55.988305 kernel: raid6: neonx2 gen() 13196 MB/s
Mar 17 17:24:56.009307 kernel: raid6: neonx1 gen() 10483 MB/s
Mar 17 17:24:56.029306 kernel: raid6: int64x8 gen() 6956 MB/s
Mar 17 17:24:56.050306 kernel: raid6: int64x4 gen() 7337 MB/s
Mar 17 17:24:56.071306 kernel: raid6: int64x2 gen() 6125 MB/s
Mar 17 17:24:56.094783 kernel: raid6: int64x1 gen() 5059 MB/s
Mar 17 17:24:56.094806 kernel: raid6: using algorithm neonx8 gen() 15747 MB/s
Mar 17 17:24:56.118299 kernel: raid6: .... xor() 11919 MB/s, rmw enabled
Mar 17 17:24:56.118314 kernel: raid6: using neon recovery algorithm
Mar 17 17:24:56.127308 kernel: xor: measuring software checksum speed
Mar 17 17:24:56.133786 kernel: 8regs : 18637 MB/sec
Mar 17 17:24:56.133798 kernel: 32regs : 19679 MB/sec
Mar 17 17:24:56.137134 kernel: arm64_neon : 26874 MB/sec
Mar 17 17:24:56.141697 kernel: xor: using function: arm64_neon (26874 MB/sec)
Mar 17 17:24:56.192310 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 17 17:24:56.203580 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:24:56.226463 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:24:56.248262 systemd-udevd[438]: Using default interface naming scheme 'v255'.
Mar 17 17:24:56.253608 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:24:56.274519 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 17 17:24:56.286607 dracut-pre-trigger[449]: rd.md=0: removing MD RAID activation
Mar 17 17:24:56.314734 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:24:56.333911 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:24:56.372784 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:24:56.391507 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 17 17:24:56.417615 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:24:56.432526 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:24:56.448747 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:24:56.456506 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:24:56.486589 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 17 17:24:56.504610 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:24:56.516797 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:24:56.563972 kernel: hv_vmbus: Vmbus version:5.3
Mar 17 17:24:56.563997 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 17 17:24:56.564007 kernel: hv_vmbus: registering driver hid_hyperv
Mar 17 17:24:56.564017 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Mar 17 17:24:56.516967 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:24:56.586873 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0
Mar 17 17:24:56.531772 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:24:56.619085 kernel: hv_vmbus: registering driver hyperv_keyboard
Mar 17 17:24:56.619113 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Mar 17 17:24:56.619331 kernel: hv_vmbus: registering driver hv_netvsc
Mar 17 17:24:56.619348 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1
Mar 17 17:24:56.540399 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:24:56.651556 kernel: hv_vmbus: registering driver hv_storvsc
Mar 17 17:24:56.540599 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:24:56.699290 kernel: PTP clock support registered
Mar 17 17:24:56.699333 kernel: hv_utils: Registering HyperV Utility Driver
Mar 17 17:24:56.699343 kernel: hv_vmbus: registering driver hv_utils
Mar 17 17:24:56.699352 kernel: scsi host0: storvsc_host_t
Mar 17 17:24:56.988943 kernel: scsi host1: storvsc_host_t
Mar 17 17:24:56.989101 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Mar 17 17:24:56.989199 kernel: hv_utils: Heartbeat IC version 3.0
Mar 17 17:24:56.989209 kernel: hv_utils: TimeSync IC version 4.0
Mar 17 17:24:56.989218 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 0
Mar 17 17:24:56.989313 kernel: hv_utils: Shutdown IC version 3.2
Mar 17 17:24:56.570275 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:24:56.662576 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:24:56.711153 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:24:56.711322 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:24:56.988539 systemd-resolved[260]: Clock change detected. Flushing caches.
Mar 17 17:24:57.052836 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Mar 17 17:24:57.059081 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: VF slot 1 added
Mar 17 17:24:57.059205 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 17 17:24:57.059216 kernel: hv_vmbus: registering driver hv_pci
Mar 17 17:24:57.059226 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Mar 17 17:24:57.023527 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:24:57.073582 kernel: hv_pci a59db578-9ebc-47e7-8fc5-4c7bcb135f35: PCI VMBus probing: Using version 0x10004
Mar 17 17:24:57.173092 kernel: hv_pci a59db578-9ebc-47e7-8fc5-4c7bcb135f35: PCI host bridge to bus 9ebc:00
Mar 17 17:24:57.173203 kernel: pci_bus 9ebc:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Mar 17 17:24:57.173296 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Mar 17 17:24:57.173393 kernel: pci_bus 9ebc:00: No busn resource found for root bus, will use [bus 00-ff]
Mar 17 17:24:57.173474 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 17 17:24:57.173555 kernel: pci 9ebc:00:02.0: [15b3:1018] type 00 class 0x020000
Mar 17 17:24:57.173648 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 17 17:24:57.173730 kernel: pci 9ebc:00:02.0: reg 0x10: [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 17 17:24:57.173829 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Mar 17 17:24:57.173915 kernel: pci 9ebc:00:02.0: enabling Extended Tags
Mar 17 17:24:57.173996 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Mar 17 17:24:57.174081 kernel: pci 9ebc:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 9ebc:00:02.0 (capable of 126.016 Gb/s with 8.0 GT/s PCIe x16 link)
Mar 17 17:24:57.174161 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:24:57.174171 kernel: pci_bus 9ebc:00: busn_res: [bus 00-ff] end is updated to 00
Mar 17 17:24:57.174245 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 17 17:24:57.174325 kernel: pci 9ebc:00:02.0: BAR 0: assigned [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 17 17:24:57.146964 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:24:57.172003 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:24:57.216754 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:24:57.241374 kernel: mlx5_core 9ebc:00:02.0: enabling device (0000 -> 0002)
Mar 17 17:24:57.450688 kernel: mlx5_core 9ebc:00:02.0: firmware version: 16.30.1284
Mar 17 17:24:57.450854 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: VF registering: eth1
Mar 17 17:24:57.450953 kernel: mlx5_core 9ebc:00:02.0 eth1: joined to eth0
Mar 17 17:24:57.451046 kernel: mlx5_core 9ebc:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Mar 17 17:24:57.459842 kernel: mlx5_core 9ebc:00:02.0 enP40636s1: renamed from eth1
Mar 17 17:24:57.744405 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Mar 17 17:24:57.856838 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (482)
Mar 17 17:24:57.872409 kernel: BTRFS: device fsid c0c482e3-6885-4a4e-b31c-6bc8f8c403e7 devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (496)
Mar 17 17:24:57.873779 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Mar 17 17:24:57.893636 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 17 17:24:57.910172 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Mar 17 17:24:57.917890 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Mar 17 17:24:57.947053 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 17 17:24:57.973220 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:24:57.980828 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:24:58.989941 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:24:58.990002 disk-uuid[605]: The operation has completed successfully.
Mar 17 17:24:59.048488 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 17 17:24:59.049827 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 17 17:24:59.078008 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 17 17:24:59.091391 sh[691]: Success
Mar 17 17:24:59.136929 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 17 17:24:59.335167 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 17 17:24:59.357949 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 17 17:24:59.367461 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 17 17:24:59.400493 kernel: BTRFS info (device dm-0): first mount of filesystem c0c482e3-6885-4a4e-b31c-6bc8f8c403e7
Mar 17 17:24:59.400552 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:24:59.400563 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 17 17:24:59.411647 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 17 17:24:59.415626 kernel: BTRFS info (device dm-0): using free space tree
Mar 17 17:24:59.806324 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 17 17:24:59.811454 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 17 17:24:59.832120 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 17 17:24:59.840988 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 17 17:24:59.875125 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:24:59.875177 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:24:59.879541 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:24:59.902834 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:24:59.911438 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 17 17:24:59.922508 kernel: BTRFS info (device sda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:24:59.927494 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 17 17:24:59.943976 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 17 17:24:59.952466 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:24:59.970348 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:25:00.011004 systemd-networkd[875]: lo: Link UP
Mar 17 17:25:00.011014 systemd-networkd[875]: lo: Gained carrier
Mar 17 17:25:00.012564 systemd-networkd[875]: Enumeration completed
Mar 17 17:25:00.015909 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:25:00.016628 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:25:00.016631 systemd-networkd[875]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:25:00.022390 systemd[1]: Reached target network.target - Network.
Mar 17 17:25:00.107820 kernel: mlx5_core 9ebc:00:02.0 enP40636s1: Link up
Mar 17 17:25:00.144828 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: Data path switched to VF: enP40636s1
Mar 17 17:25:00.145321 systemd-networkd[875]: enP40636s1: Link UP
Mar 17 17:25:00.145413 systemd-networkd[875]: eth0: Link UP
Mar 17 17:25:00.145538 systemd-networkd[875]: eth0: Gained carrier
Mar 17 17:25:00.145547 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:25:00.168282 systemd-networkd[875]: enP40636s1: Gained carrier
Mar 17 17:25:00.182863 systemd-networkd[875]: eth0: DHCPv4 address 10.200.20.36/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 17 17:25:00.889612 ignition[873]: Ignition 2.20.0
Mar 17 17:25:00.889623 ignition[873]: Stage: fetch-offline
Mar 17 17:25:00.893269 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:25:00.889657 ignition[873]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:00.889664 ignition[873]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:00.889764 ignition[873]: parsed url from cmdline: ""
Mar 17 17:25:00.889768 ignition[873]: no config URL provided
Mar 17 17:25:00.889772 ignition[873]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 17:25:00.889779 ignition[873]: no config at "/usr/lib/ignition/user.ign"
Mar 17 17:25:00.925179 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 17 17:25:00.889784 ignition[873]: failed to fetch config: resource requires networking
Mar 17 17:25:00.889973 ignition[873]: Ignition finished successfully
Mar 17 17:25:00.945675 ignition[885]: Ignition 2.20.0
Mar 17 17:25:00.945681 ignition[885]: Stage: fetch
Mar 17 17:25:00.946039 ignition[885]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:00.946049 ignition[885]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:00.946162 ignition[885]: parsed url from cmdline: ""
Mar 17 17:25:00.946174 ignition[885]: no config URL provided
Mar 17 17:25:00.946179 ignition[885]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 17:25:00.946186 ignition[885]: no config at "/usr/lib/ignition/user.ign"
Mar 17 17:25:00.946211 ignition[885]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Mar 17 17:25:01.046877 ignition[885]: GET result: OK
Mar 17 17:25:01.047006 ignition[885]: config has been read from IMDS userdata
Mar 17 17:25:01.047082 ignition[885]: parsing config with SHA512: 113cefce37b638a5822866ff76285cc272a1115be0031e6af3ae3213bdd9948559c4af48b465f163a121afcf37ba30c599de216d6eb4b8c3b1826db2575835ba
Mar 17 17:25:01.051962 unknown[885]: fetched base config from "system"
Mar 17 17:25:01.052428 ignition[885]: fetch: fetch complete
Mar 17 17:25:01.051989 unknown[885]: fetched base config from "system"
Mar 17 17:25:01.052447 ignition[885]: fetch: fetch passed
Mar 17 17:25:01.051996 unknown[885]: fetched user config from "azure"
Mar 17 17:25:01.052496 ignition[885]: Ignition finished successfully
Mar 17 17:25:01.057127 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 17 17:25:01.095673 ignition[891]: Ignition 2.20.0
Mar 17 17:25:01.074972 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 17 17:25:01.095680 ignition[891]: Stage: kargs
Mar 17 17:25:01.103444 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 17 17:25:01.095897 ignition[891]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:01.095907 ignition[891]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:01.127110 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 17 17:25:01.096978 ignition[891]: kargs: kargs passed
Mar 17 17:25:01.097035 ignition[891]: Ignition finished successfully
Mar 17 17:25:01.153889 ignition[897]: Ignition 2.20.0
Mar 17 17:25:01.156947 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 17 17:25:01.153896 ignition[897]: Stage: disks
Mar 17 17:25:01.163162 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 17 17:25:01.154076 ignition[897]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:01.172694 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 17 17:25:01.154086 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:01.184768 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:25:01.154982 ignition[897]: disks: disks passed
Mar 17 17:25:01.193252 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 17:25:01.155028 ignition[897]: Ignition finished successfully
Mar 17 17:25:01.204372 systemd[1]: Reached target basic.target - Basic System.
Mar 17 17:25:01.231045 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 17 17:25:01.315755 systemd-fsck[907]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Mar 17 17:25:01.325355 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 17 17:25:01.341036 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 17 17:25:01.362962 systemd-networkd[875]: eth0: Gained IPv6LL
Mar 17 17:25:01.405878 kernel: EXT4-fs (sda9): mounted filesystem 6b579bf2-7716-4d59-98eb-b92ea668693e r/w with ordered data mode. Quota mode: none.
Mar 17 17:25:01.406420 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 17 17:25:01.411130 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:25:01.465886 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:25:01.475424 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 17 17:25:01.483007 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 17 17:25:01.520211 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (918)
Mar 17 17:25:01.520236 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:25:01.500909 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 17 17:25:01.537916 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:25:01.500951 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:25:01.553339 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:25:01.534685 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 17 17:25:01.560080 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 17 17:25:01.575167 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:25:01.575065 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:25:02.066988 systemd-networkd[875]: enP40636s1: Gained IPv6LL
Mar 17 17:25:02.167698 coreos-metadata[920]: Mar 17 17:25:02.167 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 17 17:25:02.177306 coreos-metadata[920]: Mar 17 17:25:02.177 INFO Fetch successful
Mar 17 17:25:02.182671 coreos-metadata[920]: Mar 17 17:25:02.182 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Mar 17 17:25:02.204481 coreos-metadata[920]: Mar 17 17:25:02.204 INFO Fetch successful
Mar 17 17:25:02.218649 coreos-metadata[920]: Mar 17 17:25:02.218 INFO wrote hostname ci-4152.2.2-a-f9f073f8c6 to /sysroot/etc/hostname
Mar 17 17:25:02.227901 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 17 17:25:02.493901 initrd-setup-root[948]: cut: /sysroot/etc/passwd: No such file or directory
Mar 17 17:25:02.531847 initrd-setup-root[955]: cut: /sysroot/etc/group: No such file or directory
Mar 17 17:25:02.551611 initrd-setup-root[962]: cut: /sysroot/etc/shadow: No such file or directory
Mar 17 17:25:02.576142 initrd-setup-root[969]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 17 17:25:03.503953 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 17 17:25:03.521109 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 17 17:25:03.529982 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 17 17:25:03.547399 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 17 17:25:03.555818 kernel: BTRFS info (device sda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:25:03.572714 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 17 17:25:03.585500 ignition[1037]: INFO : Ignition 2.20.0
Mar 17 17:25:03.585500 ignition[1037]: INFO : Stage: mount
Mar 17 17:25:03.594338 ignition[1037]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:03.594338 ignition[1037]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:03.594338 ignition[1037]: INFO : mount: mount passed
Mar 17 17:25:03.594338 ignition[1037]: INFO : Ignition finished successfully
Mar 17 17:25:03.590893 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 17 17:25:03.615025 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 17 17:25:03.635171 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:25:03.656837 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1049)
Mar 17 17:25:03.669402 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:25:03.669422 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:25:03.673495 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:25:03.680829 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:25:03.681441 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:25:03.709851 ignition[1066]: INFO : Ignition 2.20.0
Mar 17 17:25:03.709851 ignition[1066]: INFO : Stage: files
Mar 17 17:25:03.719090 ignition[1066]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:03.719090 ignition[1066]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:03.719090 ignition[1066]: DEBUG : files: compiled without relabeling support, skipping
Mar 17 17:25:03.719090 ignition[1066]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 17 17:25:03.719090 ignition[1066]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 17 17:25:03.803163 ignition[1066]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 17 17:25:03.810703 ignition[1066]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 17 17:25:03.810703 ignition[1066]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 17 17:25:03.803559 unknown[1066]: wrote ssh authorized keys file for user: core
Mar 17 17:25:03.861348 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:25:03.872133 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Mar 17 17:25:03.898497 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 17 17:25:04.022136 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:25:04.034989 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-arm64.raw: attempt #1
Mar 17 17:25:04.471288 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 17 17:25:04.714998 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:25:04.714998 ignition[1066]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 17 17:25:04.748092 ignition[1066]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:25:04.760243 ignition[1066]: INFO : files: files passed
Mar 17 17:25:04.760243 ignition[1066]: INFO : Ignition finished successfully
Mar 17 17:25:04.760122 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 17 17:25:04.818159 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 17 17:25:04.831020 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 17 17:25:04.841342 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 17 17:25:04.847763 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 17 17:25:04.879845 initrd-setup-root-after-ignition[1099]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:25:04.887746 initrd-setup-root-after-ignition[1095]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:25:04.887746 initrd-setup-root-after-ignition[1095]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:25:04.879950 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:25:04.894369 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 17 17:25:04.929087 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 17 17:25:04.963919 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 17 17:25:04.964037 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 17 17:25:04.977337 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 17 17:25:04.989734 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 17 17:25:05.000371 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 17 17:25:05.015089 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 17 17:25:05.035879 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:25:05.059144 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 17 17:25:05.076511 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:25:05.083679 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:25:05.096224 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:25:05.107343 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:25:05.107517 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:25:05.123704 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:25:05.135994 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:25:05.147087 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:25:05.157922 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:25:05.170662 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 17 17:25:05.183318 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 17 17:25:05.195513 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:25:05.208381 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 17 17:25:05.221763 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 17 17:25:05.233047 systemd[1]: Stopped target swap.target - Swaps.
Mar 17 17:25:05.243118 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 17 17:25:05.243289 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:25:05.258945 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:25:05.271003 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:25:05.283821 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 17 17:25:05.283934 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:25:05.298728 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 17 17:25:05.298922 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:25:05.317769 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 17 17:25:05.317954 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:25:05.330483 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 17 17:25:05.330633 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 17 17:25:05.342168 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 17 17:25:05.342322 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 17 17:25:05.375952 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 17 17:25:05.406902 ignition[1119]: INFO : Ignition 2.20.0
Mar 17 17:25:05.406902 ignition[1119]: INFO : Stage: umount
Mar 17 17:25:05.406902 ignition[1119]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:25:05.406902 ignition[1119]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:25:05.406902 ignition[1119]: INFO : umount: umount passed
Mar 17 17:25:05.406902 ignition[1119]: INFO : Ignition finished successfully
Mar 17 17:25:05.393339 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 17 17:25:05.393527 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:25:05.404043 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 17 17:25:05.413478 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 17 17:25:05.413641 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:25:05.423975 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 17 17:25:05.424084 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:25:05.440861 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 17 17:25:05.440952 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 17 17:25:05.459918 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 17 17:25:05.460057 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 17 17:25:05.476652 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 17 17:25:05.476714 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 17 17:25:05.488794 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 17 17:25:05.488861 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 17 17:25:05.500288 systemd[1]: Stopped target network.target - Network.
Mar 17 17:25:05.505895 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 17 17:25:05.505974 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:25:05.519726 systemd[1]: Stopped target paths.target - Path Units.
Mar 17 17:25:05.536536 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 17 17:25:05.542241 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:25:05.550501 systemd[1]: Stopped target slices.target - Slice Units.
Mar 17 17:25:05.560813 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 17 17:25:05.570873 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 17 17:25:05.570945 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:25:05.581569 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 17 17:25:05.581615 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:25:05.592698 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 17 17:25:05.592757 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 17 17:25:05.598288 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 17 17:25:05.598332 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 17 17:25:05.609285 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 17 17:25:05.620346 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 17 17:25:05.636070 systemd-networkd[875]: eth0: DHCPv6 lease lost
Mar 17 17:25:05.639179 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 17 17:25:05.639849 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 17 17:25:05.834974 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: Data path switched from VF: enP40636s1
Mar 17 17:25:05.639942 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 17 17:25:05.648569 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 17 17:25:05.650745 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 17 17:25:05.661397 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 17 17:25:05.661599 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 17 17:25:05.676326 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 17 17:25:05.676385 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:25:05.704021 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 17 17:25:05.712950 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 17 17:25:05.713030 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:25:05.725186 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 17 17:25:05.725246 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:25:05.736513 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 17 17:25:05.736591 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:25:05.747074 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 17 17:25:05.747122 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:25:05.758891 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:25:05.810647 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 17 17:25:05.810795 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:25:05.831464 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 17 17:25:05.831523 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:25:05.840816 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 17 17:25:05.840865 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:25:05.853025 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 17 17:25:05.853082 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:25:05.869365 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 17 17:25:05.869425 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:25:05.886233 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:25:05.886300 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:25:05.931988 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 17 17:25:05.944884 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 17 17:25:05.944956 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:25:05.958639 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:25:05.958699 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:25:05.972223 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 17 17:25:05.972320 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 17 17:25:05.984396 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 17 17:25:05.984487 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 17 17:25:06.080640 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 17 17:25:06.080758 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 17 17:25:06.089690 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 17 17:25:06.100041 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 17 17:25:06.100104 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 17 17:25:06.127056 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 17 17:25:06.139167 systemd[1]: Switching root.
Mar 17 17:25:06.243542 systemd-journald[218]: Journal stopped
Mar 17 17:25:16.551183 systemd-journald[218]: Received SIGTERM from PID 1 (systemd).
Mar 17 17:25:16.551208 kernel: SELinux: policy capability network_peer_controls=1
Mar 17 17:25:16.551219 kernel: SELinux: policy capability open_perms=1
Mar 17 17:25:16.551231 kernel: SELinux: policy capability extended_socket_class=1
Mar 17 17:25:16.551239 kernel: SELinux: policy capability always_check_network=0
Mar 17 17:25:16.551246 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 17 17:25:16.551254 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 17 17:25:16.551262 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 17 17:25:16.551270 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 17 17:25:16.551278 kernel: audit: type=1403 audit(1742232307.388:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 17 17:25:16.551288 systemd[1]: Successfully loaded SELinux policy in 153.592ms.
Mar 17 17:25:16.551297 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.287ms.
Mar 17 17:25:16.551307 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 17 17:25:16.551316 systemd[1]: Detected virtualization microsoft.
Mar 17 17:25:16.551325 systemd[1]: Detected architecture arm64.
Mar 17 17:25:16.551335 systemd[1]: Detected first boot.
Mar 17 17:25:16.551344 systemd[1]: Hostname set to .
Mar 17 17:25:16.551353 systemd[1]: Initializing machine ID from random generator.
Mar 17 17:25:16.551362 zram_generator::config[1162]: No configuration found.
Mar 17 17:25:16.551371 systemd[1]: Populated /etc with preset unit settings.
Mar 17 17:25:16.551381 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 17 17:25:16.551391 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 17 17:25:16.551400 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:25:16.551409 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 17 17:25:16.551418 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 17 17:25:16.551428 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 17 17:25:16.551437 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 17 17:25:16.551446 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 17 17:25:16.551456 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 17 17:25:16.551465 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 17 17:25:16.551474 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 17 17:25:16.551483 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:25:16.551492 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:25:16.551501 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 17 17:25:16.551510 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 17 17:25:16.551520 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 17 17:25:16.551529 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 17:25:16.551539 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 17 17:25:16.551549 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:25:16.551558 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 17 17:25:16.551569 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 17 17:25:16.551578 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:25:16.551587 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 17 17:25:16.551596 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:25:16.551606 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:25:16.551616 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 17:25:16.551625 systemd[1]: Reached target swap.target - Swaps.
Mar 17 17:25:16.551634 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 17 17:25:16.551643 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 17 17:25:16.551653 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:25:16.551662 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:25:16.551673 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:25:16.551682 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 17 17:25:16.551691 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 17 17:25:16.551701 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 17 17:25:16.551710 systemd[1]: Mounting media.mount - External Media Directory...
Mar 17 17:25:16.551719 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 17 17:25:16.551729 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 17 17:25:16.551739 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 17 17:25:16.551748 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 17 17:25:16.551757 systemd[1]: Reached target machines.target - Containers.
Mar 17 17:25:16.551767 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 17 17:25:16.551776 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:25:16.551786 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 17 17:25:16.551795 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 17 17:25:16.551831 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:25:16.551843 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:25:16.551852 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:25:16.551862 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 17 17:25:16.551871 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:25:16.551881 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 17 17:25:16.551890 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 17 17:25:16.551899 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 17 17:25:16.551909 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 17 17:25:16.551919 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 17 17:25:16.551928 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 17 17:25:16.551938 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 17 17:25:16.551948 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 17 17:25:16.551957 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 17 17:25:16.551967 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:25:16.551992 systemd-journald[1241]: Collecting audit messages is disabled.
Mar 17 17:25:16.552014 systemd-journald[1241]: Journal started
Mar 17 17:25:16.552034 systemd-journald[1241]: Runtime Journal (/run/log/journal/d41336cba45946bc98b790884234ade7) is 8.0M, max 78.5M, 70.5M free.
Mar 17 17:25:14.999847 systemd[1]: Queued start job for default target multi-user.target.
Mar 17 17:25:15.248003 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 17 17:25:15.248349 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 17 17:25:15.248633 systemd[1]: systemd-journald.service: Consumed 3.078s CPU time.
Mar 17 17:25:16.561666 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 17 17:25:16.561728 systemd[1]: Stopped verity-setup.service.
Mar 17 17:25:16.579795 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 17 17:25:16.580590 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 17 17:25:16.586202 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 17 17:25:16.592375 systemd[1]: Mounted media.mount - External Media Directory.
Mar 17 17:25:16.597468 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 17 17:25:16.603240 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 17 17:25:16.609163 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 17 17:25:16.615865 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:25:16.622904 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:25:16.623077 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:25:16.630339 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:25:16.630478 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:25:16.638828 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 17 17:25:16.650453 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 17 17:25:16.687193 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 17 17:25:16.696645 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 17 17:25:16.696711 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:25:16.697827 kernel: loop: module loaded
Mar 17 17:25:16.703918 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 17 17:25:16.714971 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 17 17:25:16.722180 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 17 17:25:16.727459 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:25:16.902825 kernel: fuse: init (API version 7.39)
Mar 17 17:25:16.903052 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 17 17:25:16.912436 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 17 17:25:16.922221 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:25:16.924115 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 17 17:25:16.938149 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 17 17:25:16.946918 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 17 17:25:16.947083 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 17 17:25:16.958936 kernel: ACPI: bus type drm_connector registered
Mar 17 17:25:16.960790 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:25:16.968525 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:25:16.974684 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 17 17:25:16.974919 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 17 17:25:16.981003 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:25:16.981211 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:25:16.987450 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:25:16.993867 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 17 17:25:17.000881 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:25:17.014914 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 17 17:25:17.024990 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 17 17:25:17.031211 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:25:17.033020 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 17 17:25:17.042060 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 17 17:25:17.052520 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 17 17:25:17.059054 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 17 17:25:17.068901 udevadm[1290]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Mar 17 17:25:17.242598 systemd-journald[1241]: Time spent on flushing to /var/log/journal/d41336cba45946bc98b790884234ade7 is 24.246ms for 899 entries.
Mar 17 17:25:17.242598 systemd-journald[1241]: System Journal (/var/log/journal/d41336cba45946bc98b790884234ade7) is 8.0M, max 2.6G, 2.6G free.
Mar 17 17:25:17.305429 systemd-journald[1241]: Received client request to flush runtime journal.
Mar 17 17:25:17.291380 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 17 17:25:17.301310 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 17 17:25:17.308378 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 17 17:25:17.322737 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 17 17:25:17.332011 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 17 17:25:17.337556 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 17 17:25:17.345828 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:25:17.686831 kernel: loop0: detected capacity change from 0 to 116808
Mar 17 17:25:18.195993 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 17 17:25:18.196541 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 17 17:25:18.528759 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 17 17:25:18.541036 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 17 17:25:18.981383 systemd-tmpfiles[1312]: ACLs are not supported, ignoring.
Mar 17 17:25:18.981401 systemd-tmpfiles[1312]: ACLs are not supported, ignoring.
Mar 17 17:25:18.984997 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:25:19.809842 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 17 17:25:20.137933 kernel: loop1: detected capacity change from 0 to 28720
Mar 17 17:25:21.586820 kernel: loop2: detected capacity change from 0 to 113536
Mar 17 17:25:22.507576 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 17 17:25:22.520973 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:25:22.542368 systemd-udevd[1319]: Using default interface naming scheme 'v255'.
Mar 17 17:25:23.309824 kernel: loop3: detected capacity change from 0 to 189592
Mar 17 17:25:23.349845 kernel: loop4: detected capacity change from 0 to 116808
Mar 17 17:25:23.360822 kernel: loop5: detected capacity change from 0 to 28720
Mar 17 17:25:23.368815 kernel: loop6: detected capacity change from 0 to 113536
Mar 17 17:25:23.377828 kernel: loop7: detected capacity change from 0 to 189592
Mar 17 17:25:23.381847 (sd-merge)[1322]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Mar 17 17:25:23.382269 (sd-merge)[1322]: Merged extensions into '/usr'.
Mar 17 17:25:23.385895 systemd[1]: Reloading requested from client PID 1279 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 17 17:25:23.386019 systemd[1]: Reloading...
Mar 17 17:25:23.463894 zram_generator::config[1347]: No configuration found.
Mar 17 17:25:23.682535 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:25:23.747261 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 17 17:25:23.747573 systemd[1]: Reloading finished in 361 ms.
Mar 17 17:25:23.780468 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:25:23.788961 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 17 17:25:23.805004 systemd[1]: Starting ensure-sysext.service...
Mar 17 17:25:23.810197 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:25:23.816947 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 17 17:25:23.943848 kernel: mousedev: PS/2 mouse device common for all mice
Mar 17 17:25:23.996589 systemd-tmpfiles[1441]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 17 17:25:24.004147 systemd-tmpfiles[1441]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 17 17:25:24.004836 systemd-tmpfiles[1441]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 17 17:25:24.005050 systemd-tmpfiles[1441]: ACLs are not supported, ignoring.
Mar 17 17:25:24.005094 systemd-tmpfiles[1441]: ACLs are not supported, ignoring.
Mar 17 17:25:24.007735 systemd-tmpfiles[1441]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:25:24.007741 systemd-tmpfiles[1441]: Skipping /boot
Mar 17 17:25:24.026717 kernel: hv_vmbus: registering driver hv_balloon
Mar 17 17:25:24.026827 kernel: hv_vmbus: registering driver hyperv_fb
Mar 17 17:25:24.026853 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 17 17:25:24.026876 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 17 17:25:24.026898 kernel: hv_balloon: Memory hot add disabled on ARM64
Mar 17 17:25:24.026914 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 17 17:25:24.014729 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 17 17:25:24.028845 systemd-tmpfiles[1441]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:25:24.029211 systemd-tmpfiles[1441]: Skipping /boot
Mar 17 17:25:24.048921 kernel: Console: switching to colour dummy device 80x25
Mar 17 17:25:24.049321 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:25:24.061832 kernel: Console: switching to colour frame buffer device 128x48
Mar 17 17:25:24.080118 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:25:24.087450 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 17 17:25:24.097023 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 17 17:25:24.107033 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 17 17:25:24.114793 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 17 17:25:24.183199 systemd[1]: Reloading requested from client PID 1439 ('systemctl') (unit ensure-sysext.service)...
Mar 17 17:25:24.183214 systemd[1]: Reloading...
Mar 17 17:25:24.249848 zram_generator::config[1485]: No configuration found.
Mar 17 17:25:24.374356 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:25:24.401178 systemd-resolved[1454]: Positive Trust Anchors:
Mar 17 17:25:24.401702 systemd-resolved[1454]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 17:25:24.401795 systemd-resolved[1454]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 17:25:24.442012 systemd[1]: Reloading finished in 258 ms.
Mar 17 17:25:24.460564 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 17 17:25:24.472846 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 17 17:25:24.492849 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 17 17:25:24.503390 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:25:24.510223 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:25:24.517208 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:25:24.525516 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:25:24.531495 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:25:24.533232 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:25:24.541147 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:25:24.542868 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:25:24.550100 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:25:24.550372 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:25:24.558088 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:25:24.558350 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:25:24.572971 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:25:24.581059 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:25:24.587927 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:25:24.595076 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:25:24.604096 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:25:24.609554 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:25:24.609728 systemd[1]: Reached target time-set.target - System Time Set.
Mar 17 17:25:24.616181 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:25:24.617856 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:25:24.624507 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:25:24.624643 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:25:24.630663 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:25:24.630782 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:25:24.637859 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:25:24.637989 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:25:24.647883 systemd[1]: Finished ensure-sysext.service.
Mar 17 17:25:24.654662 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:25:24.654743 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:25:26.470423 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1584)
Mar 17 17:25:26.538860 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 17 17:25:26.550435 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 17 17:25:26.560013 systemd-networkd[1440]: lo: Link UP
Mar 17 17:25:26.560024 systemd-networkd[1440]: lo: Gained carrier
Mar 17 17:25:26.561787 systemd-networkd[1440]: Enumeration completed
Mar 17 17:25:26.561954 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:25:26.562158 systemd-networkd[1440]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:25:26.562162 systemd-networkd[1440]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:25:26.572437 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 17 17:25:26.615646 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 17 17:25:26.629845 kernel: mlx5_core 9ebc:00:02.0 enP40636s1: Link up
Mar 17 17:25:26.655826 kernel: hv_netvsc 002248c2-2c2e-0022-48c2-2c2e002248c2 eth0: Data path switched to VF: enP40636s1
Mar 17 17:25:26.656146 systemd-networkd[1440]: enP40636s1: Link UP
Mar 17 17:25:26.656228 systemd-networkd[1440]: eth0: Link UP
Mar 17 17:25:26.656231 systemd-networkd[1440]: eth0: Gained carrier
Mar 17 17:25:26.656245 systemd-networkd[1440]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:25:26.662964 systemd-networkd[1440]: enP40636s1: Gained carrier
Mar 17 17:25:26.668938 systemd-networkd[1440]: eth0: DHCPv4 address 10.200.20.36/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 17 17:25:26.670734 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 17 17:25:26.683971 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 17 17:25:26.745098 augenrules[1657]: No rules
Mar 17 17:25:26.746040 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:25:26.746936 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:25:26.756773 lvm[1652]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 17 17:25:26.788334 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 17 17:25:26.795244 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:25:26.808099 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 17 17:25:26.812368 lvm[1664]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 17 17:25:26.814953 systemd-resolved[1454]: Using system hostname 'ci-4152.2.2-a-f9f073f8c6'.
Mar 17 17:25:26.816862 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 17:25:26.822814 systemd[1]: Reached target network.target - Network.
Mar 17 17:25:26.827473 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:25:26.843420 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 17 17:25:27.076024 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:25:27.219170 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 17 17:25:27.227876 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 17 17:25:27.859252 systemd-networkd[1440]: eth0: Gained IPv6LL
Mar 17 17:25:27.862869 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 17 17:25:27.870265 systemd[1]: Reached target network-online.target - Network is Online.
Mar 17 17:25:27.986995 systemd-networkd[1440]: enP40636s1: Gained IPv6LL
Mar 17 17:25:29.991462 ldconfig[1258]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 17 17:25:30.009225 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 17 17:25:30.022030 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 17 17:25:30.030488 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 17 17:25:30.037029 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 17:25:30.042730 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 17 17:25:30.049677 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 17 17:25:30.056841 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 17 17:25:30.064127 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 17 17:25:30.071750 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 17 17:25:30.078900 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 17 17:25:30.078934 systemd[1]: Reached target paths.target - Path Units.
Mar 17 17:25:30.083835 systemd[1]: Reached target timers.target - Timer Units.
Mar 17 17:25:30.104316 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 17 17:25:30.111940 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 17 17:25:30.124484 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 17 17:25:30.130578 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 17 17:25:30.136299 systemd[1]: Reached target sockets.target - Socket Units.
Mar 17 17:25:30.141539 systemd[1]: Reached target basic.target - Basic System.
Mar 17 17:25:30.147062 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 17 17:25:30.147087 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 17 17:25:30.154910 systemd[1]: Starting chronyd.service - NTP client/server...
Mar 17 17:25:30.162973 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 17 17:25:30.175005 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 17 17:25:30.181757 (chronyd)[1676]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
Mar 17 17:25:30.185441 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 17 17:25:30.193317 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 17 17:25:30.200456 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 17 17:25:30.210144 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 17 17:25:30.210190 systemd[1]: hv_fcopy_daemon.service - Hyper-V FCOPY daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_fcopy).
Mar 17 17:25:30.212108 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Mar 17 17:25:30.212144 chronyd[1688]: chronyd version 4.6 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
Mar 17 17:25:30.219473 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Mar 17 17:25:30.220699 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:25:30.226290 KVP[1686]: KVP starting; pid is:1686
Mar 17 17:25:30.234078 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 17 17:25:30.239916 jq[1683]: false
Mar 17 17:25:30.242331 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 17 17:25:30.247452 chronyd[1688]: Timezone right/UTC failed leap second check, ignoring
Mar 17 17:25:30.247726 chronyd[1688]: Loaded seccomp filter (level 2)
Mar 17 17:25:30.251576 KVP[1686]: KVP LIC Version: 3.1
Mar 17 17:25:30.257279 kernel: hv_utils: KVP IC version 4.0
Mar 17 17:25:30.251850 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 17 17:25:30.260831 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found loop4
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found loop5
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found loop6
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found loop7
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found sda
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found sda1
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found sda2
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found sda3
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found usr
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found sda4
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found sda6
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found sda7
Mar 17 17:25:30.268944 extend-filesystems[1684]: Found sda9
Mar 17 17:25:30.268944 extend-filesystems[1684]: Checking size of /dev/sda9
Mar 17 17:25:30.409431 extend-filesystems[1684]: Old size kept for /dev/sda9
Mar 17 17:25:30.409431 extend-filesystems[1684]: Found sr0
Mar 17 17:25:30.276995 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 17 17:25:30.426745 dbus-daemon[1679]: [system] SELinux support is enabled
Mar 17 17:25:30.319015 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 17 17:25:30.328900 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 17 17:25:30.329902 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 17 17:25:30.331588 systemd[1]: Starting update-engine.service - Update Engine...
Mar 17 17:25:30.481488 update_engine[1710]: I20250317 17:25:30.448495 1710 main.cc:92] Flatcar Update Engine starting
Mar 17 17:25:30.481488 update_engine[1710]: I20250317 17:25:30.471051 1710 update_check_scheduler.cc:74] Next update check in 7m18s
Mar 17 17:25:30.357766 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 17 17:25:30.481767 jq[1715]: true
Mar 17 17:25:30.370935 systemd[1]: Started chronyd.service - NTP client/server.
Mar 17 17:25:30.384165 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 17 17:25:30.384329 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 17 17:25:30.384575 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 17 17:25:30.384708 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 17 17:25:30.409907 systemd[1]: motdgen.service: Deactivated successfully.
Mar 17 17:25:30.410116 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 17 17:25:30.426091 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 17 17:25:30.450926 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 17 17:25:30.462354 systemd-logind[1707]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Mar 17 17:25:30.467726 systemd-logind[1707]: New seat seat0.
Mar 17 17:25:30.473020 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 17 17:25:30.484357 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 17 17:25:30.484574 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 17 17:25:30.541607 jq[1734]: true
Mar 17 17:25:30.548823 (ntainerd)[1745]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 17 17:25:30.557455 coreos-metadata[1678]: Mar 17 17:25:30.549 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 17 17:25:30.557455 coreos-metadata[1678]: Mar 17 17:25:30.554 INFO Fetch successful
Mar 17 17:25:30.557455 coreos-metadata[1678]: Mar 17 17:25:30.554 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Mar 17 17:25:30.563037 coreos-metadata[1678]: Mar 17 17:25:30.559 INFO Fetch successful
Mar 17 17:25:30.563037 coreos-metadata[1678]: Mar 17 17:25:30.560 INFO Fetching http://168.63.129.16/machine/7648f1f9-7707-487f-9985-cd306b67513c/20537d66%2D6c25%2D4934%2D9ab4%2D753bd93b3847.%5Fci%2D4152.2.2%2Da%2Df9f073f8c6?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Mar 17 17:25:30.564088 coreos-metadata[1678]: Mar 17 17:25:30.564 INFO Fetch successful
Mar 17 17:25:30.566051 coreos-metadata[1678]: Mar 17 17:25:30.565 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Mar 17 17:25:30.579364 coreos-metadata[1678]: Mar 17 17:25:30.578 INFO Fetch successful
Mar 17 17:25:30.605593 dbus-daemon[1679]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 17 17:25:30.607037 systemd[1]: Started update-engine.service - Update Engine.
Mar 17 17:25:30.614044 tar[1729]: linux-arm64/helm
Mar 17 17:25:30.641523 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1731)
Mar 17 17:25:30.622006 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 17 17:25:30.622202 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 17 17:25:30.633930 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 17 17:25:30.634051 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 17 17:25:30.653941 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 17 17:25:30.691496 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 17 17:25:30.702654 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 17 17:25:30.797867 bash[1813]: Updated "/home/core/.ssh/authorized_keys"
Mar 17 17:25:30.802195 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 17 17:25:30.819744 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 17 17:25:30.960935 locksmithd[1786]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 17 17:25:31.108027 containerd[1745]: time="2025-03-17T17:25:31.107144680Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Mar 17 17:25:31.190672 containerd[1745]: time="2025-03-17T17:25:31.190620160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:25:31.195407 containerd[1745]: time="2025-03-17T17:25:31.195350520Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.83-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:25:31.196037 containerd[1745]: time="2025-03-17T17:25:31.196011000Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Mar 17 17:25:31.196379 containerd[1745]: time="2025-03-17T17:25:31.196357720Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198268680Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198321360Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198389440Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198401720Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198586320Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198600600Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198613840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198622440Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198692720Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.198905360Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199437 containerd[1745]: time="2025-03-17T17:25:31.199020760Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:25:31.199671 containerd[1745]: time="2025-03-17T17:25:31.199033880Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Mar 17 17:25:31.199671 containerd[1745]: time="2025-03-17T17:25:31.199105440Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Mar 17 17:25:31.199671 containerd[1745]: time="2025-03-17T17:25:31.199144080Z" level=info msg="metadata content store policy set" policy=shared
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.220503000Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.220581880Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.220600000Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.220619240Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.220657240Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.220857920Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.221149400Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.221293200Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.221309960Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.221325640Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.221338680Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.221360600Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.221372840Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Mar 17 17:25:31.222741 containerd[1745]: time="2025-03-17T17:25:31.221386680Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221403160Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221416480Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221431240Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221442560Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221476080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221490160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221502360Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221515000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221528680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221541480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221553080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221565400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221580080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223100 containerd[1745]: time="2025-03-17T17:25:31.221595280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221606800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221618400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221631200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221645680Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221670280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221683520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221693800Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221742320Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221761160Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221771760Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221783120Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221794200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221832440Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Mar 17 17:25:31.223325 containerd[1745]: time="2025-03-17T17:25:31.221844600Z" level=info msg="NRI interface is disabled by configuration."
Mar 17 17:25:31.223597 containerd[1745]: time="2025-03-17T17:25:31.221855080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Mar 17 17:25:31.223616 containerd[1745]: time="2025-03-17T17:25:31.222124400Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Mar 17 17:25:31.223616 containerd[1745]: time="2025-03-17T17:25:31.222170320Z" level=info msg="Connect containerd service"
Mar 17 17:25:31.223616 containerd[1745]: time="2025-03-17T17:25:31.222205800Z" level=info msg="using legacy CRI server"
Mar 17 17:25:31.223616 containerd[1745]: time="2025-03-17T17:25:31.222212640Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 17 17:25:31.223616 containerd[1745]: time="2025-03-17T17:25:31.222325600Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.224856240Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.225174040Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.225210120Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.225241400Z" level=info msg="Start subscribing containerd event"
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.225272840Z" level=info msg="Start recovering state"
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.225330400Z" level=info msg="Start event monitor"
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.225342160Z" level=info msg="Start snapshots syncer"
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.225353040Z" level=info msg="Start cni network conf syncer for default"
Mar 17 17:25:31.229757 containerd[1745]: time="2025-03-17T17:25:31.225360000Z" level=info msg="Start streaming server"
Mar 17 17:25:31.225497 systemd[1]: Started containerd.service - containerd container runtime.
Mar 17 17:25:31.237042 containerd[1745]: time="2025-03-17T17:25:31.236997480Z" level=info msg="containerd successfully booted in 0.131157s"
Mar 17 17:25:31.398996 tar[1729]: linux-arm64/LICENSE
Mar 17 17:25:31.399317 tar[1729]: linux-arm64/README.md
Mar 17 17:25:31.419495 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 17 17:25:31.548019 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:25:31.555661 (kubelet)[1843]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:25:31.935059 sshd_keygen[1706]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 17 17:25:31.953335 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 17 17:25:31.956051 kubelet[1843]: E0317 17:25:31.956014 1843 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:25:31.959689 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:25:31.959839 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:25:31.967150 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 17 17:25:31.974070 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Mar 17 17:25:31.981326 systemd[1]: issuegen.service: Deactivated successfully.
Mar 17 17:25:31.981612 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 17 17:25:32.000723 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 17 17:25:32.009068 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Mar 17 17:25:32.016982 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 17 17:25:32.031357 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 17 17:25:32.042394 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Mar 17 17:25:32.049601 systemd[1]: Reached target getty.target - Login Prompts.
Mar 17 17:25:32.054913 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 17 17:25:32.060670 systemd[1]: Startup finished in 670ms (kernel) + 12.210s (initrd) + 24.824s (userspace) = 37.704s.
Mar 17 17:25:32.371268 login[1872]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying
Mar 17 17:25:32.372980 login[1873]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:25:32.382273 systemd-logind[1707]: New session 1 of user core.
Mar 17 17:25:32.383093 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 17 17:25:32.391058 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 17 17:25:32.402910 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 17 17:25:32.408073 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 17 17:25:32.413549 (systemd)[1880]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 17 17:25:32.563175 systemd[1880]: Queued start job for default target default.target.
Mar 17 17:25:32.571740 systemd[1880]: Created slice app.slice - User Application Slice.
Mar 17 17:25:32.571942 systemd[1880]: Reached target paths.target - Paths.
Mar 17 17:25:32.572019 systemd[1880]: Reached target timers.target - Timers.
Mar 17 17:25:32.573269 systemd[1880]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 17 17:25:32.587594 systemd[1880]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 17 17:25:32.587718 systemd[1880]: Reached target sockets.target - Sockets.
Mar 17 17:25:32.587732 systemd[1880]: Reached target basic.target - Basic System.
Mar 17 17:25:32.587771 systemd[1880]: Reached target default.target - Main User Target.
Mar 17 17:25:32.587819 systemd[1880]: Startup finished in 168ms.
Mar 17 17:25:32.587949 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 17 17:25:32.589522 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 17 17:25:33.371649 login[1872]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:25:33.376464 systemd-logind[1707]: New session 2 of user core.
Mar 17 17:25:33.381984 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 17 17:25:33.815261 waagent[1869]: 2025-03-17T17:25:33.815114Z INFO Daemon Daemon Azure Linux Agent Version: 2.9.1.1
Mar 17 17:25:33.824966 waagent[1869]: 2025-03-17T17:25:33.820960Z INFO Daemon Daemon OS: flatcar 4152.2.2
Mar 17 17:25:33.825812 waagent[1869]: 2025-03-17T17:25:33.825734Z INFO Daemon Daemon Python: 3.11.10
Mar 17 17:25:33.832511 waagent[1869]: 2025-03-17T17:25:33.830423Z INFO Daemon Daemon Run daemon
Mar 17 17:25:33.834629 waagent[1869]: 2025-03-17T17:25:33.834575Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4152.2.2'
Mar 17 17:25:33.843339 waagent[1869]: 2025-03-17T17:25:33.843265Z INFO Daemon Daemon Using waagent for provisioning
Mar 17 17:25:33.848874 waagent[1869]: 2025-03-17T17:25:33.848819Z INFO Daemon Daemon Activate resource disk
Mar 17 17:25:33.853433 waagent[1869]: 2025-03-17T17:25:33.853375Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Mar 17 17:25:33.865975 waagent[1869]: 2025-03-17T17:25:33.865907Z INFO Daemon Daemon Found device: None
Mar 17 17:25:33.870513 waagent[1869]: 2025-03-17T17:25:33.870454Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Mar 17 17:25:33.878819 waagent[1869]: 2025-03-17T17:25:33.878755Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Mar 17 17:25:33.889591 waagent[1869]: 2025-03-17T17:25:33.889540Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Mar 17 17:25:33.895130 waagent[1869]: 2025-03-17T17:25:33.895079Z INFO Daemon Daemon Running default provisioning handler
Mar 17 17:25:33.906857 waagent[1869]: 2025-03-17T17:25:33.906287Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Mar 17 17:25:33.919946 waagent[1869]: 2025-03-17T17:25:33.919882Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Mar 17 17:25:33.929520 waagent[1869]: 2025-03-17T17:25:33.929460Z INFO Daemon Daemon cloud-init is enabled: False
Mar 17 17:25:33.934350 waagent[1869]: 2025-03-17T17:25:33.934297Z INFO Daemon Daemon Copying ovf-env.xml
Mar 17 17:25:33.989453 waagent[1869]: 2025-03-17T17:25:33.989354Z INFO Daemon Daemon Successfully mounted dvd
Mar 17 17:25:34.020483 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Mar 17 17:25:34.023847 waagent[1869]: 2025-03-17T17:25:34.023076Z INFO Daemon Daemon Detect protocol endpoint
Mar 17 17:25:34.028018 waagent[1869]: 2025-03-17T17:25:34.027956Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Mar 17 17:25:34.034139 waagent[1869]: 2025-03-17T17:25:34.034079Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Mar 17 17:25:34.040321 waagent[1869]: 2025-03-17T17:25:34.040264Z INFO Daemon Daemon Test for route to 168.63.129.16
Mar 17 17:25:34.045680 waagent[1869]: 2025-03-17T17:25:34.045626Z INFO Daemon Daemon Route to 168.63.129.16 exists
Mar 17 17:25:34.051072 waagent[1869]: 2025-03-17T17:25:34.051023Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Mar 17 17:25:34.089368 waagent[1869]: 2025-03-17T17:25:34.089272Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Mar 17 17:25:34.096131 waagent[1869]: 2025-03-17T17:25:34.096097Z INFO Daemon Daemon Wire protocol version:2012-11-30
Mar 17 17:25:34.101286 waagent[1869]: 2025-03-17T17:25:34.101226Z INFO Daemon Daemon Server preferred version:2015-04-05
Mar 17 17:25:34.238195 waagent[1869]: 2025-03-17T17:25:34.238089Z INFO Daemon Daemon Initializing goal state during protocol detection
Mar 17 17:25:34.245108 waagent[1869]: 2025-03-17T17:25:34.245040Z INFO Daemon Daemon Forcing an update of the goal state.
Mar 17 17:25:34.254483 waagent[1869]: 2025-03-17T17:25:34.254432Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Mar 17 17:25:34.321928 waagent[1869]: 2025-03-17T17:25:34.321880Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164
Mar 17 17:25:34.327558 waagent[1869]: 2025-03-17T17:25:34.327511Z INFO Daemon
Mar 17 17:25:34.330207 waagent[1869]: 2025-03-17T17:25:34.330165Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 5ba0863a-2604-4ea9-9fe7-e5bce98f7b28 eTag: 11547549994911077115 source: Fabric]
Mar 17 17:25:34.341897 waagent[1869]: 2025-03-17T17:25:34.341824Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Mar 17 17:25:34.348841 waagent[1869]: 2025-03-17T17:25:34.348776Z INFO Daemon
Mar 17 17:25:34.351498 waagent[1869]: 2025-03-17T17:25:34.351454Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Mar 17 17:25:34.362878 waagent[1869]: 2025-03-17T17:25:34.362843Z INFO Daemon Daemon Downloading artifacts profile blob
Mar 17 17:25:34.541163 waagent[1869]: 2025-03-17T17:25:34.541065Z INFO Daemon Downloaded certificate {'thumbprint': 'CF76A22CCED062FF8C39A5D3BDACD242FDD34149', 'hasPrivateKey': True}
Mar 17 17:25:34.551624 waagent[1869]: 2025-03-17T17:25:34.551573Z INFO Daemon Downloaded certificate {'thumbprint': '41518CF6F2B5C7CDB6D029B351552E19A8139D74', 'hasPrivateKey': False}
Mar 17 17:25:34.561673 waagent[1869]: 2025-03-17T17:25:34.561621Z INFO Daemon Fetch goal state completed
Mar 17 17:25:34.612207 waagent[1869]: 2025-03-17T17:25:34.612105Z INFO Daemon Daemon Starting provisioning
Mar 17 17:25:34.617598 waagent[1869]: 2025-03-17T17:25:34.617539Z INFO Daemon Daemon Handle ovf-env.xml.
Mar 17 17:25:34.622291 waagent[1869]: 2025-03-17T17:25:34.622237Z INFO Daemon Daemon Set hostname [ci-4152.2.2-a-f9f073f8c6]
Mar 17 17:25:34.650384 waagent[1869]: 2025-03-17T17:25:34.650303Z INFO Daemon Daemon Publish hostname [ci-4152.2.2-a-f9f073f8c6]
Mar 17 17:25:34.656538 waagent[1869]: 2025-03-17T17:25:34.656474Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Mar 17 17:25:34.663038 waagent[1869]: 2025-03-17T17:25:34.662979Z INFO Daemon Daemon Primary interface is [eth0]
Mar 17 17:25:34.706334 systemd-networkd[1440]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:25:34.706341 systemd-networkd[1440]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:25:34.706398 systemd-networkd[1440]: eth0: DHCP lease lost
Mar 17 17:25:34.707879 waagent[1869]: 2025-03-17T17:25:34.707775Z INFO Daemon Daemon Create user account if not exists
Mar 17 17:25:34.713915 waagent[1869]: 2025-03-17T17:25:34.713850Z INFO Daemon Daemon User core already exists, skip useradd
Mar 17 17:25:34.714919 systemd-networkd[1440]: eth0: DHCPv6 lease lost
Mar 17 17:25:34.719865 waagent[1869]: 2025-03-17T17:25:34.719775Z INFO Daemon Daemon Configure sudoer
Mar 17 17:25:34.724430 waagent[1869]: 2025-03-17T17:25:34.724372Z INFO Daemon Daemon Configure sshd
Mar 17 17:25:34.729301 waagent[1869]: 2025-03-17T17:25:34.729222Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Mar 17 17:25:34.742311 waagent[1869]: 2025-03-17T17:25:34.742240Z INFO Daemon Daemon Deploy ssh public key.
Mar 17 17:25:34.755913 systemd-networkd[1440]: eth0: DHCPv4 address 10.200.20.36/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 17 17:25:35.888510 waagent[1869]: 2025-03-17T17:25:35.888454Z INFO Daemon Daemon Provisioning complete
Mar 17 17:25:35.906162 waagent[1869]: 2025-03-17T17:25:35.906111Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Mar 17 17:25:35.912282 waagent[1869]: 2025-03-17T17:25:35.912217Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Mar 17 17:25:35.921860 waagent[1869]: 2025-03-17T17:25:35.921788Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.9.1.1 is the most current agent
Mar 17 17:25:36.056552 waagent[1935]: 2025-03-17T17:25:36.056473Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.9.1.1)
Mar 17 17:25:36.057400 waagent[1935]: 2025-03-17T17:25:36.057016Z INFO ExtHandler ExtHandler OS: flatcar 4152.2.2
Mar 17 17:25:36.057400 waagent[1935]: 2025-03-17T17:25:36.057090Z INFO ExtHandler ExtHandler Python: 3.11.10
Mar 17 17:25:36.101835 waagent[1935]: 2025-03-17T17:25:36.100908Z INFO ExtHandler ExtHandler Distro: flatcar-4152.2.2; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.10; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1;
Mar 17 17:25:36.101835 waagent[1935]: 2025-03-17T17:25:36.101186Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 17 17:25:36.101835 waagent[1935]: 2025-03-17T17:25:36.101259Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 17 17:25:36.110365 waagent[1935]: 2025-03-17T17:25:36.110298Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Mar 17 17:25:36.116328 waagent[1935]: 2025-03-17T17:25:36.116284Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164
Mar 17 17:25:36.116973 waagent[1935]: 2025-03-17T17:25:36.116926Z INFO ExtHandler
Mar 17 17:25:36.117172 waagent[1935]: 2025-03-17T17:25:36.117139Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: d8e0ad1f-8c0f-40fa-888d-4437e6b97dd6 eTag: 11547549994911077115 source: Fabric]
Mar 17 17:25:36.117589 waagent[1935]: 2025-03-17T17:25:36.117551Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Mar 17 17:25:36.118297 waagent[1935]: 2025-03-17T17:25:36.118254Z INFO ExtHandler
Mar 17 17:25:36.118438 waagent[1935]: 2025-03-17T17:25:36.118405Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Mar 17 17:25:36.122822 waagent[1935]: 2025-03-17T17:25:36.122769Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Mar 17 17:25:36.208200 waagent[1935]: 2025-03-17T17:25:36.208067Z INFO ExtHandler Downloaded certificate {'thumbprint': 'CF76A22CCED062FF8C39A5D3BDACD242FDD34149', 'hasPrivateKey': True}
Mar 17 17:25:36.208861 waagent[1935]: 2025-03-17T17:25:36.208710Z INFO ExtHandler Downloaded certificate {'thumbprint': '41518CF6F2B5C7CDB6D029B351552E19A8139D74', 'hasPrivateKey': False}
Mar 17 17:25:36.209198 waagent[1935]: 2025-03-17T17:25:36.209150Z INFO ExtHandler Fetch goal state completed
Mar 17 17:25:36.227551 waagent[1935]: 2025-03-17T17:25:36.227485Z INFO ExtHandler ExtHandler WALinuxAgent-2.9.1.1 running as process 1935
Mar 17 17:25:36.227718 waagent[1935]: 2025-03-17T17:25:36.227678Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Mar 17 17:25:36.229392 waagent[1935]: 2025-03-17T17:25:36.229341Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['flatcar', '4152.2.2', '', 'Flatcar Container Linux by Kinvolk']
Mar 17 17:25:36.229774 waagent[1935]: 2025-03-17T17:25:36.229736Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Mar 17 17:25:36.248126 waagent[1935]: 2025-03-17T17:25:36.248081Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Mar 17 17:25:36.248330 waagent[1935]: 2025-03-17T17:25:36.248290Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Mar 17 17:25:36.255036 waagent[1935]: 2025-03-17T17:25:36.254383Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Mar 17 17:25:36.260719 systemd[1]: Reloading requested from client PID 1950 ('systemctl') (unit waagent.service)...
Mar 17 17:25:36.260731 systemd[1]: Reloading...
Mar 17 17:25:36.331826 zram_generator::config[1987]: No configuration found.
Mar 17 17:25:36.436614 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:25:36.517368 systemd[1]: Reloading finished in 256 ms.
Mar 17 17:25:36.543650 waagent[1935]: 2025-03-17T17:25:36.543254Z INFO ExtHandler ExtHandler Executing systemctl daemon-reload for setting up waagent-network-setup.service
Mar 17 17:25:36.548926 systemd[1]: Reloading requested from client PID 2038 ('systemctl') (unit waagent.service)...
Mar 17 17:25:36.548949 systemd[1]: Reloading...
Mar 17 17:25:36.636888 zram_generator::config[2075]: No configuration found.
Mar 17 17:25:36.742977 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:25:36.824311 systemd[1]: Reloading finished in 275 ms.
Mar 17 17:25:36.845842 waagent[1935]: 2025-03-17T17:25:36.844127Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Mar 17 17:25:36.845842 waagent[1935]: 2025-03-17T17:25:36.844315Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Mar 17 17:25:37.799019 waagent[1935]: 2025-03-17T17:25:37.798928Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Mar 17 17:25:37.799641 waagent[1935]: 2025-03-17T17:25:37.799582Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: configuration enabled [True], cgroups enabled [False], python supported: [True]
Mar 17 17:25:37.800479 waagent[1935]: 2025-03-17T17:25:37.800391Z INFO ExtHandler ExtHandler Starting env monitor service.
Mar 17 17:25:37.800961 waagent[1935]: 2025-03-17T17:25:37.800776Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Mar 17 17:25:37.801230 waagent[1935]: 2025-03-17T17:25:37.801184Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 17 17:25:37.801832 waagent[1935]: 2025-03-17T17:25:37.801293Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 17 17:25:37.801832 waagent[1935]: 2025-03-17T17:25:37.801375Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 17 17:25:37.801832 waagent[1935]: 2025-03-17T17:25:37.801520Z INFO EnvHandler ExtHandler Configure routes
Mar 17 17:25:37.801832 waagent[1935]: 2025-03-17T17:25:37.801577Z INFO EnvHandler ExtHandler Gateway:None
Mar 17 17:25:37.801832 waagent[1935]: 2025-03-17T17:25:37.801620Z INFO EnvHandler ExtHandler Routes:None
Mar 17 17:25:37.802782 waagent[1935]: 2025-03-17T17:25:37.802284Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 17 17:25:37.802782 waagent[1935]: 2025-03-17T17:25:37.802515Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
Mar 17 17:25:37.802782 waagent[1935]: 2025-03-17T17:25:37.802699Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Mar 17 17:25:37.802782 waagent[1935]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
Mar 17 17:25:37.802782 waagent[1935]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0
Mar 17 17:25:37.802782 waagent[1935]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
Mar 17 17:25:37.802782 waagent[1935]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
Mar 17 17:25:37.802782 waagent[1935]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Mar 17 17:25:37.802782 waagent[1935]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Mar 17 17:25:37.803161 waagent[1935]: 2025-03-17T17:25:37.803109Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
Mar 17 17:25:37.803343 waagent[1935]: 2025-03-17T17:25:37.803305Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
Mar 17 17:25:37.803685 waagent[1935]: 2025-03-17T17:25:37.803658Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
Mar 17 17:25:37.804455 waagent[1935]: 2025-03-17T17:25:37.803611Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
Mar 17 17:25:37.804650 waagent[1935]: 2025-03-17T17:25:37.804608Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Mar 17 17:25:37.811781 waagent[1935]: 2025-03-17T17:25:37.811727Z INFO ExtHandler ExtHandler
Mar 17 17:25:37.812222 waagent[1935]: 2025-03-17T17:25:37.811982Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 5ad2c1dc-f6f9-41c9-ab5a-9d009d0deb13 correlation ca5a42e8-6ea7-4019-b523-8d797eb5626b created: 2025-03-17T17:24:05.494759Z]
Mar 17 17:25:37.812598 waagent[1935]: 2025-03-17T17:25:37.812551Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
Mar 17 17:25:37.813304 waagent[1935]: 2025-03-17T17:25:37.813264Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms]
Mar 17 17:25:37.853658 waagent[1935]: 2025-03-17T17:25:37.853601Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.9.1.1 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: C7DCB5B4-A338-47A4-A2BC-F14704BDE022;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 0]
Mar 17 17:25:37.876731 waagent[1935]: 2025-03-17T17:25:37.876607Z INFO EnvHandler ExtHandler Successfully added Azure fabric firewall rules. Current Firewall rules:
Mar 17 17:25:37.876731 waagent[1935]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:25:37.876731 waagent[1935]: pkts bytes target prot opt in out source destination
Mar 17 17:25:37.876731 waagent[1935]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:25:37.876731 waagent[1935]: pkts bytes target prot opt in out source destination
Mar 17 17:25:37.876731 waagent[1935]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:25:37.876731 waagent[1935]: pkts bytes target prot opt in out source destination
Mar 17 17:25:37.876731 waagent[1935]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Mar 17 17:25:37.876731 waagent[1935]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Mar 17 17:25:37.876731 waagent[1935]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Mar 17 17:25:37.878086 waagent[1935]: 2025-03-17T17:25:37.877962Z INFO MonitorHandler ExtHandler Network interfaces:
Mar 17 17:25:37.878086 waagent[1935]: Executing ['ip', '-a', '-o', 'link']:
Mar 17 17:25:37.878086 waagent[1935]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Mar 17 17:25:37.878086 waagent[1935]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:c2:2c:2e brd ff:ff:ff:ff:ff:ff
Mar 17 17:25:37.878086 waagent[1935]: 3: enP40636s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:c2:2c:2e brd ff:ff:ff:ff:ff:ff\ altname enP40636p0s2
Mar 17 17:25:37.878086 waagent[1935]: Executing ['ip', '-4', '-a', '-o', 'address']:
Mar 17 17:25:37.878086 waagent[1935]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
Mar 17 17:25:37.878086 waagent[1935]: 2: eth0 inet 10.200.20.36/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever
Mar 17 17:25:37.878086 waagent[1935]: Executing ['ip', '-6', '-a', '-o', 'address']:
Mar 17 17:25:37.878086 waagent[1935]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
Mar 17 17:25:37.878086 waagent[1935]: 2: eth0 inet6 fe80::222:48ff:fec2:2c2e/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Mar 17 17:25:37.878086 waagent[1935]: 3: enP40636s1 inet6 fe80::222:48ff:fec2:2c2e/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Mar 17 17:25:37.913085 waagent[1935]: 2025-03-17T17:25:37.913005Z INFO EnvHandler ExtHandler Current Firewall rules:
Mar 17 17:25:37.913085 waagent[1935]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:25:37.913085 waagent[1935]: pkts bytes target prot opt in out source destination
Mar 17 17:25:37.913085 waagent[1935]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:25:37.913085 waagent[1935]: pkts bytes target prot opt in out source destination
Mar 17 17:25:37.913085 waagent[1935]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:25:37.913085 waagent[1935]: pkts bytes target prot opt in out source destination
Mar 17 17:25:37.913085 waagent[1935]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Mar 17 17:25:37.913085 waagent[1935]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Mar 17 17:25:37.913085 waagent[1935]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Mar 17 17:25:37.913337 waagent[1935]: 2025-03-17T17:25:37.913303Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
Mar 17 17:25:42.210533 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:25:42.217993 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:25:42.320516 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:25:42.330324 (kubelet)[2166]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:25:42.367298 kubelet[2166]: E0317 17:25:42.367218 2166 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:25:42.370004 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:25:42.370154 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:25:52.603566 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 17 17:25:52.611985 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:25:52.859019 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:25:52.863060 (kubelet)[2181]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:25:52.900416 kubelet[2181]: E0317 17:25:52.900357 2181 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:25:52.902930 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:25:52.903193 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:25:54.046246 chronyd[1688]: Selected source PHC0
Mar 17 17:26:03.103575 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 17 17:26:03.113001 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:26:03.551655 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:26:03.563203 (kubelet)[2197]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:26:03.598019 kubelet[2197]: E0317 17:26:03.597964 2197 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:26:03.600455 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:26:03.600737 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:26:12.136644 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Mar 17 17:26:12.900676 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 17 17:26:12.902398 systemd[1]: Started sshd@0-10.200.20.36:22-10.200.16.10:46886.service - OpenSSH per-connection server daemon (10.200.16.10:46886).
Mar 17 17:26:13.465166 sshd[2205]: Accepted publickey for core from 10.200.16.10 port 46886 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:26:13.466473 sshd-session[2205]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:26:13.470306 systemd-logind[1707]: New session 3 of user core.
Mar 17 17:26:13.480968 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 17 17:26:13.603376 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 17 17:26:13.612992 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:26:13.863082 systemd[1]: Started sshd@1-10.200.20.36:22-10.200.16.10:46890.service - OpenSSH per-connection server daemon (10.200.16.10:46890).
Mar 17 17:26:13.912275 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:26:13.916442 (kubelet)[2220]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:26:13.958125 kubelet[2220]: E0317 17:26:13.958063 2220 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:26:13.960255 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:26:13.960400 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:26:14.301288 sshd[2213]: Accepted publickey for core from 10.200.16.10 port 46890 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:26:14.302562 sshd-session[2213]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:26:14.306925 systemd-logind[1707]: New session 4 of user core.
Mar 17 17:26:14.317946 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 17 17:26:14.623404 sshd[2227]: Connection closed by 10.200.16.10 port 46890
Mar 17 17:26:14.623966 sshd-session[2213]: pam_unix(sshd:session): session closed for user core
Mar 17 17:26:14.627258 systemd[1]: sshd@1-10.200.20.36:22-10.200.16.10:46890.service: Deactivated successfully.
Mar 17 17:26:14.628714 systemd[1]: session-4.scope: Deactivated successfully.
Mar 17 17:26:14.629322 systemd-logind[1707]: Session 4 logged out. Waiting for processes to exit.
Mar 17 17:26:14.630405 systemd-logind[1707]: Removed session 4.
Mar 17 17:26:14.712148 systemd[1]: Started sshd@2-10.200.20.36:22-10.200.16.10:46892.service - OpenSSH per-connection server daemon (10.200.16.10:46892).
Mar 17 17:26:15.189247 sshd[2232]: Accepted publickey for core from 10.200.16.10 port 46892 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:26:15.190555 sshd-session[2232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:26:15.194277 systemd-logind[1707]: New session 5 of user core.
Mar 17 17:26:15.201991 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 17 17:26:15.356835 update_engine[1710]: I20250317 17:26:15.356244 1710 update_attempter.cc:509] Updating boot flags...
Mar 17 17:26:15.408845 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2250)
Mar 17 17:26:15.526954 sshd[2234]: Connection closed by 10.200.16.10 port 46892
Mar 17 17:26:15.527455 sshd-session[2232]: pam_unix(sshd:session): session closed for user core
Mar 17 17:26:15.530945 systemd[1]: sshd@2-10.200.20.36:22-10.200.16.10:46892.service: Deactivated successfully.
Mar 17 17:26:15.534354 systemd[1]: session-5.scope: Deactivated successfully.
Mar 17 17:26:15.535003 systemd-logind[1707]: Session 5 logged out. Waiting for processes to exit.
Mar 17 17:26:15.535787 systemd-logind[1707]: Removed session 5.
Mar 17 17:26:15.609183 systemd[1]: Started sshd@3-10.200.20.36:22-10.200.16.10:46896.service - OpenSSH per-connection server daemon (10.200.16.10:46896).
Mar 17 17:26:16.079375 sshd[2302]: Accepted publickey for core from 10.200.16.10 port 46896 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:26:16.080665 sshd-session[2302]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:26:16.084237 systemd-logind[1707]: New session 6 of user core.
Mar 17 17:26:16.093955 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 17 17:26:16.413714 sshd[2304]: Connection closed by 10.200.16.10 port 46896
Mar 17 17:26:16.414249 sshd-session[2302]: pam_unix(sshd:session): session closed for user core
Mar 17 17:26:16.417457 systemd[1]: sshd@3-10.200.20.36:22-10.200.16.10:46896.service: Deactivated successfully.
Mar 17 17:26:16.419005 systemd[1]: session-6.scope: Deactivated successfully.
Mar 17 17:26:16.419724 systemd-logind[1707]: Session 6 logged out. Waiting for processes to exit.
Mar 17 17:26:16.420738 systemd-logind[1707]: Removed session 6.
Mar 17 17:26:16.509063 systemd[1]: Started sshd@4-10.200.20.36:22-10.200.16.10:46910.service - OpenSSH per-connection server daemon (10.200.16.10:46910).
Mar 17 17:26:16.985628 sshd[2309]: Accepted publickey for core from 10.200.16.10 port 46910 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:26:16.986949 sshd-session[2309]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:26:16.991581 systemd-logind[1707]: New session 7 of user core.
Mar 17 17:26:16.996987 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 17 17:26:17.420890 sudo[2312]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 17 17:26:17.421166 sudo[2312]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:26:17.435669 sudo[2312]: pam_unix(sudo:session): session closed for user root
Mar 17 17:26:17.526395 sshd[2311]: Connection closed by 10.200.16.10 port 46910
Mar 17 17:26:17.527117 sshd-session[2309]: pam_unix(sshd:session): session closed for user core
Mar 17 17:26:17.530550 systemd[1]: sshd@4-10.200.20.36:22-10.200.16.10:46910.service: Deactivated successfully.
Mar 17 17:26:17.532102 systemd[1]: session-7.scope: Deactivated successfully.
Mar 17 17:26:17.532766 systemd-logind[1707]: Session 7 logged out. Waiting for processes to exit.
Mar 17 17:26:17.533643 systemd-logind[1707]: Removed session 7.
Mar 17 17:26:17.602352 systemd[1]: Started sshd@5-10.200.20.36:22-10.200.16.10:46912.service - OpenSSH per-connection server daemon (10.200.16.10:46912).
Mar 17 17:26:18.034169 sshd[2317]: Accepted publickey for core from 10.200.16.10 port 46912 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:26:18.035496 sshd-session[2317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:26:18.039225 systemd-logind[1707]: New session 8 of user core.
Mar 17 17:26:18.046943 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 17 17:26:18.277481 sudo[2321]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 17 17:26:18.277738 sudo[2321]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:26:18.280853 sudo[2321]: pam_unix(sudo:session): session closed for user root
Mar 17 17:26:18.285419 sudo[2320]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 17 17:26:18.285679 sudo[2320]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:26:18.298397 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:26:18.320731 augenrules[2343]: No rules
Mar 17 17:26:18.321911 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:26:18.322084 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:26:18.323428 sudo[2320]: pam_unix(sudo:session): session closed for user root
Mar 17 17:26:18.402837 sshd[2319]: Connection closed by 10.200.16.10 port 46912
Mar 17 17:26:18.403321 sshd-session[2317]: pam_unix(sshd:session): session closed for user core
Mar 17 17:26:18.406955 systemd[1]: sshd@5-10.200.20.36:22-10.200.16.10:46912.service: Deactivated successfully.
Mar 17 17:26:18.408523 systemd[1]: session-8.scope: Deactivated successfully.
Mar 17 17:26:18.409180 systemd-logind[1707]: Session 8 logged out. Waiting for processes to exit.
Mar 17 17:26:18.410009 systemd-logind[1707]: Removed session 8.
Mar 17 17:26:18.483169 systemd[1]: Started sshd@6-10.200.20.36:22-10.200.16.10:43262.service - OpenSSH per-connection server daemon (10.200.16.10:43262).
Mar 17 17:26:18.923026 sshd[2351]: Accepted publickey for core from 10.200.16.10 port 43262 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:26:18.924302 sshd-session[2351]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:26:18.929017 systemd-logind[1707]: New session 9 of user core.
Mar 17 17:26:18.931994 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 17 17:26:19.170999 sudo[2354]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 17 17:26:19.171659 sudo[2354]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:26:20.549043 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 17 17:26:20.549554 (dockerd)[2373]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 17 17:26:21.216047 dockerd[2373]: time="2025-03-17T17:26:21.215992641Z" level=info msg="Starting up"
Mar 17 17:26:21.563507 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport958984866-merged.mount: Deactivated successfully.
Mar 17 17:26:21.600231 dockerd[2373]: time="2025-03-17T17:26:21.600184296Z" level=info msg="Loading containers: start."
Mar 17 17:26:21.775429 kernel: Initializing XFRM netlink socket
Mar 17 17:26:21.851332 systemd-networkd[1440]: docker0: Link UP
Mar 17 17:26:21.887893 dockerd[2373]: time="2025-03-17T17:26:21.887846167Z" level=info msg="Loading containers: done."
Mar 17 17:26:21.899458 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2614304813-merged.mount: Deactivated successfully.
Mar 17 17:26:21.911340 dockerd[2373]: time="2025-03-17T17:26:21.911291942Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 17 17:26:21.911470 dockerd[2373]: time="2025-03-17T17:26:21.911395582Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1
Mar 17 17:26:21.911533 dockerd[2373]: time="2025-03-17T17:26:21.911509902Z" level=info msg="Daemon has completed initialization"
Mar 17 17:26:21.974440 dockerd[2373]: time="2025-03-17T17:26:21.974341409Z" level=info msg="API listen on /run/docker.sock"
Mar 17 17:26:21.974824 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 17 17:26:22.934034 containerd[1745]: time="2025-03-17T17:26:22.933948406Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\""
Mar 17 17:26:23.851087 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2926227507.mount: Deactivated successfully.
Mar 17 17:26:24.103388 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Mar 17 17:26:24.109017 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:26:24.225516 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:26:24.230257 (kubelet)[2575]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:26:24.264640 kubelet[2575]: E0317 17:26:24.264553 2575 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:26:24.267073 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:26:24.267351 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:26:25.636103 containerd[1745]: time="2025-03-17T17:26:25.636055626Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:25.643871 containerd[1745]: time="2025-03-17T17:26:25.643816804Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.7: active requests=0, bytes read=25552766"
Mar 17 17:26:25.646991 containerd[1745]: time="2025-03-17T17:26:25.646934131Z" level=info msg="ImageCreate event name:\"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:25.652318 containerd[1745]: time="2025-03-17T17:26:25.652251623Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:25.653969 containerd[1745]: time="2025-03-17T17:26:25.653655347Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.7\" with image id \"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\", size \"25549566\" in 2.719663741s"
Mar 17 17:26:25.653969 containerd[1745]: time="2025-03-17T17:26:25.653705787Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\" returns image reference \"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\""
Mar 17 17:26:25.656792 containerd[1745]: time="2025-03-17T17:26:25.656556513Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\""
Mar 17 17:26:27.009531 containerd[1745]: time="2025-03-17T17:26:27.009481011Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:27.015185 containerd[1745]: time="2025-03-17T17:26:27.015138185Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.7: active requests=0, bytes read=22458978"
Mar 17 17:26:27.020996 containerd[1745]: time="2025-03-17T17:26:27.020887800Z" level=info msg="ImageCreate event name:\"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:27.028360 containerd[1745]: time="2025-03-17T17:26:27.028300539Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:27.029492 containerd[1745]: time="2025-03-17T17:26:27.029346862Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.7\" with image id \"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\", size \"23899774\" in 1.372741549s"
Mar 17 17:26:27.029492 containerd[1745]: time="2025-03-17T17:26:27.029390622Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\" returns image reference \"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\""
Mar 17 17:26:27.030401 containerd[1745]: time="2025-03-17T17:26:27.030229144Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\""
Mar 17 17:26:28.561020 containerd[1745]: time="2025-03-17T17:26:28.560960245Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:28.563983 containerd[1745]: time="2025-03-17T17:26:28.563749172Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.7: active requests=0, bytes read=17125829"
Mar 17 17:26:28.566879 containerd[1745]: time="2025-03-17T17:26:28.566849740Z" level=info msg="ImageCreate event name:\"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:28.572827 containerd[1745]: time="2025-03-17T17:26:28.572775395Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:28.574107 containerd[1745]: time="2025-03-17T17:26:28.573977359Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.7\" with image id \"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\", size \"18566643\" in 1.543718495s"
Mar 17 17:26:28.574107 containerd[1745]: time="2025-03-17T17:26:28.574009279Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\" returns image reference \"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\""
Mar 17 17:26:28.574882 containerd[1745]: time="2025-03-17T17:26:28.574854041Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\""
Mar 17 17:26:29.719943 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3409126423.mount: Deactivated successfully.
Mar 17 17:26:30.057428 containerd[1745]: time="2025-03-17T17:26:30.056653536Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:30.060052 containerd[1745]: time="2025-03-17T17:26:30.059983224Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.7: active requests=0, bytes read=26871915"
Mar 17 17:26:30.063257 containerd[1745]: time="2025-03-17T17:26:30.063209552Z" level=info msg="ImageCreate event name:\"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:30.070671 containerd[1745]: time="2025-03-17T17:26:30.070637252Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:30.071690 containerd[1745]: time="2025-03-17T17:26:30.071270253Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.7\" with image id \"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\", repo tag \"registry.k8s.io/kube-proxy:v1.31.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\", size \"26870934\" in 1.496381732s"
Mar 17 17:26:30.071690 containerd[1745]: time="2025-03-17T17:26:30.071303013Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\" returns image reference \"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\""
Mar 17 17:26:30.071814 containerd[1745]: time="2025-03-17T17:26:30.071763934Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 17 17:26:30.725867 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1944637511.mount: Deactivated successfully.
Mar 17 17:26:34.353425 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Mar 17 17:26:34.362996 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:26:37.002989 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:26:37.006942 (kubelet)[2659]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:26:37.041318 kubelet[2659]: E0317 17:26:37.041260 2659 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:26:37.043604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:26:37.043760 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:26:43.684842 containerd[1745]: time="2025-03-17T17:26:43.684634133Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:43.727575 containerd[1745]: time="2025-03-17T17:26:43.727289876Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485381"
Mar 17 17:26:43.770837 containerd[1745]: time="2025-03-17T17:26:43.770732781Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:43.819225 containerd[1745]: time="2025-03-17T17:26:43.819161498Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:43.820624 containerd[1745]: time="2025-03-17T17:26:43.820291821Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 13.748500447s"
Mar 17 17:26:43.820624 containerd[1745]: time="2025-03-17T17:26:43.820327381Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\""
Mar 17 17:26:43.821387 containerd[1745]: time="2025-03-17T17:26:43.821349623Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 17 17:26:44.840538 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount167591603.mount: Deactivated successfully.
Mar 17 17:26:45.572836 containerd[1745]: time="2025-03-17T17:26:45.572533980Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:45.577500 containerd[1745]: time="2025-03-17T17:26:45.577219832Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Mar 17 17:26:45.616378 containerd[1745]: time="2025-03-17T17:26:45.616322446Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:45.677052 containerd[1745]: time="2025-03-17T17:26:45.676993713Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:26:45.678105 containerd[1745]: time="2025-03-17T17:26:45.677744555Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 1.856266211s"
Mar 17 17:26:45.678105 containerd[1745]: time="2025-03-17T17:26:45.677776795Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Mar 17 17:26:45.678643 containerd[1745]: time="2025-03-17T17:26:45.678460956Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Mar 17 17:26:47.103434 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Mar 17 17:26:47.115052 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:26:47.143511 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3113758290.mount: Deactivated successfully.
Mar 17 17:26:47.204594 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:26:47.208795 (kubelet)[2718]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:26:47.241321 kubelet[2718]: E0317 17:26:47.241266 2718 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:26:47.243098 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:26:47.243218 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:26:57.353655 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
Mar 17 17:26:57.364002 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:27:00.992034 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:27:00.995208 (kubelet)[2744]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:27:01.032344 kubelet[2744]: E0317 17:27:01.032281 2744 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:27:01.034837 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:27:01.035113 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:27:07.194846 containerd[1745]: time="2025-03-17T17:27:07.194630686Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:27:07.204779 containerd[1745]: time="2025-03-17T17:27:07.204361427Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406425"
Mar 17 17:27:07.216211 containerd[1745]: time="2025-03-17T17:27:07.216147093Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:27:07.224183 containerd[1745]: time="2025-03-17T17:27:07.223150308Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:27:07.224183 containerd[1745]: time="2025-03-17T17:27:07.223952950Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 21.545446913s"
Mar 17 17:27:07.224183 containerd[1745]: time="2025-03-17T17:27:07.223981710Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Mar 17 17:27:11.104371 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9.
Mar 17 17:27:11.112127 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:27:11.428980 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:27:11.431417 (kubelet)[2817]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:27:11.470360 kubelet[2817]: E0317 17:27:11.466808 2817 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:27:11.473216 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:27:11.473523 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:27:13.886465 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:27:13.893466 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:27:13.922450 systemd[1]: Reloading requested from client PID 2831 ('systemctl') (unit session-9.scope)...
Mar 17 17:27:13.922598 systemd[1]: Reloading...
Mar 17 17:27:14.037151 zram_generator::config[2871]: No configuration found.
Mar 17 17:27:14.138394 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:27:14.218198 systemd[1]: Reloading finished in 295 ms.
Mar 17 17:27:14.254242 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:27:14.257553 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:27:14.260373 systemd[1]: kubelet.service: Deactivated successfully.
Mar 17 17:27:14.260710 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:27:14.266134 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:27:14.551796 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:27:14.564198 (kubelet)[2940]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 17 17:27:14.600841 kubelet[2940]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:27:14.600841 kubelet[2940]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 17 17:27:14.600841 kubelet[2940]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:27:14.601189 kubelet[2940]: I0317 17:27:14.600841 2940 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 17 17:27:15.399062 kubelet[2940]: I0317 17:27:15.399018 2940 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 17 17:27:15.399062 kubelet[2940]: I0317 17:27:15.399053 2940 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 17 17:27:15.399299 kubelet[2940]: I0317 17:27:15.399279 2940 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 17 17:27:15.417675 kubelet[2940]: E0317 17:27:15.417630 2940 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.36:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:27:15.418701 kubelet[2940]: I0317 17:27:15.418670 2940 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 17 17:27:15.424403 kubelet[2940]: E0317 17:27:15.424361 2940 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 17 17:27:15.424403 kubelet[2940]: I0317 17:27:15.424397 2940 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Mar 17 17:27:15.428201 kubelet[2940]: I0317 17:27:15.428175 2940 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 17 17:27:15.428932 kubelet[2940]: I0317 17:27:15.428910 2940 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 17 17:27:15.429086 kubelet[2940]: I0317 17:27:15.429055 2940 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 17 17:27:15.429266 kubelet[2940]: I0317 17:27:15.429087 2940 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4152.2.2-a-f9f073f8c6","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Mar 17 17:27:15.429348 kubelet[2940]: I0317 17:27:15.429275 2940 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 17:27:15.429348 kubelet[2940]: I0317 17:27:15.429286 2940 container_manager_linux.go:300] "Creating device plugin manager" Mar 17 17:27:15.429429 kubelet[2940]: I0317 17:27:15.429411 2940 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:27:15.431425 kubelet[2940]: I0317 17:27:15.431116 2940 kubelet.go:408] "Attempting to sync node with API server" Mar 17 17:27:15.431425 kubelet[2940]: I0317 17:27:15.431145 2940 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:27:15.431425 kubelet[2940]: I0317 17:27:15.431171 2940 kubelet.go:314] "Adding apiserver pod source" Mar 17 17:27:15.431425 kubelet[2940]: I0317 17:27:15.431182 2940 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:27:15.435103 kubelet[2940]: W0317 17:27:15.434796 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-f9f073f8c6&limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:15.435103 kubelet[2940]: E0317 17:27:15.434881 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-f9f073f8c6&limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:15.435919 kubelet[2940]: I0317 17:27:15.435882 2940 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:27:15.437708 kubelet[2940]: I0317 17:27:15.437601 2940 kubelet.go:837] "Not starting ClusterTrustBundle 
informer because we are in static kubelet mode" Mar 17 17:27:15.438241 kubelet[2940]: W0317 17:27:15.438196 2940 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 17 17:27:15.440887 kubelet[2940]: W0317 17:27:15.440690 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:15.440887 kubelet[2940]: E0317 17:27:15.440768 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:15.444347 kubelet[2940]: I0317 17:27:15.443938 2940 server.go:1269] "Started kubelet" Mar 17 17:27:15.445082 kubelet[2940]: I0317 17:27:15.445058 2940 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:27:15.448410 kubelet[2940]: E0317 17:27:15.447390 2940 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.36:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.36:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4152.2.2-a-f9f073f8c6.182da72b56cc49a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4152.2.2-a-f9f073f8c6,UID:ci-4152.2.2-a-f9f073f8c6,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4152.2.2-a-f9f073f8c6,},FirstTimestamp:2025-03-17 17:27:15.443911072 +0000 UTC m=+0.876647530,LastTimestamp:2025-03-17 17:27:15.443911072 +0000 UTC 
m=+0.876647530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4152.2.2-a-f9f073f8c6,}" Mar 17 17:27:15.449830 kubelet[2940]: I0317 17:27:15.449761 2940 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:27:15.450884 kubelet[2940]: I0317 17:27:15.450865 2940 server.go:460] "Adding debug handlers to kubelet server" Mar 17 17:27:15.451830 kubelet[2940]: I0317 17:27:15.451187 2940 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 17 17:27:15.451830 kubelet[2940]: E0317 17:27:15.451385 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:15.452015 kubelet[2940]: I0317 17:27:15.451967 2940 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:27:15.452290 kubelet[2940]: I0317 17:27:15.452274 2940 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 17:27:15.452582 kubelet[2940]: I0317 17:27:15.452565 2940 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 17 17:27:15.453047 kubelet[2940]: E0317 17:27:15.453019 2940 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-f9f073f8c6?timeout=10s\": dial tcp 10.200.20.36:6443: connect: connection refused" interval="200ms" Mar 17 17:27:15.453844 kubelet[2940]: I0317 17:27:15.453797 2940 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:27:15.454008 kubelet[2940]: I0317 17:27:15.453996 2940 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 17 17:27:15.454540 kubelet[2940]: W0317 17:27:15.454503 2940 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:15.454671 kubelet[2940]: E0317 17:27:15.454636 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:15.457443 kubelet[2940]: I0317 17:27:15.457420 2940 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:27:15.457571 kubelet[2940]: I0317 17:27:15.457562 2940 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:27:15.457695 kubelet[2940]: I0317 17:27:15.457680 2940 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:27:15.463348 kubelet[2940]: E0317 17:27:15.463319 2940 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:27:15.480581 kubelet[2940]: I0317 17:27:15.480438 2940 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:27:15.482081 kubelet[2940]: I0317 17:27:15.482058 2940 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 17:27:15.482191 kubelet[2940]: I0317 17:27:15.482182 2940 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:27:15.482254 kubelet[2940]: I0317 17:27:15.482246 2940 kubelet.go:2321] "Starting kubelet main sync loop" Mar 17 17:27:15.482539 kubelet[2940]: E0317 17:27:15.482326 2940 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:27:15.485003 kubelet[2940]: W0317 17:27:15.484964 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:15.485090 kubelet[2940]: E0317 17:27:15.485007 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:15.498320 kubelet[2940]: I0317 17:27:15.498015 2940 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 17:27:15.498320 kubelet[2940]: I0317 17:27:15.498037 2940 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:27:15.498320 kubelet[2940]: I0317 17:27:15.498058 2940 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:27:15.506689 kubelet[2940]: I0317 17:27:15.506658 2940 policy_none.go:49] "None policy: Start" Mar 17 17:27:15.507632 kubelet[2940]: I0317 17:27:15.507329 2940 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:27:15.507632 kubelet[2940]: I0317 17:27:15.507357 2940 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:27:15.518125 systemd[1]: Created slice kubepods.slice - 
libcontainer container kubepods.slice. Mar 17 17:27:15.533969 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 17 17:27:15.536846 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 17 17:27:15.547557 kubelet[2940]: I0317 17:27:15.547525 2940 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:27:15.547789 kubelet[2940]: I0317 17:27:15.547749 2940 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 17 17:27:15.547789 kubelet[2940]: I0317 17:27:15.547760 2940 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:27:15.548209 kubelet[2940]: I0317 17:27:15.548181 2940 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:27:15.550002 kubelet[2940]: E0317 17:27:15.549973 2940 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:15.594021 systemd[1]: Created slice kubepods-burstable-podaa7211033565ff63381058a6f57992d1.slice - libcontainer container kubepods-burstable-podaa7211033565ff63381058a6f57992d1.slice. Mar 17 17:27:15.604575 systemd[1]: Created slice kubepods-burstable-pode16d9001deeba9da6265de6431bbca74.slice - libcontainer container kubepods-burstable-pode16d9001deeba9da6265de6431bbca74.slice. Mar 17 17:27:15.615155 systemd[1]: Created slice kubepods-burstable-podb966c0f571028ce266df8ec03bac8517.slice - libcontainer container kubepods-burstable-podb966c0f571028ce266df8ec03bac8517.slice. 
Mar 17 17:27:15.650057 kubelet[2940]: I0317 17:27:15.649599 2940 kubelet_node_status.go:72] "Attempting to register node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.650057 kubelet[2940]: E0317 17:27:15.649955 2940 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.36:6443/api/v1/nodes\": dial tcp 10.200.20.36:6443: connect: connection refused" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.655824 kubelet[2940]: E0317 17:27:15.654115 2940 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-f9f073f8c6?timeout=10s\": dial tcp 10.200.20.36:6443: connect: connection refused" interval="400ms" Mar 17 17:27:15.657188 kubelet[2940]: I0317 17:27:15.657163 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aa7211033565ff63381058a6f57992d1-ca-certs\") pod \"kube-apiserver-ci-4152.2.2-a-f9f073f8c6\" (UID: \"aa7211033565ff63381058a6f57992d1\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.657310 kubelet[2940]: I0317 17:27:15.657296 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-ca-certs\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.657383 kubelet[2940]: I0317 17:27:15.657372 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-flexvolume-dir\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " 
pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.657450 kubelet[2940]: I0317 17:27:15.657438 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-k8s-certs\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.657523 kubelet[2940]: I0317 17:27:15.657511 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-kubeconfig\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.657592 kubelet[2940]: I0317 17:27:15.657575 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.657659 kubelet[2940]: I0317 17:27:15.657646 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b966c0f571028ce266df8ec03bac8517-kubeconfig\") pod \"kube-scheduler-ci-4152.2.2-a-f9f073f8c6\" (UID: \"b966c0f571028ce266df8ec03bac8517\") " pod="kube-system/kube-scheduler-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.657730 kubelet[2940]: I0317 17:27:15.657718 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/aa7211033565ff63381058a6f57992d1-k8s-certs\") pod \"kube-apiserver-ci-4152.2.2-a-f9f073f8c6\" (UID: \"aa7211033565ff63381058a6f57992d1\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.657823 kubelet[2940]: I0317 17:27:15.657791 2940 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aa7211033565ff63381058a6f57992d1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4152.2.2-a-f9f073f8c6\" (UID: \"aa7211033565ff63381058a6f57992d1\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.852209 kubelet[2940]: I0317 17:27:15.852180 2940 kubelet_node_status.go:72] "Attempting to register node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.852512 kubelet[2940]: E0317 17:27:15.852472 2940 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.36:6443/api/v1/nodes\": dial tcp 10.200.20.36:6443: connect: connection refused" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:15.902930 containerd[1745]: time="2025-03-17T17:27:15.902694684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4152.2.2-a-f9f073f8c6,Uid:aa7211033565ff63381058a6f57992d1,Namespace:kube-system,Attempt:0,}" Mar 17 17:27:15.914532 containerd[1745]: time="2025-03-17T17:27:15.914437711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4152.2.2-a-f9f073f8c6,Uid:e16d9001deeba9da6265de6431bbca74,Namespace:kube-system,Attempt:0,}" Mar 17 17:27:15.918044 containerd[1745]: time="2025-03-17T17:27:15.918004959Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4152.2.2-a-f9f073f8c6,Uid:b966c0f571028ce266df8ec03bac8517,Namespace:kube-system,Attempt:0,}" Mar 17 17:27:16.055076 kubelet[2940]: E0317 17:27:16.055026 2940 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://10.200.20.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-f9f073f8c6?timeout=10s\": dial tcp 10.200.20.36:6443: connect: connection refused" interval="800ms" Mar 17 17:27:16.254452 kubelet[2940]: I0317 17:27:16.254350 2940 kubelet_node_status.go:72] "Attempting to register node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:16.255098 kubelet[2940]: E0317 17:27:16.255068 2940 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.36:6443/api/v1/nodes\": dial tcp 10.200.20.36:6443: connect: connection refused" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:16.369064 kubelet[2940]: W0317 17:27:16.369003 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:16.369198 kubelet[2940]: E0317 17:27:16.369073 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:16.473161 kubelet[2940]: W0317 17:27:16.473058 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-f9f073f8c6&limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:16.473161 kubelet[2940]: E0317 17:27:16.473132 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://10.200.20.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-f9f073f8c6&limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:16.501817 kubelet[2940]: W0317 17:27:16.501693 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:16.501817 kubelet[2940]: E0317 17:27:16.501763 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:16.855730 kubelet[2940]: E0317 17:27:16.855684 2940 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-f9f073f8c6?timeout=10s\": dial tcp 10.200.20.36:6443: connect: connection refused" interval="1.6s" Mar 17 17:27:16.900327 kubelet[2940]: W0317 17:27:16.900227 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:16.900327 kubelet[2940]: E0317 17:27:16.900295 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:17.056931 kubelet[2940]: I0317 
17:27:17.056900 2940 kubelet_node_status.go:72] "Attempting to register node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:17.057245 kubelet[2940]: E0317 17:27:17.057210 2940 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.36:6443/api/v1/nodes\": dial tcp 10.200.20.36:6443: connect: connection refused" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:17.542409 kubelet[2940]: E0317 17:27:17.542359 2940 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.36:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:18.127916 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2515945171.mount: Deactivated successfully. Mar 17 17:27:18.377776 containerd[1745]: time="2025-03-17T17:27:18.377723717Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:27:18.456896 kubelet[2940]: E0317 17:27:18.456770 2940 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-f9f073f8c6?timeout=10s\": dial tcp 10.200.20.36:6443: connect: connection refused" interval="3.2s" Mar 17 17:27:18.473159 containerd[1745]: time="2025-03-17T17:27:18.473091295Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173" Mar 17 17:27:18.520382 containerd[1745]: time="2025-03-17T17:27:18.520334884Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" 
value:\"pinned\"}" Mar 17 17:27:18.573375 containerd[1745]: time="2025-03-17T17:27:18.573201565Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:27:18.633316 containerd[1745]: time="2025-03-17T17:27:18.633244702Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:27:18.659334 kubelet[2940]: I0317 17:27:18.659291 2940 kubelet_node_status.go:72] "Attempting to register node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:18.659658 kubelet[2940]: E0317 17:27:18.659627 2940 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.36:6443/api/v1/nodes\": dial tcp 10.200.20.36:6443: connect: connection refused" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:18.724792 containerd[1745]: time="2025-03-17T17:27:18.723968030Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:27:19.047784 kubelet[2940]: W0317 17:27:19.047671 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-f9f073f8c6&limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:19.047784 kubelet[2940]: E0317 17:27:19.047721 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-f9f073f8c6&limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:19.372958 kubelet[2940]: W0317 
17:27:19.372919 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:19.373113 kubelet[2940]: E0317 17:27:19.372965 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:19.514170 kubelet[2940]: W0317 17:27:19.514134 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:19.514544 kubelet[2940]: E0317 17:27:19.514176 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:19.635901 kubelet[2940]: W0317 17:27:19.635759 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:19.635901 kubelet[2940]: E0317 17:27:19.635820 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://10.200.20.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:21.653468 kubelet[2940]: E0317 17:27:21.653426 2940 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.36:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:21.657914 kubelet[2940]: E0317 17:27:21.657872 2940 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-f9f073f8c6?timeout=10s\": dial tcp 10.200.20.36:6443: connect: connection refused" interval="6.4s" Mar 17 17:27:21.861339 kubelet[2940]: I0317 17:27:21.861290 2940 kubelet_node_status.go:72] "Attempting to register node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:21.861714 kubelet[2940]: E0317 17:27:21.861687 2940 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.36:6443/api/v1/nodes\": dial tcp 10.200.20.36:6443: connect: connection refused" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:22.027296 containerd[1745]: time="2025-03-17T17:27:22.027188081Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:27:22.028206 containerd[1745]: time="2025-03-17T17:27:22.027754683Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo 
digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 6.124982079s" Mar 17 17:27:22.075953 containerd[1745]: time="2025-03-17T17:27:22.075845633Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:27:22.122909 containerd[1745]: time="2025-03-17T17:27:22.122866461Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 6.208354229s" Mar 17 17:27:22.325317 containerd[1745]: time="2025-03-17T17:27:22.325160342Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 6.407068863s" Mar 17 17:27:22.635573 kubelet[2940]: E0317 17:27:22.635466 2940 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.36:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.36:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4152.2.2-a-f9f073f8c6.182da72b56cc49a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4152.2.2-a-f9f073f8c6,UID:ci-4152.2.2-a-f9f073f8c6,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4152.2.2-a-f9f073f8c6,},FirstTimestamp:2025-03-17 17:27:15.443911072 +0000 UTC m=+0.876647530,LastTimestamp:2025-03-17 17:27:15.443911072 +0000 UTC m=+0.876647530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4152.2.2-a-f9f073f8c6,}" Mar 17 17:27:22.903308 kubelet[2940]: W0317 17:27:22.903156 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:22.903308 kubelet[2940]: E0317 17:27:22.903205 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:22.913829 kubelet[2940]: W0317 17:27:22.913780 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-f9f073f8c6&limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:22.913919 kubelet[2940]: E0317 17:27:22.913845 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-f9f073f8c6&limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:23.493562 containerd[1745]: time="2025-03-17T17:27:23.493327663Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:27:23.493562 containerd[1745]: time="2025-03-17T17:27:23.493409863Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:27:23.493562 containerd[1745]: time="2025-03-17T17:27:23.493426263Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:23.493562 containerd[1745]: time="2025-03-17T17:27:23.493514863Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:23.495563 containerd[1745]: time="2025-03-17T17:27:23.495282266Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:27:23.495563 containerd[1745]: time="2025-03-17T17:27:23.495327466Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:27:23.495563 containerd[1745]: time="2025-03-17T17:27:23.495337946Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:23.495563 containerd[1745]: time="2025-03-17T17:27:23.495425067Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:23.495563 containerd[1745]: time="2025-03-17T17:27:23.495072226Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:27:23.502255 containerd[1745]: time="2025-03-17T17:27:23.497711551Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:27:23.502255 containerd[1745]: time="2025-03-17T17:27:23.497758591Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:23.502255 containerd[1745]: time="2025-03-17T17:27:23.497880591Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:23.546020 systemd[1]: Started cri-containerd-8605d37ca69991c13994138a78d553c8a9a329d181aba461ea32325d7433f4b7.scope - libcontainer container 8605d37ca69991c13994138a78d553c8a9a329d181aba461ea32325d7433f4b7. Mar 17 17:27:23.551532 systemd[1]: Started cri-containerd-0f3efe1032163a2de2ab02cb31d18b01a0f2fe133c000d0d26bad1677238ed30.scope - libcontainer container 0f3efe1032163a2de2ab02cb31d18b01a0f2fe133c000d0d26bad1677238ed30. Mar 17 17:27:23.553659 systemd[1]: Started cri-containerd-9106146997412f7ba643afa5313b16bee0b17455c73bc9434f66717da109e21b.scope - libcontainer container 9106146997412f7ba643afa5313b16bee0b17455c73bc9434f66717da109e21b. Mar 17 17:27:23.589164 containerd[1745]: time="2025-03-17T17:27:23.588953043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4152.2.2-a-f9f073f8c6,Uid:e16d9001deeba9da6265de6431bbca74,Namespace:kube-system,Attempt:0,} returns sandbox id \"8605d37ca69991c13994138a78d553c8a9a329d181aba461ea32325d7433f4b7\"" Mar 17 17:27:23.597630 containerd[1745]: time="2025-03-17T17:27:23.597392459Z" level=info msg="CreateContainer within sandbox \"8605d37ca69991c13994138a78d553c8a9a329d181aba461ea32325d7433f4b7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 17:27:23.611901 containerd[1745]: time="2025-03-17T17:27:23.611847886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4152.2.2-a-f9f073f8c6,Uid:aa7211033565ff63381058a6f57992d1,Namespace:kube-system,Attempt:0,} returns sandbox id \"0f3efe1032163a2de2ab02cb31d18b01a0f2fe133c000d0d26bad1677238ed30\"" Mar 17 17:27:23.612599 containerd[1745]: time="2025-03-17T17:27:23.612212207Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4152.2.2-a-f9f073f8c6,Uid:b966c0f571028ce266df8ec03bac8517,Namespace:kube-system,Attempt:0,} returns sandbox id \"9106146997412f7ba643afa5313b16bee0b17455c73bc9434f66717da109e21b\"" Mar 17 17:27:23.615343 containerd[1745]: time="2025-03-17T17:27:23.615288012Z" level=info msg="CreateContainer within sandbox \"9106146997412f7ba643afa5313b16bee0b17455c73bc9434f66717da109e21b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 17:27:23.617138 containerd[1745]: time="2025-03-17T17:27:23.617047016Z" level=info msg="CreateContainer within sandbox \"0f3efe1032163a2de2ab02cb31d18b01a0f2fe133c000d0d26bad1677238ed30\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 17:27:23.826026 kubelet[2940]: W0317 17:27:23.825429 2940 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.36:6443: connect: connection refused Mar 17 17:27:23.826026 kubelet[2940]: E0317 17:27:23.825479 2940 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.36:6443: connect: connection refused" logger="UnhandledError" Mar 17 17:27:24.076523 containerd[1745]: time="2025-03-17T17:27:24.076307441Z" level=info msg="CreateContainer within sandbox \"8605d37ca69991c13994138a78d553c8a9a329d181aba461ea32325d7433f4b7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b47aaba43213fdf346075caf4b740227a2c4a6cab87c5a8e1e3ff7f659e4a97c\"" Mar 17 17:27:24.077300 containerd[1745]: time="2025-03-17T17:27:24.077266963Z" level=info msg="StartContainer for 
\"b47aaba43213fdf346075caf4b740227a2c4a6cab87c5a8e1e3ff7f659e4a97c\"" Mar 17 17:27:24.102029 systemd[1]: Started cri-containerd-b47aaba43213fdf346075caf4b740227a2c4a6cab87c5a8e1e3ff7f659e4a97c.scope - libcontainer container b47aaba43213fdf346075caf4b740227a2c4a6cab87c5a8e1e3ff7f659e4a97c. Mar 17 17:27:24.174472 containerd[1745]: time="2025-03-17T17:27:24.174425866Z" level=info msg="StartContainer for \"b47aaba43213fdf346075caf4b740227a2c4a6cab87c5a8e1e3ff7f659e4a97c\" returns successfully" Mar 17 17:27:24.424672 containerd[1745]: time="2025-03-17T17:27:24.424420577Z" level=info msg="CreateContainer within sandbox \"9106146997412f7ba643afa5313b16bee0b17455c73bc9434f66717da109e21b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"bfeea4a77b234a695b63da62cc1041edfc2fc368205b810bbce89a0877d4a3a7\"" Mar 17 17:27:24.425528 containerd[1745]: time="2025-03-17T17:27:24.425349379Z" level=info msg="StartContainer for \"bfeea4a77b234a695b63da62cc1041edfc2fc368205b810bbce89a0877d4a3a7\"" Mar 17 17:27:24.452025 systemd[1]: Started cri-containerd-bfeea4a77b234a695b63da62cc1041edfc2fc368205b810bbce89a0877d4a3a7.scope - libcontainer container bfeea4a77b234a695b63da62cc1041edfc2fc368205b810bbce89a0877d4a3a7. 
Mar 17 17:27:24.520227 containerd[1745]: time="2025-03-17T17:27:24.520133517Z" level=info msg="StartContainer for \"bfeea4a77b234a695b63da62cc1041edfc2fc368205b810bbce89a0877d4a3a7\" returns successfully" Mar 17 17:27:24.576963 containerd[1745]: time="2025-03-17T17:27:24.576587104Z" level=info msg="CreateContainer within sandbox \"0f3efe1032163a2de2ab02cb31d18b01a0f2fe133c000d0d26bad1677238ed30\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"bcca009413db1abd87edae6a8fb1aa241e20edb113e445d61bca374c8c8235d0\"" Mar 17 17:27:24.579877 containerd[1745]: time="2025-03-17T17:27:24.577366465Z" level=info msg="StartContainer for \"bcca009413db1abd87edae6a8fb1aa241e20edb113e445d61bca374c8c8235d0\"" Mar 17 17:27:24.613390 systemd[1]: Started cri-containerd-bcca009413db1abd87edae6a8fb1aa241e20edb113e445d61bca374c8c8235d0.scope - libcontainer container bcca009413db1abd87edae6a8fb1aa241e20edb113e445d61bca374c8c8235d0. Mar 17 17:27:24.667302 containerd[1745]: time="2025-03-17T17:27:24.667242035Z" level=info msg="StartContainer for \"bcca009413db1abd87edae6a8fb1aa241e20edb113e445d61bca374c8c8235d0\" returns successfully" Mar 17 17:27:25.550812 kubelet[2940]: E0317 17:27:25.550717 2940 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:27.504534 kubelet[2940]: E0317 17:27:27.504485 2940 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4152.2.2-a-f9f073f8c6" not found Mar 17 17:27:27.913666 kubelet[2940]: E0317 17:27:27.913592 2940 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4152.2.2-a-f9f073f8c6" not found Mar 17 17:27:28.062796 kubelet[2940]: E0317 17:27:28.062726 2940 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" 
err="nodes \"ci-4152.2.2-a-f9f073f8c6\" not found" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:28.264656 kubelet[2940]: I0317 17:27:28.264430 2940 kubelet_node_status.go:72] "Attempting to register node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:28.274479 kubelet[2940]: I0317 17:27:28.274395 2940 kubelet_node_status.go:75] "Successfully registered node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:28.274479 kubelet[2940]: E0317 17:27:28.274434 2940 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4152.2.2-a-f9f073f8c6\": node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:28.290262 kubelet[2940]: E0317 17:27:28.290224 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:28.391825 kubelet[2940]: E0317 17:27:28.390914 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:28.491293 kubelet[2940]: E0317 17:27:28.491242 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:28.592400 kubelet[2940]: E0317 17:27:28.592272 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:28.693108 kubelet[2940]: E0317 17:27:28.693064 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:28.793680 kubelet[2940]: E0317 17:27:28.793636 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:28.894329 kubelet[2940]: E0317 17:27:28.894288 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:28.995344 kubelet[2940]: E0317 
17:27:28.995300 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.096105 kubelet[2940]: E0317 17:27:29.095971 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.197041 kubelet[2940]: E0317 17:27:29.196926 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.297390 kubelet[2940]: E0317 17:27:29.297348 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.397835 kubelet[2940]: E0317 17:27:29.397788 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.491970 systemd[1]: Reloading requested from client PID 3214 ('systemctl') (unit session-9.scope)... Mar 17 17:27:29.492288 systemd[1]: Reloading... Mar 17 17:27:29.498154 kubelet[2940]: E0317 17:27:29.497928 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.580912 zram_generator::config[3254]: No configuration found. Mar 17 17:27:29.598999 kubelet[2940]: E0317 17:27:29.598947 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.681663 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:27:29.699759 kubelet[2940]: E0317 17:27:29.699704 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.772040 systemd[1]: Reloading finished in 279 ms. 
Mar 17 17:27:29.800206 kubelet[2940]: E0317 17:27:29.800008 2940 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:29.807260 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:27:29.823354 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 17:27:29.823555 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:27:29.823604 systemd[1]: kubelet.service: Consumed 1.184s CPU time, 114.7M memory peak, 0B memory swap peak. Mar 17 17:27:29.831093 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:27:29.918368 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:27:29.923306 (kubelet)[3318]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:27:29.967768 kubelet[3318]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:27:29.967768 kubelet[3318]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 17:27:29.967768 kubelet[3318]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 17 17:27:29.968116 kubelet[3318]: I0317 17:27:29.967828 3318 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 17:27:29.977856 kubelet[3318]: I0317 17:27:29.977641 3318 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Mar 17 17:27:29.977856 kubelet[3318]: I0317 17:27:29.977676 3318 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 17:27:29.978222 kubelet[3318]: I0317 17:27:29.978161 3318 server.go:929] "Client rotation is on, will bootstrap in background" Mar 17 17:27:29.979545 kubelet[3318]: I0317 17:27:29.979521 3318 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 17:27:29.982049 kubelet[3318]: I0317 17:27:29.981888 3318 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:27:29.986824 kubelet[3318]: E0317 17:27:29.985953 3318 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 17 17:27:29.986824 kubelet[3318]: I0317 17:27:29.985995 3318 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Mar 17 17:27:29.990383 kubelet[3318]: I0317 17:27:29.990351 3318 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 17:27:29.990486 kubelet[3318]: I0317 17:27:29.990472 3318 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 17 17:27:29.990599 kubelet[3318]: I0317 17:27:29.990572 3318 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 17:27:29.990757 kubelet[3318]: I0317 17:27:29.990599 3318 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4152.2.2-a-f9f073f8c6","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Mar 17 17:27:29.990902 kubelet[3318]: I0317 17:27:29.990764 3318 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 17:27:29.990902 kubelet[3318]: I0317 17:27:29.990772 3318 container_manager_linux.go:300] "Creating device plugin manager" Mar 17 17:27:29.990902 kubelet[3318]: I0317 17:27:29.990811 3318 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:27:29.990969 kubelet[3318]: I0317 17:27:29.990934 3318 kubelet.go:408] "Attempting to sync node with API server" Mar 17 17:27:29.990969 kubelet[3318]: I0317 17:27:29.990945 3318 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:27:29.990969 kubelet[3318]: I0317 17:27:29.990964 3318 kubelet.go:314] "Adding apiserver pod source" Mar 17 17:27:29.991423 kubelet[3318]: I0317 17:27:29.991397 3318 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:27:29.993196 kubelet[3318]: I0317 17:27:29.993158 3318 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:27:29.993622 kubelet[3318]: I0317 17:27:29.993599 3318 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 17:27:29.994044 kubelet[3318]: I0317 17:27:29.994021 3318 server.go:1269] "Started kubelet" Mar 17 17:27:30.002569 kubelet[3318]: I0317 17:27:29.999574 3318 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:27:30.004939 kubelet[3318]: I0317 17:27:30.004866 3318 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:27:30.006160 kubelet[3318]: I0317 17:27:30.006105 3318 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:27:30.006772 kubelet[3318]: I0317 17:27:30.006362 3318 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 
17 17:27:30.010153 kubelet[3318]: I0317 17:27:30.010117 3318 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 17 17:27:30.011893 kubelet[3318]: I0317 17:27:30.011870 3318 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 17 17:27:30.012113 kubelet[3318]: E0317 17:27:30.012092 3318 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-f9f073f8c6\" not found" Mar 17 17:27:30.014469 kubelet[3318]: I0317 17:27:30.014453 3318 server.go:460] "Adding debug handlers to kubelet server" Mar 17 17:27:30.017486 kubelet[3318]: I0317 17:27:30.017455 3318 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 17 17:27:30.017628 kubelet[3318]: I0317 17:27:30.017610 3318 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:27:30.032473 kubelet[3318]: I0317 17:27:30.032375 3318 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:27:30.039420 kubelet[3318]: I0317 17:27:30.039391 3318 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 17:27:30.039556 kubelet[3318]: I0317 17:27:30.039546 3318 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:27:30.039613 kubelet[3318]: I0317 17:27:30.039605 3318 kubelet.go:2321] "Starting kubelet main sync loop" Mar 17 17:27:30.039704 kubelet[3318]: E0317 17:27:30.039687 3318 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:27:30.040155 kubelet[3318]: I0317 17:27:30.033745 3318 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:27:30.040247 kubelet[3318]: I0317 17:27:30.040236 3318 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:27:30.040407 kubelet[3318]: I0317 17:27:30.040388 3318 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:27:30.048907 kubelet[3318]: E0317 17:27:30.048871 3318 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:27:30.088672 kubelet[3318]: I0317 17:27:30.088644 3318 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 17:27:30.088672 kubelet[3318]: I0317 17:27:30.088663 3318 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:27:30.088878 kubelet[3318]: I0317 17:27:30.088685 3318 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:27:30.088878 kubelet[3318]: I0317 17:27:30.088869 3318 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 17:27:30.088919 kubelet[3318]: I0317 17:27:30.088882 3318 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 17:27:30.088919 kubelet[3318]: I0317 17:27:30.088899 3318 policy_none.go:49] "None policy: Start" Mar 17 17:27:30.089646 kubelet[3318]: I0317 17:27:30.089624 3318 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:27:30.089724 kubelet[3318]: I0317 17:27:30.089655 3318 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:27:30.089963 kubelet[3318]: I0317 17:27:30.089945 3318 state_mem.go:75] "Updated machine memory state" Mar 17 17:27:30.095450 kubelet[3318]: I0317 17:27:30.094840 3318 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:27:30.095450 kubelet[3318]: I0317 17:27:30.094998 3318 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 17 17:27:30.095450 kubelet[3318]: I0317 17:27:30.095008 3318 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:27:30.095450 kubelet[3318]: I0317 17:27:30.095284 3318 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:27:30.154169 kubelet[3318]: W0317 17:27:30.154137 3318 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not 
contain dots] Mar 17 17:27:30.158989 kubelet[3318]: W0317 17:27:30.158961 3318 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 17:27:30.159491 kubelet[3318]: W0317 17:27:30.159390 3318 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 17:27:30.198127 kubelet[3318]: I0317 17:27:30.198101 3318 kubelet_node_status.go:72] "Attempting to register node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.212796 kubelet[3318]: I0317 17:27:30.212755 3318 kubelet_node_status.go:111] "Node was previously registered" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.212939 kubelet[3318]: I0317 17:27:30.212864 3318 kubelet_node_status.go:75] "Successfully registered node" node="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219660 kubelet[3318]: I0317 17:27:30.219436 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aa7211033565ff63381058a6f57992d1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4152.2.2-a-f9f073f8c6\" (UID: \"aa7211033565ff63381058a6f57992d1\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219660 kubelet[3318]: I0317 17:27:30.219474 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-kubeconfig\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219660 kubelet[3318]: I0317 17:27:30.219497 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" 
(UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219660 kubelet[3318]: I0317 17:27:30.219515 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b966c0f571028ce266df8ec03bac8517-kubeconfig\") pod \"kube-scheduler-ci-4152.2.2-a-f9f073f8c6\" (UID: \"b966c0f571028ce266df8ec03bac8517\") " pod="kube-system/kube-scheduler-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219660 kubelet[3318]: I0317 17:27:30.219531 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aa7211033565ff63381058a6f57992d1-ca-certs\") pod \"kube-apiserver-ci-4152.2.2-a-f9f073f8c6\" (UID: \"aa7211033565ff63381058a6f57992d1\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219905 kubelet[3318]: I0317 17:27:30.219544 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aa7211033565ff63381058a6f57992d1-k8s-certs\") pod \"kube-apiserver-ci-4152.2.2-a-f9f073f8c6\" (UID: \"aa7211033565ff63381058a6f57992d1\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219905 kubelet[3318]: I0317 17:27:30.219560 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-ca-certs\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219905 kubelet[3318]: I0317 17:27:30.219577 3318 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-flexvolume-dir\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.219905 kubelet[3318]: I0317 17:27:30.219592 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e16d9001deeba9da6265de6431bbca74-k8s-certs\") pod \"kube-controller-manager-ci-4152.2.2-a-f9f073f8c6\" (UID: \"e16d9001deeba9da6265de6431bbca74\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" Mar 17 17:27:30.992374 kubelet[3318]: I0317 17:27:30.992272 3318 apiserver.go:52] "Watching apiserver" Mar 17 17:27:31.018666 kubelet[3318]: I0317 17:27:31.018601 3318 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 17 17:27:31.110896 kubelet[3318]: I0317 17:27:31.110794 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4152.2.2-a-f9f073f8c6" podStartSLOduration=1.110776637 podStartE2EDuration="1.110776637s" podCreationTimestamp="2025-03-17 17:27:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:27:31.094516507 +0000 UTC m=+1.167895339" watchObservedRunningTime="2025-03-17 17:27:31.110776637 +0000 UTC m=+1.184155469" Mar 17 17:27:31.123763 kubelet[3318]: I0317 17:27:31.123694 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4152.2.2-a-f9f073f8c6" podStartSLOduration=1.123675005 podStartE2EDuration="1.123675005s" podCreationTimestamp="2025-03-17 17:27:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:27:31.110926757 +0000 UTC m=+1.184305549" watchObservedRunningTime="2025-03-17 17:27:31.123675005 +0000 UTC m=+1.197053837" Mar 17 17:27:31.137751 kubelet[3318]: I0317 17:27:31.137404 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4152.2.2-a-f9f073f8c6" podStartSLOduration=1.137387893 podStartE2EDuration="1.137387893s" podCreationTimestamp="2025-03-17 17:27:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:27:31.124858966 +0000 UTC m=+1.198237798" watchObservedRunningTime="2025-03-17 17:27:31.137387893 +0000 UTC m=+1.210766725" Mar 17 17:27:35.493312 kubelet[3318]: I0317 17:27:35.493246 3318 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 17:27:35.494164 kubelet[3318]: I0317 17:27:35.493947 3318 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 17:27:35.494207 containerd[1745]: time="2025-03-17T17:27:35.493617867Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 17:27:35.949545 sudo[2354]: pam_unix(sudo:session): session closed for user root Mar 17 17:27:36.018834 sshd[2353]: Connection closed by 10.200.16.10 port 43262 Mar 17 17:27:36.019346 sshd-session[2351]: pam_unix(sshd:session): session closed for user core Mar 17 17:27:36.022176 systemd[1]: sshd@6-10.200.20.36:22-10.200.16.10:43262.service: Deactivated successfully. Mar 17 17:27:36.023966 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 17:27:36.024242 systemd[1]: session-9.scope: Consumed 7.584s CPU time, 153.5M memory peak, 0B memory swap peak. Mar 17 17:27:36.025586 systemd-logind[1707]: Session 9 logged out. Waiting for processes to exit. 
Mar 17 17:27:36.027947 systemd-logind[1707]: Removed session 9. Mar 17 17:27:36.055160 systemd[1]: Created slice kubepods-besteffort-pod9f35c4ed_7939_4c27_8dea_4e5be9721398.slice - libcontainer container kubepods-besteffort-pod9f35c4ed_7939_4c27_8dea_4e5be9721398.slice. Mar 17 17:27:36.159016 kubelet[3318]: I0317 17:27:36.158965 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9f35c4ed-7939-4c27-8dea-4e5be9721398-xtables-lock\") pod \"kube-proxy-9w6fk\" (UID: \"9f35c4ed-7939-4c27-8dea-4e5be9721398\") " pod="kube-system/kube-proxy-9w6fk" Mar 17 17:27:36.159186 kubelet[3318]: I0317 17:27:36.159009 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f35c4ed-7939-4c27-8dea-4e5be9721398-lib-modules\") pod \"kube-proxy-9w6fk\" (UID: \"9f35c4ed-7939-4c27-8dea-4e5be9721398\") " pod="kube-system/kube-proxy-9w6fk" Mar 17 17:27:36.159186 kubelet[3318]: I0317 17:27:36.159060 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7r2x\" (UniqueName: \"kubernetes.io/projected/9f35c4ed-7939-4c27-8dea-4e5be9721398-kube-api-access-m7r2x\") pod \"kube-proxy-9w6fk\" (UID: \"9f35c4ed-7939-4c27-8dea-4e5be9721398\") " pod="kube-system/kube-proxy-9w6fk" Mar 17 17:27:36.159186 kubelet[3318]: I0317 17:27:36.159082 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/9f35c4ed-7939-4c27-8dea-4e5be9721398-kube-proxy\") pod \"kube-proxy-9w6fk\" (UID: \"9f35c4ed-7939-4c27-8dea-4e5be9721398\") " pod="kube-system/kube-proxy-9w6fk" Mar 17 17:27:36.365883 containerd[1745]: time="2025-03-17T17:27:36.365780957Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-proxy-9w6fk,Uid:9f35c4ed-7939-4c27-8dea-4e5be9721398,Namespace:kube-system,Attempt:0,}" Mar 17 17:27:36.436127 containerd[1745]: time="2025-03-17T17:27:36.435302831Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:27:36.436127 containerd[1745]: time="2025-03-17T17:27:36.435367711Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:27:36.436127 containerd[1745]: time="2025-03-17T17:27:36.435484992Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:36.436127 containerd[1745]: time="2025-03-17T17:27:36.435606472Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:36.456029 systemd[1]: Started cri-containerd-6674c955327a39b29001feb0ece28fcd454e528a2fc2725ce59e13762088bcbd.scope - libcontainer container 6674c955327a39b29001feb0ece28fcd454e528a2fc2725ce59e13762088bcbd. Mar 17 17:27:36.490489 containerd[1745]: time="2025-03-17T17:27:36.490083952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9w6fk,Uid:9f35c4ed-7939-4c27-8dea-4e5be9721398,Namespace:kube-system,Attempt:0,} returns sandbox id \"6674c955327a39b29001feb0ece28fcd454e528a2fc2725ce59e13762088bcbd\"" Mar 17 17:27:36.496091 containerd[1745]: time="2025-03-17T17:27:36.496045406Z" level=info msg="CreateContainer within sandbox \"6674c955327a39b29001feb0ece28fcd454e528a2fc2725ce59e13762088bcbd\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 17:27:36.547728 systemd[1]: Created slice kubepods-besteffort-pod8acce87a_b22a_49d9_a2b0_9c8e88a01699.slice - libcontainer container kubepods-besteffort-pod8acce87a_b22a_49d9_a2b0_9c8e88a01699.slice. 
Mar 17 17:27:36.555534 containerd[1745]: time="2025-03-17T17:27:36.555458577Z" level=info msg="CreateContainer within sandbox \"6674c955327a39b29001feb0ece28fcd454e528a2fc2725ce59e13762088bcbd\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7f4d31e729f920ecb99828c785eec307d65a5a36761453f95b7ff50fbdfcfac9\"" Mar 17 17:27:36.556589 containerd[1745]: time="2025-03-17T17:27:36.556264939Z" level=info msg="StartContainer for \"7f4d31e729f920ecb99828c785eec307d65a5a36761453f95b7ff50fbdfcfac9\"" Mar 17 17:27:36.561670 kubelet[3318]: I0317 17:27:36.561104 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8acce87a-b22a-49d9-a2b0-9c8e88a01699-var-lib-calico\") pod \"tigera-operator-64ff5465b7-jk46h\" (UID: \"8acce87a-b22a-49d9-a2b0-9c8e88a01699\") " pod="tigera-operator/tigera-operator-64ff5465b7-jk46h" Mar 17 17:27:36.561670 kubelet[3318]: I0317 17:27:36.561490 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x746x\" (UniqueName: \"kubernetes.io/projected/8acce87a-b22a-49d9-a2b0-9c8e88a01699-kube-api-access-x746x\") pod \"tigera-operator-64ff5465b7-jk46h\" (UID: \"8acce87a-b22a-49d9-a2b0-9c8e88a01699\") " pod="tigera-operator/tigera-operator-64ff5465b7-jk46h" Mar 17 17:27:36.580971 systemd[1]: Started cri-containerd-7f4d31e729f920ecb99828c785eec307d65a5a36761453f95b7ff50fbdfcfac9.scope - libcontainer container 7f4d31e729f920ecb99828c785eec307d65a5a36761453f95b7ff50fbdfcfac9. 
Mar 17 17:27:36.618290 containerd[1745]: time="2025-03-17T17:27:36.618180476Z" level=info msg="StartContainer for \"7f4d31e729f920ecb99828c785eec307d65a5a36761453f95b7ff50fbdfcfac9\" returns successfully" Mar 17 17:27:36.850830 containerd[1745]: time="2025-03-17T17:27:36.850737791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-jk46h,Uid:8acce87a-b22a-49d9-a2b0-9c8e88a01699,Namespace:tigera-operator,Attempt:0,}" Mar 17 17:27:36.923798 containerd[1745]: time="2025-03-17T17:27:36.923475112Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:27:36.923798 containerd[1745]: time="2025-03-17T17:27:36.923530672Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:27:36.923798 containerd[1745]: time="2025-03-17T17:27:36.923545632Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:36.924680 containerd[1745]: time="2025-03-17T17:27:36.924601314Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:36.939078 systemd[1]: Started cri-containerd-af9879052dc2bbd8fbd7b64d9e2cebe9802bc6f631b94adf4ab00c11af650a9d.scope - libcontainer container af9879052dc2bbd8fbd7b64d9e2cebe9802bc6f631b94adf4ab00c11af650a9d. 
Mar 17 17:27:36.971005 containerd[1745]: time="2025-03-17T17:27:36.970909017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-jk46h,Uid:8acce87a-b22a-49d9-a2b0-9c8e88a01699,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"af9879052dc2bbd8fbd7b64d9e2cebe9802bc6f631b94adf4ab00c11af650a9d\"" Mar 17 17:27:36.973902 containerd[1745]: time="2025-03-17T17:27:36.973666743Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 17 17:27:39.218840 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2943917457.mount: Deactivated successfully. Mar 17 17:27:39.697895 containerd[1745]: time="2025-03-17T17:27:39.697844801Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:39.701146 containerd[1745]: time="2025-03-17T17:27:39.700882766Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 17 17:27:39.708245 containerd[1745]: time="2025-03-17T17:27:39.708190300Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:39.714459 containerd[1745]: time="2025-03-17T17:27:39.714401672Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:39.715661 containerd[1745]: time="2025-03-17T17:27:39.715099353Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 2.74137685s" Mar 17 17:27:39.715661 
containerd[1745]: time="2025-03-17T17:27:39.715132193Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 17 17:27:39.717813 containerd[1745]: time="2025-03-17T17:27:39.717765198Z" level=info msg="CreateContainer within sandbox \"af9879052dc2bbd8fbd7b64d9e2cebe9802bc6f631b94adf4ab00c11af650a9d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 17 17:27:39.774507 containerd[1745]: time="2025-03-17T17:27:39.774437903Z" level=info msg="CreateContainer within sandbox \"af9879052dc2bbd8fbd7b64d9e2cebe9802bc6f631b94adf4ab00c11af650a9d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"884832efc9e0ef13ecf0736dfb5e84277b6366d0837a38b55e9a3850a90b37ff\"" Mar 17 17:27:39.775229 containerd[1745]: time="2025-03-17T17:27:39.775106224Z" level=info msg="StartContainer for \"884832efc9e0ef13ecf0736dfb5e84277b6366d0837a38b55e9a3850a90b37ff\"" Mar 17 17:27:39.803002 systemd[1]: Started cri-containerd-884832efc9e0ef13ecf0736dfb5e84277b6366d0837a38b55e9a3850a90b37ff.scope - libcontainer container 884832efc9e0ef13ecf0736dfb5e84277b6366d0837a38b55e9a3850a90b37ff. 
Mar 17 17:27:39.833925 containerd[1745]: time="2025-03-17T17:27:39.833877854Z" level=info msg="StartContainer for \"884832efc9e0ef13ecf0736dfb5e84277b6366d0837a38b55e9a3850a90b37ff\" returns successfully" Mar 17 17:27:39.882279 kubelet[3318]: I0317 17:27:39.882212 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-9w6fk" podStartSLOduration=3.882192624 podStartE2EDuration="3.882192624s" podCreationTimestamp="2025-03-17 17:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:27:37.101214305 +0000 UTC m=+7.174593137" watchObservedRunningTime="2025-03-17 17:27:39.882192624 +0000 UTC m=+9.955571456" Mar 17 17:27:40.137529 kubelet[3318]: I0317 17:27:40.137432 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-64ff5465b7-jk46h" podStartSLOduration=1.393935124 podStartE2EDuration="4.137405538s" podCreationTimestamp="2025-03-17 17:27:36 +0000 UTC" firstStartedPulling="2025-03-17 17:27:36.9723871 +0000 UTC m=+7.045765932" lastFinishedPulling="2025-03-17 17:27:39.715857514 +0000 UTC m=+9.789236346" observedRunningTime="2025-03-17 17:27:40.114944457 +0000 UTC m=+10.188323329" watchObservedRunningTime="2025-03-17 17:27:40.137405538 +0000 UTC m=+10.210784370" Mar 17 17:27:44.608656 systemd[1]: Created slice kubepods-besteffort-pod4a224bd1_62a5_4206_a3bd_268c000e2b35.slice - libcontainer container kubepods-besteffort-pod4a224bd1_62a5_4206_a3bd_268c000e2b35.slice. 
Mar 17 17:27:44.611033 kubelet[3318]: I0317 17:27:44.611003 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hbrx\" (UniqueName: \"kubernetes.io/projected/4a224bd1-62a5-4206-a3bd-268c000e2b35-kube-api-access-7hbrx\") pod \"calico-typha-6fb66c74c9-kr8vp\" (UID: \"4a224bd1-62a5-4206-a3bd-268c000e2b35\") " pod="calico-system/calico-typha-6fb66c74c9-kr8vp" Mar 17 17:27:44.611419 kubelet[3318]: I0317 17:27:44.611043 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a224bd1-62a5-4206-a3bd-268c000e2b35-tigera-ca-bundle\") pod \"calico-typha-6fb66c74c9-kr8vp\" (UID: \"4a224bd1-62a5-4206-a3bd-268c000e2b35\") " pod="calico-system/calico-typha-6fb66c74c9-kr8vp" Mar 17 17:27:44.611419 kubelet[3318]: I0317 17:27:44.611061 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4a224bd1-62a5-4206-a3bd-268c000e2b35-typha-certs\") pod \"calico-typha-6fb66c74c9-kr8vp\" (UID: \"4a224bd1-62a5-4206-a3bd-268c000e2b35\") " pod="calico-system/calico-typha-6fb66c74c9-kr8vp" Mar 17 17:27:44.917220 containerd[1745]: time="2025-03-17T17:27:44.914878060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6fb66c74c9-kr8vp,Uid:4a224bd1-62a5-4206-a3bd-268c000e2b35,Namespace:calico-system,Attempt:0,}" Mar 17 17:27:45.010097 systemd[1]: Created slice kubepods-besteffort-pod031e49b1_a116_452c_85f0_1d6e8a4d91c8.slice - libcontainer container kubepods-besteffort-pod031e49b1_a116_452c_85f0_1d6e8a4d91c8.slice. 
Mar 17 17:27:45.013595 kubelet[3318]: I0317 17:27:45.013568 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-flexvol-driver-host\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.013750 kubelet[3318]: I0317 17:27:45.013737 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-policysync\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.013971 kubelet[3318]: I0317 17:27:45.013920 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-var-lib-calico\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.014121 kubelet[3318]: I0317 17:27:45.014090 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-cni-bin-dir\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.014231 kubelet[3318]: I0317 17:27:45.014219 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-cni-log-dir\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.014370 kubelet[3318]: I0317 17:27:45.014327 3318 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/031e49b1-a116-452c-85f0-1d6e8a4d91c8-tigera-ca-bundle\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.014476 kubelet[3318]: I0317 17:27:45.014464 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-lib-modules\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.014614 kubelet[3318]: I0317 17:27:45.014576 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-xtables-lock\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.014767 kubelet[3318]: I0317 17:27:45.014708 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-cni-net-dir\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.014767 kubelet[3318]: I0317 17:27:45.014735 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwrl6\" (UniqueName: \"kubernetes.io/projected/031e49b1-a116-452c-85f0-1d6e8a4d91c8-kube-api-access-zwrl6\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.015005 kubelet[3318]: I0317 17:27:45.014925 3318 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/031e49b1-a116-452c-85f0-1d6e8a4d91c8-node-certs\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.015133 kubelet[3318]: I0317 17:27:45.015086 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/031e49b1-a116-452c-85f0-1d6e8a4d91c8-var-run-calico\") pod \"calico-node-x9tvk\" (UID: \"031e49b1-a116-452c-85f0-1d6e8a4d91c8\") " pod="calico-system/calico-node-x9tvk" Mar 17 17:27:45.125477 kubelet[3318]: E0317 17:27:45.125345 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.125477 kubelet[3318]: W0317 17:27:45.125371 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.125477 kubelet[3318]: E0317 17:27:45.125399 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.125793 kubelet[3318]: E0317 17:27:45.125779 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.125978 kubelet[3318]: W0317 17:27:45.125835 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.125978 kubelet[3318]: E0317 17:27:45.125851 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.126329 kubelet[3318]: E0317 17:27:45.126245 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.126329 kubelet[3318]: W0317 17:27:45.126257 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.126459 kubelet[3318]: E0317 17:27:45.126430 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.126599 kubelet[3318]: E0317 17:27:45.126588 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.126671 kubelet[3318]: W0317 17:27:45.126661 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.126826 kubelet[3318]: E0317 17:27:45.126789 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.127069 kubelet[3318]: E0317 17:27:45.126977 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.127069 kubelet[3318]: W0317 17:27:45.126988 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.127209 kubelet[3318]: E0317 17:27:45.127163 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.127472 kubelet[3318]: E0317 17:27:45.127364 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.127472 kubelet[3318]: W0317 17:27:45.127379 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.127557 kubelet[3318]: E0317 17:27:45.127510 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.127725 kubelet[3318]: E0317 17:27:45.127710 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.127886 kubelet[3318]: W0317 17:27:45.127863 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.128155 kubelet[3318]: E0317 17:27:45.128026 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.128430 kubelet[3318]: E0317 17:27:45.128413 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.128595 kubelet[3318]: W0317 17:27:45.128494 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.128643 kubelet[3318]: E0317 17:27:45.128630 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.128919 kubelet[3318]: E0317 17:27:45.128855 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.128919 kubelet[3318]: W0317 17:27:45.128869 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.129282 kubelet[3318]: E0317 17:27:45.129036 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.129428 kubelet[3318]: E0317 17:27:45.129411 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.129498 kubelet[3318]: W0317 17:27:45.129485 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.129877 kubelet[3318]: E0317 17:27:45.129846 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.130841 kubelet[3318]: E0317 17:27:45.130105 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.130841 kubelet[3318]: W0317 17:27:45.130119 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.130841 kubelet[3318]: E0317 17:27:45.130151 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.131111 kubelet[3318]: E0317 17:27:45.131020 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.131111 kubelet[3318]: W0317 17:27:45.131038 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.131111 kubelet[3318]: E0317 17:27:45.131090 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.131550 kubelet[3318]: E0317 17:27:45.131420 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.131550 kubelet[3318]: W0317 17:27:45.131437 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.131550 kubelet[3318]: E0317 17:27:45.131530 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.131990 kubelet[3318]: E0317 17:27:45.131876 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.131990 kubelet[3318]: W0317 17:27:45.131889 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.131990 kubelet[3318]: E0317 17:27:45.131971 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.133053 kubelet[3318]: E0317 17:27:45.132902 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.133053 kubelet[3318]: W0317 17:27:45.132915 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.133053 kubelet[3318]: E0317 17:27:45.132999 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.133232 kubelet[3318]: E0317 17:27:45.133163 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.133232 kubelet[3318]: W0317 17:27:45.133172 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.133415 kubelet[3318]: E0317 17:27:45.133356 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.133528 kubelet[3318]: E0317 17:27:45.133518 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.133618 kubelet[3318]: W0317 17:27:45.133570 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.133705 kubelet[3318]: E0317 17:27:45.133658 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.134020 kubelet[3318]: E0317 17:27:45.133905 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.134020 kubelet[3318]: W0317 17:27:45.133918 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.134020 kubelet[3318]: E0317 17:27:45.133994 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.134351 kubelet[3318]: E0317 17:27:45.134274 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.134351 kubelet[3318]: W0317 17:27:45.134285 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.134548 kubelet[3318]: E0317 17:27:45.134430 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.134723 kubelet[3318]: E0317 17:27:45.134630 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.134723 kubelet[3318]: W0317 17:27:45.134641 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.134930 kubelet[3318]: E0317 17:27:45.134917 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.135195 containerd[1745]: time="2025-03-17T17:27:45.134979091Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:27:45.135249 kubelet[3318]: E0317 17:27:45.135095 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.135249 kubelet[3318]: W0317 17:27:45.135102 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.135502 kubelet[3318]: E0317 17:27:45.135312 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.135502 kubelet[3318]: E0317 17:27:45.135409 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.135502 kubelet[3318]: W0317 17:27:45.135417 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.135600 containerd[1745]: time="2025-03-17T17:27:45.135093572Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:27:45.135600 containerd[1745]: time="2025-03-17T17:27:45.135112332Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:45.135677 kubelet[3318]: E0317 17:27:45.135663 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.135957 kubelet[3318]: E0317 17:27:45.135879 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.135957 kubelet[3318]: W0317 17:27:45.135890 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.136324 containerd[1745]: time="2025-03-17T17:27:45.135716253Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:45.136386 kubelet[3318]: E0317 17:27:45.136086 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.136580 kubelet[3318]: E0317 17:27:45.136566 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.136650 kubelet[3318]: W0317 17:27:45.136639 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.137237 kubelet[3318]: E0317 17:27:45.136931 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.137237 kubelet[3318]: W0317 17:27:45.136944 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.137237 kubelet[3318]: E0317 17:27:45.136955 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, 
skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.137237 kubelet[3318]: E0317 17:27:45.136973 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.139713 kubelet[3318]: E0317 17:27:45.139698 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.141381 kubelet[3318]: W0317 17:27:45.141320 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.141381 kubelet[3318]: E0317 17:27:45.141350 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.164058 systemd[1]: Started cri-containerd-5a10922a4a4baf8cb6bb63fb09b1f34fd4b04271aa5ae8112ab881065a2295ab.scope - libcontainer container 5a10922a4a4baf8cb6bb63fb09b1f34fd4b04271aa5ae8112ab881065a2295ab. Mar 17 17:27:45.173979 kubelet[3318]: E0317 17:27:45.173887 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.174774 kubelet[3318]: W0317 17:27:45.174734 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.174903 kubelet[3318]: E0317 17:27:45.174776 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.221228 containerd[1745]: time="2025-03-17T17:27:45.221193876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6fb66c74c9-kr8vp,Uid:4a224bd1-62a5-4206-a3bd-268c000e2b35,Namespace:calico-system,Attempt:0,} returns sandbox id \"5a10922a4a4baf8cb6bb63fb09b1f34fd4b04271aa5ae8112ab881065a2295ab\"" Mar 17 17:27:45.223152 containerd[1745]: time="2025-03-17T17:27:45.222951520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 17 17:27:45.314585 containerd[1745]: time="2025-03-17T17:27:45.314246356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x9tvk,Uid:031e49b1-a116-452c-85f0-1d6e8a4d91c8,Namespace:calico-system,Attempt:0,}" Mar 17 17:27:45.345125 kubelet[3318]: E0317 17:27:45.344751 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:27:45.406577 kubelet[3318]: E0317 17:27:45.406549 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.406735 kubelet[3318]: W0317 17:27:45.406718 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.406909 kubelet[3318]: E0317 17:27:45.406791 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.407077 kubelet[3318]: E0317 17:27:45.407065 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.407229 kubelet[3318]: W0317 17:27:45.407141 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.407229 kubelet[3318]: E0317 17:27:45.407157 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.407360 kubelet[3318]: E0317 17:27:45.407339 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.407502 kubelet[3318]: W0317 17:27:45.407412 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.407502 kubelet[3318]: E0317 17:27:45.407427 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.407621 kubelet[3318]: E0317 17:27:45.407610 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.407675 kubelet[3318]: W0317 17:27:45.407665 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.407774 kubelet[3318]: E0317 17:27:45.407763 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.408080 kubelet[3318]: E0317 17:27:45.407995 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.408080 kubelet[3318]: W0317 17:27:45.408006 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.408080 kubelet[3318]: E0317 17:27:45.408016 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.408237 kubelet[3318]: E0317 17:27:45.408226 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.408375 kubelet[3318]: W0317 17:27:45.408287 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.408375 kubelet[3318]: E0317 17:27:45.408302 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.408505 kubelet[3318]: E0317 17:27:45.408493 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.408566 kubelet[3318]: W0317 17:27:45.408555 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.408621 kubelet[3318]: E0317 17:27:45.408611 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.408911 kubelet[3318]: E0317 17:27:45.408818 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.408911 kubelet[3318]: W0317 17:27:45.408829 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.408911 kubelet[3318]: E0317 17:27:45.408838 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.409069 kubelet[3318]: E0317 17:27:45.409057 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.409124 kubelet[3318]: W0317 17:27:45.409114 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.409233 kubelet[3318]: E0317 17:27:45.409162 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.409327 kubelet[3318]: E0317 17:27:45.409318 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.409386 kubelet[3318]: W0317 17:27:45.409374 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.409447 kubelet[3318]: E0317 17:27:45.409436 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.409708 kubelet[3318]: E0317 17:27:45.409623 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.409708 kubelet[3318]: W0317 17:27:45.409635 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.409708 kubelet[3318]: E0317 17:27:45.409644 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.409888 kubelet[3318]: E0317 17:27:45.409876 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.409938 kubelet[3318]: W0317 17:27:45.409928 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.410077 kubelet[3318]: E0317 17:27:45.409991 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.410176 kubelet[3318]: E0317 17:27:45.410165 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.410231 kubelet[3318]: W0317 17:27:45.410221 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.410289 kubelet[3318]: E0317 17:27:45.410278 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.410599 kubelet[3318]: E0317 17:27:45.410586 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.410749 kubelet[3318]: W0317 17:27:45.410655 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.410749 kubelet[3318]: E0317 17:27:45.410671 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.410933 kubelet[3318]: E0317 17:27:45.410920 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.410985 kubelet[3318]: W0317 17:27:45.410975 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.411046 kubelet[3318]: E0317 17:27:45.411035 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.411358 kubelet[3318]: E0317 17:27:45.411231 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.411358 kubelet[3318]: W0317 17:27:45.411243 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.411358 kubelet[3318]: E0317 17:27:45.411251 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.411572 kubelet[3318]: E0317 17:27:45.411559 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.411733 kubelet[3318]: W0317 17:27:45.411626 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.411733 kubelet[3318]: E0317 17:27:45.411640 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.412061 kubelet[3318]: E0317 17:27:45.411978 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.412061 kubelet[3318]: W0317 17:27:45.411991 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.412061 kubelet[3318]: E0317 17:27:45.412001 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.412216 kubelet[3318]: E0317 17:27:45.412204 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.412350 kubelet[3318]: W0317 17:27:45.412261 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.412350 kubelet[3318]: E0317 17:27:45.412277 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.412481 kubelet[3318]: E0317 17:27:45.412469 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.412539 kubelet[3318]: W0317 17:27:45.412528 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.412589 kubelet[3318]: E0317 17:27:45.412579 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.426338 kubelet[3318]: E0317 17:27:45.426168 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.426338 kubelet[3318]: W0317 17:27:45.426193 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.426338 kubelet[3318]: E0317 17:27:45.426209 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.426338 kubelet[3318]: I0317 17:27:45.426236 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6c1365d-cb18-415b-8a89-1e9f3710a559-registration-dir\") pod \"csi-node-driver-zckwm\" (UID: \"f6c1365d-cb18-415b-8a89-1e9f3710a559\") " pod="calico-system/csi-node-driver-zckwm" Mar 17 17:27:45.427157 kubelet[3318]: E0317 17:27:45.426404 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.427157 kubelet[3318]: W0317 17:27:45.426414 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.427157 kubelet[3318]: E0317 17:27:45.426424 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.427157 kubelet[3318]: I0317 17:27:45.426437 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f6c1365d-cb18-415b-8a89-1e9f3710a559-varrun\") pod \"csi-node-driver-zckwm\" (UID: \"f6c1365d-cb18-415b-8a89-1e9f3710a559\") " pod="calico-system/csi-node-driver-zckwm" Mar 17 17:27:45.427157 kubelet[3318]: E0317 17:27:45.427099 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.427157 kubelet[3318]: W0317 17:27:45.427113 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.427157 kubelet[3318]: E0317 17:27:45.427127 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.427157 kubelet[3318]: I0317 17:27:45.427144 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6c1365d-cb18-415b-8a89-1e9f3710a559-socket-dir\") pod \"csi-node-driver-zckwm\" (UID: \"f6c1365d-cb18-415b-8a89-1e9f3710a559\") " pod="calico-system/csi-node-driver-zckwm" Mar 17 17:27:45.427321 kubelet[3318]: E0317 17:27:45.427302 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.427321 kubelet[3318]: W0317 17:27:45.427311 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.427364 kubelet[3318]: E0317 17:27:45.427319 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.427364 kubelet[3318]: I0317 17:27:45.427333 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cpgl\" (UniqueName: \"kubernetes.io/projected/f6c1365d-cb18-415b-8a89-1e9f3710a559-kube-api-access-9cpgl\") pod \"csi-node-driver-zckwm\" (UID: \"f6c1365d-cb18-415b-8a89-1e9f3710a559\") " pod="calico-system/csi-node-driver-zckwm" Mar 17 17:27:45.428003 kubelet[3318]: E0317 17:27:45.427650 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.428003 kubelet[3318]: W0317 17:27:45.427663 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.428003 kubelet[3318]: E0317 17:27:45.427749 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.428003 kubelet[3318]: I0317 17:27:45.427768 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6c1365d-cb18-415b-8a89-1e9f3710a559-kubelet-dir\") pod \"csi-node-driver-zckwm\" (UID: \"f6c1365d-cb18-415b-8a89-1e9f3710a559\") " pod="calico-system/csi-node-driver-zckwm" Mar 17 17:27:45.428115 kubelet[3318]: E0317 17:27:45.428031 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.428115 kubelet[3318]: W0317 17:27:45.428042 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.428115 kubelet[3318]: E0317 17:27:45.428059 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.428567 kubelet[3318]: E0317 17:27:45.428281 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.428567 kubelet[3318]: W0317 17:27:45.428296 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.428567 kubelet[3318]: E0317 17:27:45.428342 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.428656 kubelet[3318]: E0317 17:27:45.428605 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.428656 kubelet[3318]: W0317 17:27:45.428614 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.428656 kubelet[3318]: E0317 17:27:45.428631 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.429093 kubelet[3318]: E0317 17:27:45.428883 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.429093 kubelet[3318]: W0317 17:27:45.428897 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.429093 kubelet[3318]: E0317 17:27:45.429002 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.429595 kubelet[3318]: E0317 17:27:45.429572 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.429595 kubelet[3318]: W0317 17:27:45.429588 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.429686 kubelet[3318]: E0317 17:27:45.429668 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.429852 kubelet[3318]: E0317 17:27:45.429835 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.429852 kubelet[3318]: W0317 17:27:45.429848 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.429941 kubelet[3318]: E0317 17:27:45.429925 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.430092 kubelet[3318]: E0317 17:27:45.430076 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.430092 kubelet[3318]: W0317 17:27:45.430087 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.430178 kubelet[3318]: E0317 17:27:45.430161 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.430265 kubelet[3318]: E0317 17:27:45.430249 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.430265 kubelet[3318]: W0317 17:27:45.430262 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.430320 kubelet[3318]: E0317 17:27:45.430272 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.430485 kubelet[3318]: E0317 17:27:45.430468 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.430485 kubelet[3318]: W0317 17:27:45.430481 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.430571 kubelet[3318]: E0317 17:27:45.430490 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.430655 kubelet[3318]: E0317 17:27:45.430638 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.430655 kubelet[3318]: W0317 17:27:45.430652 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.430720 kubelet[3318]: E0317 17:27:45.430661 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.529428 kubelet[3318]: E0317 17:27:45.528693 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.529428 kubelet[3318]: W0317 17:27:45.528720 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.529428 kubelet[3318]: E0317 17:27:45.528738 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.529428 kubelet[3318]: E0317 17:27:45.529004 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.529428 kubelet[3318]: W0317 17:27:45.529014 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.529428 kubelet[3318]: E0317 17:27:45.529031 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.529428 kubelet[3318]: E0317 17:27:45.529266 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.529428 kubelet[3318]: W0317 17:27:45.529275 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.529428 kubelet[3318]: E0317 17:27:45.529291 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.529994 kubelet[3318]: E0317 17:27:45.529980 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.530175 kubelet[3318]: W0317 17:27:45.530032 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.530175 kubelet[3318]: E0317 17:27:45.530116 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.531229 kubelet[3318]: E0317 17:27:45.531045 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.531229 kubelet[3318]: W0317 17:27:45.531066 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.531229 kubelet[3318]: E0317 17:27:45.531097 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.531423 kubelet[3318]: E0317 17:27:45.531303 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.531423 kubelet[3318]: W0317 17:27:45.531314 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.531423 kubelet[3318]: E0317 17:27:45.531339 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.531583 kubelet[3318]: E0317 17:27:45.531474 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.531772 kubelet[3318]: W0317 17:27:45.531505 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.531772 kubelet[3318]: E0317 17:27:45.531650 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.532127 kubelet[3318]: E0317 17:27:45.531818 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.532127 kubelet[3318]: W0317 17:27:45.531829 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.532127 kubelet[3318]: E0317 17:27:45.531855 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.532127 kubelet[3318]: E0317 17:27:45.531997 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.532127 kubelet[3318]: W0317 17:27:45.532005 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.532127 kubelet[3318]: E0317 17:27:45.532021 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.532453 kubelet[3318]: E0317 17:27:45.532237 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.532453 kubelet[3318]: W0317 17:27:45.532250 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.532453 kubelet[3318]: E0317 17:27:45.532267 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.532897 kubelet[3318]: E0317 17:27:45.532660 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.532897 kubelet[3318]: W0317 17:27:45.532673 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.532897 kubelet[3318]: E0317 17:27:45.532694 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.533205 kubelet[3318]: E0317 17:27:45.533074 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.533205 kubelet[3318]: W0317 17:27:45.533091 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.533479 kubelet[3318]: E0317 17:27:45.533421 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.533739 kubelet[3318]: E0317 17:27:45.533647 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.533739 kubelet[3318]: W0317 17:27:45.533659 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.534225 kubelet[3318]: E0317 17:27:45.534128 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.534225 kubelet[3318]: W0317 17:27:45.534146 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.534644 kubelet[3318]: E0317 17:27:45.534524 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.534644 kubelet[3318]: W0317 17:27:45.534537 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.534766 kubelet[3318]: E0317 17:27:45.534662 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.534766 kubelet[3318]: E0317 17:27:45.534689 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.534766 kubelet[3318]: E0317 17:27:45.534726 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.535130 kubelet[3318]: E0317 17:27:45.534929 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.535130 kubelet[3318]: W0317 17:27:45.534991 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.535441 kubelet[3318]: E0317 17:27:45.535254 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.535631 kubelet[3318]: E0317 17:27:45.535537 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.535631 kubelet[3318]: W0317 17:27:45.535552 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.535631 kubelet[3318]: E0317 17:27:45.535567 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.535838 kubelet[3318]: E0317 17:27:45.535783 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.535945 kubelet[3318]: W0317 17:27:45.535795 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.535945 kubelet[3318]: E0317 17:27:45.535928 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.536237 kubelet[3318]: E0317 17:27:45.536177 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.536237 kubelet[3318]: W0317 17:27:45.536192 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.536237 kubelet[3318]: E0317 17:27:45.536221 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.536883 kubelet[3318]: E0317 17:27:45.536673 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.536883 kubelet[3318]: W0317 17:27:45.536689 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.536883 kubelet[3318]: E0317 17:27:45.536720 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.537625 kubelet[3318]: E0317 17:27:45.537148 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.537625 kubelet[3318]: W0317 17:27:45.537164 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.537625 kubelet[3318]: E0317 17:27:45.537181 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.538816 kubelet[3318]: E0317 17:27:45.538771 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.538894 kubelet[3318]: W0317 17:27:45.538826 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.538894 kubelet[3318]: E0317 17:27:45.538853 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.539178 kubelet[3318]: E0317 17:27:45.539070 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.539178 kubelet[3318]: W0317 17:27:45.539085 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.539178 kubelet[3318]: E0317 17:27:45.539130 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.539867 kubelet[3318]: E0317 17:27:45.539419 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.539867 kubelet[3318]: W0317 17:27:45.539437 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.539867 kubelet[3318]: E0317 17:27:45.539454 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.539867 kubelet[3318]: E0317 17:27:45.539676 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.539867 kubelet[3318]: W0317 17:27:45.539687 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.539867 kubelet[3318]: E0317 17:27:45.539699 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:45.540888 containerd[1745]: time="2025-03-17T17:27:45.540758681Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:27:45.540977 containerd[1745]: time="2025-03-17T17:27:45.540919161Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:27:45.541522 containerd[1745]: time="2025-03-17T17:27:45.541456203Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:45.541727 containerd[1745]: time="2025-03-17T17:27:45.541687283Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:27:45.558991 systemd[1]: Started cri-containerd-95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446.scope - libcontainer container 95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446. Mar 17 17:27:45.578524 containerd[1745]: time="2025-03-17T17:27:45.578483562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x9tvk,Uid:031e49b1-a116-452c-85f0-1d6e8a4d91c8,Namespace:calico-system,Attempt:0,} returns sandbox id \"95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446\"" Mar 17 17:27:45.635585 kubelet[3318]: E0317 17:27:45.635550 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.635585 kubelet[3318]: W0317 17:27:45.635573 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.635585 kubelet[3318]: E0317 17:27:45.635591 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:45.660432 kubelet[3318]: E0317 17:27:45.660401 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:45.660432 kubelet[3318]: W0317 17:27:45.660425 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:45.660588 kubelet[3318]: E0317 17:27:45.660446 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:47.040810 kubelet[3318]: E0317 17:27:47.040746 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:27:48.819648 containerd[1745]: time="2025-03-17T17:27:48.819303612Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:48.879500 containerd[1745]: time="2025-03-17T17:27:48.879285941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 17 17:27:48.931060 containerd[1745]: time="2025-03-17T17:27:48.930987611Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:48.974066 containerd[1745]: time="2025-03-17T17:27:48.973997503Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:48.974829 containerd[1745]: time="2025-03-17T17:27:48.974430864Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 3.751446064s" Mar 17 17:27:48.974829 containerd[1745]: time="2025-03-17T17:27:48.974468344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 17 17:27:48.975962 containerd[1745]: time="2025-03-17T17:27:48.975480626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 17 17:27:48.991491 containerd[1745]: time="2025-03-17T17:27:48.991350980Z" level=info msg="CreateContainer within sandbox \"5a10922a4a4baf8cb6bb63fb09b1f34fd4b04271aa5ae8112ab881065a2295ab\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 17:27:49.040955 kubelet[3318]: E0317 17:27:49.040865 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:27:49.373977 containerd[1745]: time="2025-03-17T17:27:49.373920438Z" level=info msg="CreateContainer within sandbox \"5a10922a4a4baf8cb6bb63fb09b1f34fd4b04271aa5ae8112ab881065a2295ab\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"df17e8ebc3a2eb23c54f4512ae320414a63285c54dba23c03c38f2e256d5882e\"" Mar 17 17:27:49.374496 containerd[1745]: time="2025-03-17T17:27:49.374476879Z" level=info 
msg="StartContainer for \"df17e8ebc3a2eb23c54f4512ae320414a63285c54dba23c03c38f2e256d5882e\"" Mar 17 17:27:49.401978 systemd[1]: Started cri-containerd-df17e8ebc3a2eb23c54f4512ae320414a63285c54dba23c03c38f2e256d5882e.scope - libcontainer container df17e8ebc3a2eb23c54f4512ae320414a63285c54dba23c03c38f2e256d5882e. Mar 17 17:27:49.477344 containerd[1745]: time="2025-03-17T17:27:49.477300179Z" level=info msg="StartContainer for \"df17e8ebc3a2eb23c54f4512ae320414a63285c54dba23c03c38f2e256d5882e\" returns successfully" Mar 17 17:27:50.134338 kubelet[3318]: I0317 17:27:50.133581 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6fb66c74c9-kr8vp" podStartSLOduration=2.380770755 podStartE2EDuration="6.133565662s" podCreationTimestamp="2025-03-17 17:27:44 +0000 UTC" firstStartedPulling="2025-03-17 17:27:45.222547839 +0000 UTC m=+15.295926671" lastFinishedPulling="2025-03-17 17:27:48.975342746 +0000 UTC m=+19.048721578" observedRunningTime="2025-03-17 17:27:50.133232141 +0000 UTC m=+20.206610973" watchObservedRunningTime="2025-03-17 17:27:50.133565662 +0000 UTC m=+20.206944494" Mar 17 17:27:50.146208 kubelet[3318]: E0317 17:27:50.146049 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.146208 kubelet[3318]: W0317 17:27:50.146077 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.146208 kubelet[3318]: E0317 17:27:50.146096 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.147199 kubelet[3318]: E0317 17:27:50.146766 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.147199 kubelet[3318]: W0317 17:27:50.146782 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.147199 kubelet[3318]: E0317 17:27:50.147134 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.148482 kubelet[3318]: E0317 17:27:50.147506 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.148482 kubelet[3318]: W0317 17:27:50.147518 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.148482 kubelet[3318]: E0317 17:27:50.147530 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.148482 kubelet[3318]: E0317 17:27:50.147938 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.148482 kubelet[3318]: W0317 17:27:50.147950 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.148482 kubelet[3318]: E0317 17:27:50.147962 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.149434 kubelet[3318]: E0317 17:27:50.149374 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.149878 kubelet[3318]: W0317 17:27:50.149598 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.149878 kubelet[3318]: E0317 17:27:50.149618 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.150832 kubelet[3318]: E0317 17:27:50.150088 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.150832 kubelet[3318]: W0317 17:27:50.150101 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.150832 kubelet[3318]: E0317 17:27:50.150113 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.150832 kubelet[3318]: E0317 17:27:50.150400 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.150832 kubelet[3318]: W0317 17:27:50.150411 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.150832 kubelet[3318]: E0317 17:27:50.150424 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.151149 kubelet[3318]: E0317 17:27:50.151124 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.151209 kubelet[3318]: W0317 17:27:50.151148 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.151209 kubelet[3318]: E0317 17:27:50.151163 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.151480 kubelet[3318]: E0317 17:27:50.151384 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.151480 kubelet[3318]: W0317 17:27:50.151401 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.151480 kubelet[3318]: E0317 17:27:50.151414 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.151616 kubelet[3318]: E0317 17:27:50.151597 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.151616 kubelet[3318]: W0317 17:27:50.151611 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.151664 kubelet[3318]: E0317 17:27:50.151620 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.151926 kubelet[3318]: E0317 17:27:50.151907 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.151926 kubelet[3318]: W0317 17:27:50.151922 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.152010 kubelet[3318]: E0317 17:27:50.151932 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.152121 kubelet[3318]: E0317 17:27:50.152101 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.152121 kubelet[3318]: W0317 17:27:50.152116 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.152170 kubelet[3318]: E0317 17:27:50.152125 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.152366 kubelet[3318]: E0317 17:27:50.152337 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.152366 kubelet[3318]: W0317 17:27:50.152353 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.152441 kubelet[3318]: E0317 17:27:50.152362 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.152625 kubelet[3318]: E0317 17:27:50.152600 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.152625 kubelet[3318]: W0317 17:27:50.152614 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.152625 kubelet[3318]: E0317 17:27:50.152623 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.152877 kubelet[3318]: E0317 17:27:50.152815 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.152877 kubelet[3318]: W0317 17:27:50.152828 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.152877 kubelet[3318]: E0317 17:27:50.152839 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.166296 kubelet[3318]: E0317 17:27:50.166269 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.166296 kubelet[3318]: W0317 17:27:50.166289 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.166485 kubelet[3318]: E0317 17:27:50.166309 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.166554 kubelet[3318]: E0317 17:27:50.166496 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.166554 kubelet[3318]: W0317 17:27:50.166504 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.166554 kubelet[3318]: E0317 17:27:50.166518 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.166729 kubelet[3318]: E0317 17:27:50.166704 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.166729 kubelet[3318]: W0317 17:27:50.166717 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.166729 kubelet[3318]: E0317 17:27:50.166728 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.166927 kubelet[3318]: E0317 17:27:50.166912 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.166927 kubelet[3318]: W0317 17:27:50.166925 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.167494 kubelet[3318]: E0317 17:27:50.166941 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.167494 kubelet[3318]: E0317 17:27:50.167083 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.167494 kubelet[3318]: W0317 17:27:50.167091 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.167494 kubelet[3318]: E0317 17:27:50.167104 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.167494 kubelet[3318]: E0317 17:27:50.167213 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.167494 kubelet[3318]: W0317 17:27:50.167220 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.167494 kubelet[3318]: E0317 17:27:50.167227 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.167494 kubelet[3318]: E0317 17:27:50.167367 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.167494 kubelet[3318]: W0317 17:27:50.167373 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.167494 kubelet[3318]: E0317 17:27:50.167381 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.168244 kubelet[3318]: E0317 17:27:50.168165 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.168244 kubelet[3318]: W0317 17:27:50.168187 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.168244 kubelet[3318]: E0317 17:27:50.168206 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.168810 kubelet[3318]: E0317 17:27:50.168758 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.168810 kubelet[3318]: W0317 17:27:50.168773 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.169001 kubelet[3318]: E0317 17:27:50.168794 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.169278 kubelet[3318]: E0317 17:27:50.169174 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.169278 kubelet[3318]: W0317 17:27:50.169185 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.169278 kubelet[3318]: E0317 17:27:50.169204 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.170013 kubelet[3318]: E0317 17:27:50.169913 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.170013 kubelet[3318]: W0317 17:27:50.169926 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.170013 kubelet[3318]: E0317 17:27:50.169938 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.170175 kubelet[3318]: E0317 17:27:50.170162 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.170401 kubelet[3318]: W0317 17:27:50.170271 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.170401 kubelet[3318]: E0317 17:27:50.170289 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.170593 kubelet[3318]: E0317 17:27:50.170577 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.170662 kubelet[3318]: W0317 17:27:50.170650 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.170834 kubelet[3318]: E0317 17:27:50.170711 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.171121 kubelet[3318]: E0317 17:27:50.171106 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.171300 kubelet[3318]: W0317 17:27:50.171202 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.171300 kubelet[3318]: E0317 17:27:50.171220 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.171522 kubelet[3318]: E0317 17:27:50.171506 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.171655 kubelet[3318]: W0317 17:27:50.171575 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.171655 kubelet[3318]: E0317 17:27:50.171591 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.171943 kubelet[3318]: E0317 17:27:50.171849 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.171943 kubelet[3318]: W0317 17:27:50.171861 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.171943 kubelet[3318]: E0317 17:27:50.171872 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:50.172231 kubelet[3318]: E0317 17:27:50.172217 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.172593 kubelet[3318]: W0317 17:27:50.172321 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.172593 kubelet[3318]: E0317 17:27:50.172346 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:50.172721 kubelet[3318]: E0317 17:27:50.172707 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:50.172785 kubelet[3318]: W0317 17:27:50.172773 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:50.172860 kubelet[3318]: E0317 17:27:50.172849 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.040175 kubelet[3318]: E0317 17:27:51.040115 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:27:51.120453 kubelet[3318]: I0317 17:27:51.120418 3318 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:27:51.159421 kubelet[3318]: E0317 17:27:51.159319 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.159421 kubelet[3318]: W0317 17:27:51.159343 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.159421 kubelet[3318]: E0317 17:27:51.159363 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.159998 kubelet[3318]: E0317 17:27:51.159577 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.159998 kubelet[3318]: W0317 17:27:51.159586 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.159998 kubelet[3318]: E0317 17:27:51.159597 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.159998 kubelet[3318]: E0317 17:27:51.159751 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.159998 kubelet[3318]: W0317 17:27:51.159759 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.159998 kubelet[3318]: E0317 17:27:51.159767 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.159998 kubelet[3318]: E0317 17:27:51.159961 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.159998 kubelet[3318]: W0317 17:27:51.159981 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.159998 kubelet[3318]: E0317 17:27:51.159990 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.160338 kubelet[3318]: E0317 17:27:51.160180 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.160338 kubelet[3318]: W0317 17:27:51.160189 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.160338 kubelet[3318]: E0317 17:27:51.160199 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.160435 kubelet[3318]: E0317 17:27:51.160352 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.160435 kubelet[3318]: W0317 17:27:51.160359 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.160435 kubelet[3318]: E0317 17:27:51.160367 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.160522 kubelet[3318]: E0317 17:27:51.160506 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.160522 kubelet[3318]: W0317 17:27:51.160513 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.160522 kubelet[3318]: E0317 17:27:51.160521 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.160680 kubelet[3318]: E0317 17:27:51.160666 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.160680 kubelet[3318]: W0317 17:27:51.160678 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.160733 kubelet[3318]: E0317 17:27:51.160686 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.160877 kubelet[3318]: E0317 17:27:51.160859 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.160877 kubelet[3318]: W0317 17:27:51.160875 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.160942 kubelet[3318]: E0317 17:27:51.160896 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.161044 kubelet[3318]: E0317 17:27:51.161031 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.161077 kubelet[3318]: W0317 17:27:51.161051 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.161077 kubelet[3318]: E0317 17:27:51.161060 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.161215 kubelet[3318]: E0317 17:27:51.161193 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.161215 kubelet[3318]: W0317 17:27:51.161213 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.161279 kubelet[3318]: E0317 17:27:51.161222 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.161374 kubelet[3318]: E0317 17:27:51.161361 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.161374 kubelet[3318]: W0317 17:27:51.161373 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.161427 kubelet[3318]: E0317 17:27:51.161380 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.161558 kubelet[3318]: E0317 17:27:51.161533 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.161558 kubelet[3318]: W0317 17:27:51.161550 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.161616 kubelet[3318]: E0317 17:27:51.161560 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.161712 kubelet[3318]: E0317 17:27:51.161701 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.161712 kubelet[3318]: W0317 17:27:51.161711 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.161771 kubelet[3318]: E0317 17:27:51.161719 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.161903 kubelet[3318]: E0317 17:27:51.161890 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.161903 kubelet[3318]: W0317 17:27:51.161901 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.161959 kubelet[3318]: E0317 17:27:51.161910 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.172700 kubelet[3318]: E0317 17:27:51.172665 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.172700 kubelet[3318]: W0317 17:27:51.172691 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.173091 kubelet[3318]: E0317 17:27:51.172719 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.173298 kubelet[3318]: E0317 17:27:51.173279 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.173298 kubelet[3318]: W0317 17:27:51.173295 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.173472 kubelet[3318]: E0317 17:27:51.173377 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.173623 kubelet[3318]: E0317 17:27:51.173604 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.173623 kubelet[3318]: W0317 17:27:51.173618 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.173790 kubelet[3318]: E0317 17:27:51.173633 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.173856 kubelet[3318]: E0317 17:27:51.173833 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.173856 kubelet[3318]: W0317 17:27:51.173841 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.173901 kubelet[3318]: E0317 17:27:51.173858 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.175791 kubelet[3318]: E0317 17:27:51.174175 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.175791 kubelet[3318]: W0317 17:27:51.174196 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.175791 kubelet[3318]: E0317 17:27:51.174219 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.176047 kubelet[3318]: E0317 17:27:51.175926 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.176359 kubelet[3318]: W0317 17:27:51.176103 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.176359 kubelet[3318]: E0317 17:27:51.176129 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.176922 kubelet[3318]: E0317 17:27:51.176873 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.177150 kubelet[3318]: W0317 17:27:51.176999 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.177150 kubelet[3318]: E0317 17:27:51.177075 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.177463 kubelet[3318]: E0317 17:27:51.177287 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.177463 kubelet[3318]: W0317 17:27:51.177307 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.177463 kubelet[3318]: E0317 17:27:51.177326 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.177779 kubelet[3318]: E0317 17:27:51.177758 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.177779 kubelet[3318]: W0317 17:27:51.177774 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.177873 kubelet[3318]: E0317 17:27:51.177788 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.178202 kubelet[3318]: E0317 17:27:51.178177 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.178309 kubelet[3318]: W0317 17:27:51.178204 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.178309 kubelet[3318]: E0317 17:27:51.178243 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.178550 kubelet[3318]: E0317 17:27:51.178363 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.178550 kubelet[3318]: W0317 17:27:51.178371 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.178550 kubelet[3318]: E0317 17:27:51.178445 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.178550 kubelet[3318]: E0317 17:27:51.178549 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.178654 kubelet[3318]: W0317 17:27:51.178558 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.178912 kubelet[3318]: E0317 17:27:51.178717 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.178912 kubelet[3318]: E0317 17:27:51.178786 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.178912 kubelet[3318]: W0317 17:27:51.178796 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.178912 kubelet[3318]: E0317 17:27:51.178846 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.179428 kubelet[3318]: E0317 17:27:51.179404 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.179428 kubelet[3318]: W0317 17:27:51.179425 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.179507 kubelet[3318]: E0317 17:27:51.179443 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.179828 kubelet[3318]: E0317 17:27:51.179794 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.179828 kubelet[3318]: W0317 17:27:51.179826 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.179899 kubelet[3318]: E0317 17:27:51.179848 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.180186 kubelet[3318]: E0317 17:27:51.180164 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.180186 kubelet[3318]: W0317 17:27:51.180180 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.180405 kubelet[3318]: E0317 17:27:51.180387 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.181065 kubelet[3318]: E0317 17:27:51.181042 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.181065 kubelet[3318]: W0317 17:27:51.181060 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.181065 kubelet[3318]: E0317 17:27:51.181075 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:27:51.181329 kubelet[3318]: E0317 17:27:51.181263 3318 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:27:51.181329 kubelet[3318]: W0317 17:27:51.181277 3318 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:27:51.181329 kubelet[3318]: E0317 17:27:51.181287 3318 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:27:51.419640 containerd[1745]: time="2025-03-17T17:27:51.418934169Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:51.464072 containerd[1745]: time="2025-03-17T17:27:51.463993665Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 17 17:27:51.469197 containerd[1745]: time="2025-03-17T17:27:51.469148516Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:52.127884 containerd[1745]: time="2025-03-17T17:27:52.127371523Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:27:52.128550 containerd[1745]: time="2025-03-17T17:27:52.128331965Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 3.152816579s" Mar 17 17:27:52.128550 containerd[1745]: time="2025-03-17T17:27:52.128365725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 17 17:27:52.132035 containerd[1745]: time="2025-03-17T17:27:52.131859693Z" level=info msg="CreateContainer within sandbox \"95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 17:27:53.040483 kubelet[3318]: E0317 17:27:53.040428 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:27:53.216369 waagent[1935]: 2025-03-17T17:27:53.216053Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 2] Mar 17 17:27:53.223210 waagent[1935]: 2025-03-17T17:27:53.223160Z INFO ExtHandler Mar 17 17:27:53.223319 waagent[1935]: 2025-03-17T17:27:53.223284Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: b511f3ee-1bb2-4854-8ecf-d60bc1a8ee3d eTag: 9100134080214235302 source: Fabric] Mar 17 17:27:53.223672 waagent[1935]: 2025-03-17T17:27:53.223631Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
Mar 17 17:27:53.224274 waagent[1935]: 2025-03-17T17:27:53.224226Z INFO ExtHandler Mar 17 17:27:53.224341 waagent[1935]: 2025-03-17T17:27:53.224311Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 2] Mar 17 17:27:53.295252 waagent[1935]: 2025-03-17T17:27:53.295131Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 17 17:27:53.380722 waagent[1935]: 2025-03-17T17:27:53.380615Z INFO ExtHandler Downloaded certificate {'thumbprint': 'CF76A22CCED062FF8C39A5D3BDACD242FDD34149', 'hasPrivateKey': True} Mar 17 17:27:53.381146 waagent[1935]: 2025-03-17T17:27:53.381102Z INFO ExtHandler Downloaded certificate {'thumbprint': '41518CF6F2B5C7CDB6D029B351552E19A8139D74', 'hasPrivateKey': False} Mar 17 17:27:53.381538 waagent[1935]: 2025-03-17T17:27:53.381490Z INFO ExtHandler Fetch goal state completed Mar 17 17:27:53.381892 waagent[1935]: 2025-03-17T17:27:53.381853Z INFO ExtHandler ExtHandler Mar 17 17:27:53.381966 waagent[1935]: 2025-03-17T17:27:53.381934Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_2 channel: WireServer source: Fabric activity: 7d722e79-b2aa-4857-8ab0-41711e861435 correlation ca5a42e8-6ea7-4019-b523-8d797eb5626b created: 2025-03-17T17:27:26.683228Z] Mar 17 17:27:53.382260 waagent[1935]: 2025-03-17T17:27:53.382221Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 17 17:27:53.382803 waagent[1935]: 2025-03-17T17:27:53.382760Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_2 0 ms] Mar 17 17:27:54.369787 containerd[1745]: time="2025-03-17T17:27:54.369723993Z" level=info msg="CreateContainer within sandbox \"95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0\"" Mar 17 17:27:54.371897 containerd[1745]: time="2025-03-17T17:27:54.370228194Z" level=info msg="StartContainer for \"2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0\"" Mar 17 17:27:54.398767 systemd[1]: run-containerd-runc-k8s.io-2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0-runc.m1sbi4.mount: Deactivated successfully. Mar 17 17:27:54.410047 systemd[1]: Started cri-containerd-2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0.scope - libcontainer container 2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0. Mar 17 17:27:54.444008 containerd[1745]: time="2025-03-17T17:27:54.443948051Z" level=info msg="StartContainer for \"2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0\" returns successfully" Mar 17 17:27:54.449413 systemd[1]: cri-containerd-2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0.scope: Deactivated successfully. Mar 17 17:27:54.470493 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0-rootfs.mount: Deactivated successfully. 
Mar 17 17:27:55.567111 kubelet[3318]: E0317 17:27:55.040009 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:27:56.524502 kubelet[3318]: I0317 17:27:56.524182 3318 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:27:57.040712 kubelet[3318]: E0317 17:27:57.040597 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:27:58.044852 containerd[1745]: time="2025-03-17T17:27:58.044761933Z" level=info msg="shim disconnected" id=2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0 namespace=k8s.io Mar 17 17:27:58.044852 containerd[1745]: time="2025-03-17T17:27:58.044844733Z" level=warning msg="cleaning up after shim disconnected" id=2757f770a3e420b71cbfe9867b789e88d115be1fe84a81fd2af1f72a2f8cf5e0 namespace=k8s.io Mar 17 17:27:58.044852 containerd[1745]: time="2025-03-17T17:27:58.044852893Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:27:58.056005 containerd[1745]: time="2025-03-17T17:27:58.055949754Z" level=warning msg="cleanup warnings time=\"2025-03-17T17:27:58Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Mar 17 17:27:58.138205 containerd[1745]: time="2025-03-17T17:27:58.138164306Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 17 17:27:59.040774 kubelet[3318]: E0317 17:27:59.040722 3318 
pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:01.041210 kubelet[3318]: E0317 17:28:01.041146 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:01.570149 containerd[1745]: time="2025-03-17T17:28:01.570094658Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:01.575658 containerd[1745]: time="2025-03-17T17:28:01.575596790Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 17 17:28:01.620700 containerd[1745]: time="2025-03-17T17:28:01.620633245Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:01.626713 containerd[1745]: time="2025-03-17T17:28:01.626637898Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:01.627954 containerd[1745]: time="2025-03-17T17:28:01.627408379Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 3.489035152s" Mar 17 17:28:01.627954 containerd[1745]: time="2025-03-17T17:28:01.627445819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 17 17:28:01.631004 containerd[1745]: time="2025-03-17T17:28:01.630856867Z" level=info msg="CreateContainer within sandbox \"95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 17:28:01.965776 containerd[1745]: time="2025-03-17T17:28:01.965648654Z" level=info msg="CreateContainer within sandbox \"95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59\"" Mar 17 17:28:01.966340 containerd[1745]: time="2025-03-17T17:28:01.966273815Z" level=info msg="StartContainer for \"7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59\"" Mar 17 17:28:01.995978 systemd[1]: Started cri-containerd-7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59.scope - libcontainer container 7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59. 
Mar 17 17:28:02.027044 containerd[1745]: time="2025-03-17T17:28:02.026993383Z" level=info msg="StartContainer for \"7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59\" returns successfully" Mar 17 17:28:03.040906 kubelet[3318]: E0317 17:28:03.040829 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:05.040369 kubelet[3318]: E0317 17:28:05.040300 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:07.040861 kubelet[3318]: E0317 17:28:07.040772 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:09.040893 kubelet[3318]: E0317 17:28:09.040838 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:11.040985 kubelet[3318]: E0317 17:28:11.040934 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
cni plugin not initialized" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:12.001379 containerd[1745]: time="2025-03-17T17:28:12.001320493Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 17:28:12.004058 systemd[1]: cri-containerd-7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59.scope: Deactivated successfully. Mar 17 17:28:12.023908 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59-rootfs.mount: Deactivated successfully. Mar 17 17:28:12.073124 kubelet[3318]: I0317 17:28:12.073088 3318 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Mar 17 17:28:12.154024 systemd[1]: Created slice kubepods-burstable-pod0186ea99_0088_4544_b92c_7659bf548a6e.slice - libcontainer container kubepods-burstable-pod0186ea99_0088_4544_b92c_7659bf548a6e.slice. 
Mar 17 17:28:14.175364 containerd[1745]: time="2025-03-17T17:28:14.174333098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:0,}" Mar 17 17:28:14.175597 kubelet[3318]: W0317 17:28:12.160397 3318 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4152.2.2-a-f9f073f8c6" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4152.2.2-a-f9f073f8c6' and this object Mar 17 17:28:14.175597 kubelet[3318]: E0317 17:28:12.161173 3318 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4152.2.2-a-f9f073f8c6\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4152.2.2-a-f9f073f8c6' and this object" logger="UnhandledError" Mar 17 17:28:14.175597 kubelet[3318]: W0317 17:28:12.160586 3318 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4152.2.2-a-f9f073f8c6" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4152.2.2-a-f9f073f8c6' and this object Mar 17 17:28:14.175597 kubelet[3318]: E0317 17:28:12.161204 3318 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4152.2.2-a-f9f073f8c6\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found 
between node 'ci-4152.2.2-a-f9f073f8c6' and this object" logger="UnhandledError" Mar 17 17:28:12.165568 systemd[1]: Created slice kubepods-burstable-pod6743fb2a_96da_4c19_b66f_02242ba2b410.slice - libcontainer container kubepods-burstable-pod6743fb2a_96da_4c19_b66f_02242ba2b410.slice. Mar 17 17:28:14.177321 kubelet[3318]: I0317 17:28:12.213259 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6743fb2a-96da-4c19-b66f-02242ba2b410-config-volume\") pod \"coredns-6f6b679f8f-d87wb\" (UID: \"6743fb2a-96da-4c19-b66f-02242ba2b410\") " pod="kube-system/coredns-6f6b679f8f-d87wb" Mar 17 17:28:14.177321 kubelet[3318]: I0317 17:28:12.213307 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e5b40ef-0b7d-4233-9b50-e552e3a1bd38-tigera-ca-bundle\") pod \"calico-kube-controllers-6fcc8c87fb-p2fc6\" (UID: \"1e5b40ef-0b7d-4233-9b50-e552e3a1bd38\") " pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:14.177321 kubelet[3318]: I0317 17:28:12.213327 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0186ea99-0088-4544-b92c-7659bf548a6e-config-volume\") pod \"coredns-6f6b679f8f-4tzzq\" (UID: \"0186ea99-0088-4544-b92c-7659bf548a6e\") " pod="kube-system/coredns-6f6b679f8f-4tzzq" Mar 17 17:28:14.177321 kubelet[3318]: I0317 17:28:12.213351 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgffr\" (UniqueName: \"kubernetes.io/projected/d007819c-c75a-48ce-80a6-dcf89e240e01-kube-api-access-lgffr\") pod \"calico-apiserver-b557bfbcb-krcfl\" (UID: \"d007819c-c75a-48ce-80a6-dcf89e240e01\") " pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:14.177321 kubelet[3318]: I0317 
17:28:12.213368 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2cxg\" (UniqueName: \"kubernetes.io/projected/6743fb2a-96da-4c19-b66f-02242ba2b410-kube-api-access-t2cxg\") pod \"coredns-6f6b679f8f-d87wb\" (UID: \"6743fb2a-96da-4c19-b66f-02242ba2b410\") " pod="kube-system/coredns-6f6b679f8f-d87wb" Mar 17 17:28:12.171169 systemd[1]: Created slice kubepods-besteffort-pod213ca9a6_a07c_4c72_a108_6e622fdd0452.slice - libcontainer container kubepods-besteffort-pod213ca9a6_a07c_4c72_a108_6e622fdd0452.slice. Mar 17 17:28:14.177890 kubelet[3318]: I0317 17:28:12.213384 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwzvr\" (UniqueName: \"kubernetes.io/projected/0186ea99-0088-4544-b92c-7659bf548a6e-kube-api-access-gwzvr\") pod \"coredns-6f6b679f8f-4tzzq\" (UID: \"0186ea99-0088-4544-b92c-7659bf548a6e\") " pod="kube-system/coredns-6f6b679f8f-4tzzq" Mar 17 17:28:14.177890 kubelet[3318]: I0317 17:28:12.213401 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhktv\" (UniqueName: \"kubernetes.io/projected/1e5b40ef-0b7d-4233-9b50-e552e3a1bd38-kube-api-access-xhktv\") pod \"calico-kube-controllers-6fcc8c87fb-p2fc6\" (UID: \"1e5b40ef-0b7d-4233-9b50-e552e3a1bd38\") " pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:14.177890 kubelet[3318]: I0317 17:28:12.213422 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/213ca9a6-a07c-4c72-a108-6e622fdd0452-calico-apiserver-certs\") pod \"calico-apiserver-b557bfbcb-wssz4\" (UID: \"213ca9a6-a07c-4c72-a108-6e622fdd0452\") " pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:14.177890 kubelet[3318]: I0317 17:28:12.213441 3318 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d007819c-c75a-48ce-80a6-dcf89e240e01-calico-apiserver-certs\") pod \"calico-apiserver-b557bfbcb-krcfl\" (UID: \"d007819c-c75a-48ce-80a6-dcf89e240e01\") " pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:14.177890 kubelet[3318]: I0317 17:28:12.213459 3318 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5t2k\" (UniqueName: \"kubernetes.io/projected/213ca9a6-a07c-4c72-a108-6e622fdd0452-kube-api-access-x5t2k\") pod \"calico-apiserver-b557bfbcb-wssz4\" (UID: \"213ca9a6-a07c-4c72-a108-6e622fdd0452\") " pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:12.176225 systemd[1]: Created slice kubepods-besteffort-pod1e5b40ef_0b7d_4233_9b50_e552e3a1bd38.slice - libcontainer container kubepods-besteffort-pod1e5b40ef_0b7d_4233_9b50_e552e3a1bd38.slice. Mar 17 17:28:14.178402 kubelet[3318]: E0317 17:28:13.315553 3318 secret.go:188] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Mar 17 17:28:14.178402 kubelet[3318]: E0317 17:28:13.315651 3318 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d007819c-c75a-48ce-80a6-dcf89e240e01-calico-apiserver-certs podName:d007819c-c75a-48ce-80a6-dcf89e240e01 nodeName:}" failed. No retries permitted until 2025-03-17 17:28:13.815620749 +0000 UTC m=+43.888999541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/d007819c-c75a-48ce-80a6-dcf89e240e01-calico-apiserver-certs") pod "calico-apiserver-b557bfbcb-krcfl" (UID: "d007819c-c75a-48ce-80a6-dcf89e240e01") : failed to sync secret cache: timed out waiting for the condition Mar 17 17:28:14.178402 kubelet[3318]: E0317 17:28:13.315847 3318 secret.go:188] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Mar 17 17:28:14.178402 kubelet[3318]: E0317 17:28:13.315894 3318 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/213ca9a6-a07c-4c72-a108-6e622fdd0452-calico-apiserver-certs podName:213ca9a6-a07c-4c72-a108-6e622fdd0452 nodeName:}" failed. No retries permitted until 2025-03-17 17:28:13.815882429 +0000 UTC m=+43.889261221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/213ca9a6-a07c-4c72-a108-6e622fdd0452-calico-apiserver-certs") pod "calico-apiserver-b557bfbcb-wssz4" (UID: "213ca9a6-a07c-4c72-a108-6e622fdd0452") : failed to sync secret cache: timed out waiting for the condition Mar 17 17:28:12.181152 systemd[1]: Created slice kubepods-besteffort-podd007819c_c75a_48ce_80a6_dcf89e240e01.slice - libcontainer container kubepods-besteffort-podd007819c_c75a_48ce_80a6_dcf89e240e01.slice. Mar 17 17:28:13.046510 systemd[1]: Created slice kubepods-besteffort-podf6c1365d_cb18_415b_8a89_1e9f3710a559.slice - libcontainer container kubepods-besteffort-podf6c1365d_cb18_415b_8a89_1e9f3710a559.slice. 
Mar 17 17:28:14.476136 containerd[1745]: time="2025-03-17T17:28:14.476008586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:0,}" Mar 17 17:28:14.477368 containerd[1745]: time="2025-03-17T17:28:14.477239508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:0,}" Mar 17 17:28:14.479275 containerd[1745]: time="2025-03-17T17:28:14.479124832Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:28:14.482095 containerd[1745]: time="2025-03-17T17:28:14.482056118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:28:14.497941 containerd[1745]: time="2025-03-17T17:28:14.497879870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:0,}" Mar 17 17:28:18.170347 containerd[1745]: time="2025-03-17T17:28:18.170237546Z" level=info msg="shim disconnected" id=7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59 namespace=k8s.io Mar 17 17:28:18.170347 containerd[1745]: time="2025-03-17T17:28:18.170305027Z" level=warning msg="cleaning up after shim disconnected" id=7232668e3fef886b8f75e8a82ae971634dac5c5ffa5fd58d1b92c48b8c48cb59 namespace=k8s.io Mar 17 17:28:18.170347 containerd[1745]: time="2025-03-17T17:28:18.170314027Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:28:18.657442 containerd[1745]: time="2025-03-17T17:28:18.657378589Z" level=error msg="Failed to destroy network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.657870 containerd[1745]: time="2025-03-17T17:28:18.657834590Z" level=error msg="encountered an error cleaning up failed sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.657938 containerd[1745]: time="2025-03-17T17:28:18.657908350Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.658912 kubelet[3318]: E0317 17:28:18.658278 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.658912 kubelet[3318]: E0317 17:28:18.658354 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4"
Mar 17 17:28:18.658912 kubelet[3318]: E0317 17:28:18.658373 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4"
Mar 17 17:28:18.659251 kubelet[3318]: E0317 17:28:18.658415 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" podUID="213ca9a6-a07c-4c72-a108-6e622fdd0452"
Mar 17 17:28:18.689890 containerd[1745]: time="2025-03-17T17:28:18.689793293Z" level=error msg="Failed to destroy network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.690762 containerd[1745]: time="2025-03-17T17:28:18.690733815Z" level=error msg="encountered an error cleaning up failed sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.690961 containerd[1745]: time="2025-03-17T17:28:18.690927695Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.691258 kubelet[3318]: E0317 17:28:18.691220 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.692478 kubelet[3318]: E0317 17:28:18.692016 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb"
Mar 17 17:28:18.692478 kubelet[3318]: E0317 17:28:18.692069 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb"
Mar 17 17:28:18.692478 kubelet[3318]: E0317 17:28:18.692416 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d87wb" podUID="6743fb2a-96da-4c19-b66f-02242ba2b410"
Mar 17 17:28:18.709355 containerd[1745]: time="2025-03-17T17:28:18.709292091Z" level=error msg="Failed to destroy network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.709891 containerd[1745]: time="2025-03-17T17:28:18.709689652Z" level=error msg="encountered an error cleaning up failed sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.709891 containerd[1745]: time="2025-03-17T17:28:18.709750452Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.710016 kubelet[3318]: E0317 17:28:18.709969 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.710096 kubelet[3318]: E0317 17:28:18.710021 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm"
Mar 17 17:28:18.710096 kubelet[3318]: E0317 17:28:18.710041 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm"
Mar 17 17:28:18.710172 kubelet[3318]: E0317 17:28:18.710085 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559"
Mar 17 17:28:18.734086 containerd[1745]: time="2025-03-17T17:28:18.733954020Z" level=error msg="Failed to destroy network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.734548 containerd[1745]: time="2025-03-17T17:28:18.734453901Z" level=error msg="encountered an error cleaning up failed sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.735480 containerd[1745]: time="2025-03-17T17:28:18.734531221Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.735865 kubelet[3318]: E0317 17:28:18.735836 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.736005 kubelet[3318]: E0317 17:28:18.735986 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6"
Mar 17 17:28:18.736093 kubelet[3318]: E0317 17:28:18.736075 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6"
Mar 17 17:28:18.736377 kubelet[3318]: E0317 17:28:18.736191 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" podUID="1e5b40ef-0b7d-4233-9b50-e552e3a1bd38"
Mar 17 17:28:18.742510 containerd[1745]: time="2025-03-17T17:28:18.742459077Z" level=error msg="Failed to destroy network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.742789 containerd[1745]: time="2025-03-17T17:28:18.742762837Z" level=error msg="encountered an error cleaning up failed sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.742877 containerd[1745]: time="2025-03-17T17:28:18.742843717Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.743078 kubelet[3318]: E0317 17:28:18.743043 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.743398 kubelet[3318]: E0317 17:28:18.743095 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq"
Mar 17 17:28:18.743398 kubelet[3318]: E0317 17:28:18.743114 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq"
Mar 17 17:28:18.743398 kubelet[3318]: E0317 17:28:18.743153 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-4tzzq" podUID="0186ea99-0088-4544-b92c-7659bf548a6e"
Mar 17 17:28:18.744664 containerd[1745]: time="2025-03-17T17:28:18.744536761Z" level=error msg="Failed to destroy network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.745113 containerd[1745]: time="2025-03-17T17:28:18.745087922Z" level=error msg="encountered an error cleaning up failed sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.745366 containerd[1745]: time="2025-03-17T17:28:18.745314322Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.745751 kubelet[3318]: E0317 17:28:18.745687 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:18.745962 kubelet[3318]: E0317 17:28:18.745834 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl"
Mar 17 17:28:18.745962 kubelet[3318]: E0317 17:28:18.745862 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl"
Mar 17 17:28:18.745962 kubelet[3318]: E0317 17:28:18.745915 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" podUID="d007819c-c75a-48ce-80a6-dcf89e240e01"
Mar 17 17:28:19.176423 kubelet[3318]: I0317 17:28:19.176389 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8"
Mar 17 17:28:19.177897 containerd[1745]: time="2025-03-17T17:28:19.177664893Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\""
Mar 17 17:28:19.178574 containerd[1745]: time="2025-03-17T17:28:19.178413095Z" level=info msg="Ensure that sandbox 76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8 in task-service has been cleanup successfully"
Mar 17 17:28:19.178764 containerd[1745]: time="2025-03-17T17:28:19.178663575Z" level=info msg="TearDown network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" successfully"
Mar 17 17:28:19.178764 containerd[1745]: time="2025-03-17T17:28:19.178681175Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" returns successfully"
Mar 17 17:28:19.179284 kubelet[3318]: I0317 17:28:19.178841 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455"
Mar 17 17:28:19.179822 containerd[1745]: time="2025-03-17T17:28:19.179437057Z" level=info msg="StopPodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\""
Mar 17 17:28:19.179822 containerd[1745]: time="2025-03-17T17:28:19.179574377Z" level=info msg="Ensure that sandbox d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455 in task-service has been cleanup successfully"
Mar 17 17:28:19.180051 containerd[1745]: time="2025-03-17T17:28:19.180028898Z" level=info msg="TearDown network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" successfully"
Mar 17 17:28:19.180218 containerd[1745]: time="2025-03-17T17:28:19.180107778Z" level=info msg="StopPodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" returns successfully"
Mar 17 17:28:19.180842 containerd[1745]: time="2025-03-17T17:28:19.180534139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:1,}"
Mar 17 17:28:19.180842 containerd[1745]: time="2025-03-17T17:28:19.180707779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:1,}"
Mar 17 17:28:19.181497 kubelet[3318]: I0317 17:28:19.181477 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3"
Mar 17 17:28:19.183177 kubelet[3318]: I0317 17:28:19.183157 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954"
Mar 17 17:28:19.183673 containerd[1745]: time="2025-03-17T17:28:19.183516745Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\""
Mar 17 17:28:19.184319 containerd[1745]: time="2025-03-17T17:28:19.183961026Z" level=info msg="Ensure that sandbox c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3 in task-service has been cleanup successfully"
Mar 17 17:28:19.184319 containerd[1745]: time="2025-03-17T17:28:19.183980346Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\""
Mar 17 17:28:19.184440 containerd[1745]: time="2025-03-17T17:28:19.184415347Z" level=info msg="Ensure that sandbox 8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954 in task-service has been cleanup successfully"
Mar 17 17:28:19.185049 containerd[1745]: time="2025-03-17T17:28:19.185015028Z" level=info msg="TearDown network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" successfully"
Mar 17 17:28:19.185049 containerd[1745]: time="2025-03-17T17:28:19.185042148Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" returns successfully"
Mar 17 17:28:19.185453 containerd[1745]: time="2025-03-17T17:28:19.185265828Z" level=info msg="TearDown network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" successfully"
Mar 17 17:28:19.185453 containerd[1745]: time="2025-03-17T17:28:19.185285908Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" returns successfully"
Mar 17 17:28:19.186528 containerd[1745]: time="2025-03-17T17:28:19.186213830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:1,}"
Mar 17 17:28:19.186528 containerd[1745]: time="2025-03-17T17:28:19.186454071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:1,}"
Mar 17 17:28:19.189745 containerd[1745]: time="2025-03-17T17:28:19.189707477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\""
Mar 17 17:28:19.190469 kubelet[3318]: I0317 17:28:19.189512 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb"
Mar 17 17:28:19.192194 containerd[1745]: time="2025-03-17T17:28:19.190689239Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\""
Mar 17 17:28:19.192194 containerd[1745]: time="2025-03-17T17:28:19.191117120Z" level=info msg="Ensure that sandbox ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb in task-service has been cleanup successfully"
Mar 17 17:28:19.192194 containerd[1745]: time="2025-03-17T17:28:19.191345000Z" level=info msg="TearDown network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" successfully"
Mar 17 17:28:19.192194 containerd[1745]: time="2025-03-17T17:28:19.191363840Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" returns successfully"
Mar 17 17:28:19.192412 containerd[1745]: time="2025-03-17T17:28:19.192325522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:1,}"
Mar 17 17:28:19.192930 kubelet[3318]: I0317 17:28:19.192899 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969"
Mar 17 17:28:19.194250 containerd[1745]: time="2025-03-17T17:28:19.194083006Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\""
Mar 17 17:28:19.194778 containerd[1745]: time="2025-03-17T17:28:19.194745887Z" level=info msg="Ensure that sandbox 58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969 in task-service has been cleanup successfully"
Mar 17 17:28:19.196374 containerd[1745]: time="2025-03-17T17:28:19.196324290Z" level=info msg="TearDown network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" successfully"
Mar 17 17:28:19.196374 containerd[1745]: time="2025-03-17T17:28:19.196353250Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" returns successfully"
Mar 17 17:28:19.197160 containerd[1745]: time="2025-03-17T17:28:19.196947811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:1,}"
Mar 17 17:28:19.486508 systemd[1]: run-netns-cni\x2d15090c90\x2d4cd4\x2d9cb7\x2d69d4\x2d789d183c3a09.mount: Deactivated successfully.
Mar 17 17:28:19.486596 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455-shm.mount: Deactivated successfully.
Mar 17 17:28:19.486646 systemd[1]: run-netns-cni\x2dcd07d25a\x2d730a\x2dbdf2\x2d5b2c\x2d7b9421b81908.mount: Deactivated successfully.
Mar 17 17:28:19.486693 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954-shm.mount: Deactivated successfully.
Mar 17 17:28:19.486742 systemd[1]: run-netns-cni\x2d11acac8b\x2de8ac\x2d84aa\x2d5b09\x2d7dd234976a25.mount: Deactivated successfully.
Mar 17 17:28:19.486783 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3-shm.mount: Deactivated successfully.
Mar 17 17:28:19.613570 containerd[1745]: time="2025-03-17T17:28:19.613134070Z" level=error msg="Failed to destroy network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.614671 containerd[1745]: time="2025-03-17T17:28:19.614599633Z" level=error msg="encountered an error cleaning up failed sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.615342 containerd[1745]: time="2025-03-17T17:28:19.615291035Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.615963 kubelet[3318]: E0317 17:28:19.615888 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.616146 kubelet[3318]: E0317 17:28:19.616018 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq"
Mar 17 17:28:19.616146 kubelet[3318]: E0317 17:28:19.616045 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq"
Mar 17 17:28:19.616146 kubelet[3318]: E0317 17:28:19.616110 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-4tzzq" podUID="0186ea99-0088-4544-b92c-7659bf548a6e"
Mar 17 17:28:19.647029 containerd[1745]: time="2025-03-17T17:28:19.646921417Z" level=error msg="Failed to destroy network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.648351 containerd[1745]: time="2025-03-17T17:28:19.647638738Z" level=error msg="encountered an error cleaning up failed sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.648351 containerd[1745]: time="2025-03-17T17:28:19.647718539Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.648483 kubelet[3318]: E0317 17:28:19.648011 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.648483 kubelet[3318]: E0317 17:28:19.648077 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb"
Mar 17 17:28:19.648483 kubelet[3318]: E0317 17:28:19.648097 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb"
Mar 17 17:28:19.648633 kubelet[3318]: E0317 17:28:19.648156 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d87wb" podUID="6743fb2a-96da-4c19-b66f-02242ba2b410"
Mar 17 17:28:19.669148 containerd[1745]: time="2025-03-17T17:28:19.669102981Z" level=error msg="Failed to destroy network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.671258 containerd[1745]: time="2025-03-17T17:28:19.669677862Z" level=error msg="Failed to destroy network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.671258 containerd[1745]: time="2025-03-17T17:28:19.671179025Z" level=error msg="encountered an error cleaning up failed sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.671258 containerd[1745]: time="2025-03-17T17:28:19.671225105Z" level=error msg="encountered an error cleaning up failed sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:28:19.671424 containerd[1745]: time="2025-03-17T17:28:19.671281385Z" level=error msg="RunPodSandbox for
&PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.671424 containerd[1745]: time="2025-03-17T17:28:19.671242105Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.671694 kubelet[3318]: E0317 17:28:19.671524 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.671694 kubelet[3318]: E0317 17:28:19.671552 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.671694 kubelet[3318]: E0317 17:28:19.671579 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:19.671694 kubelet[3318]: E0317 17:28:19.671590 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:19.672098 kubelet[3318]: E0317 17:28:19.671602 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:19.672098 kubelet[3318]: E0317 17:28:19.671613 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:19.672098 kubelet[3318]: E0317 17:28:19.671647 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" podUID="213ca9a6-a07c-4c72-a108-6e622fdd0452" Mar 17 17:28:19.672298 kubelet[3318]: E0317 17:28:19.671646 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:19.688165 containerd[1745]: time="2025-03-17T17:28:19.688123698Z" level=error msg="Failed to destroy network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.688793 containerd[1745]: time="2025-03-17T17:28:19.688688939Z" level=error msg="encountered 
an error cleaning up failed sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.688793 containerd[1745]: time="2025-03-17T17:28:19.688747099Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.689123 kubelet[3318]: E0317 17:28:19.689093 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.689534 kubelet[3318]: E0317 17:28:19.689239 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:19.689534 kubelet[3318]: E0317 17:28:19.689262 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:19.689534 kubelet[3318]: E0317 17:28:19.689313 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" podUID="d007819c-c75a-48ce-80a6-dcf89e240e01" Mar 17 17:28:19.692954 containerd[1745]: time="2025-03-17T17:28:19.692899868Z" level=error msg="Failed to destroy network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.693686 containerd[1745]: time="2025-03-17T17:28:19.693638389Z" level=error msg="encountered an error cleaning up failed sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 17 17:28:19.695310 containerd[1745]: time="2025-03-17T17:28:19.693703869Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.695403 kubelet[3318]: E0317 17:28:19.693910 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:19.695403 kubelet[3318]: E0317 17:28:19.693957 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:19.695403 kubelet[3318]: E0317 17:28:19.693975 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:19.695528 kubelet[3318]: E0317 17:28:19.694009 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" podUID="1e5b40ef-0b7d-4233-9b50-e552e3a1bd38" Mar 17 17:28:20.201073 kubelet[3318]: I0317 17:28:20.200466 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07" Mar 17 17:28:20.201686 containerd[1745]: time="2025-03-17T17:28:20.201259228Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\"" Mar 17 17:28:20.204640 containerd[1745]: time="2025-03-17T17:28:20.201611189Z" level=info msg="Ensure that sandbox ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07 in task-service has been cleanup successfully" Mar 17 17:28:20.204640 containerd[1745]: time="2025-03-17T17:28:20.202224550Z" level=info msg="TearDown network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" successfully" Mar 17 17:28:20.204640 containerd[1745]: time="2025-03-17T17:28:20.202246190Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" returns successfully" Mar 17 17:28:20.205248 containerd[1745]: 
time="2025-03-17T17:28:20.204985356Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\"" Mar 17 17:28:20.205248 containerd[1745]: time="2025-03-17T17:28:20.205072916Z" level=info msg="TearDown network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" successfully" Mar 17 17:28:20.205248 containerd[1745]: time="2025-03-17T17:28:20.205082996Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" returns successfully" Mar 17 17:28:20.206091 kubelet[3318]: I0317 17:28:20.205997 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18" Mar 17 17:28:20.207574 containerd[1745]: time="2025-03-17T17:28:20.206959039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:2,}" Mar 17 17:28:20.207574 containerd[1745]: time="2025-03-17T17:28:20.207314240Z" level=info msg="StopPodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\"" Mar 17 17:28:20.207574 containerd[1745]: time="2025-03-17T17:28:20.207448640Z" level=info msg="Ensure that sandbox 57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18 in task-service has been cleanup successfully" Mar 17 17:28:20.208329 containerd[1745]: time="2025-03-17T17:28:20.208309042Z" level=info msg="TearDown network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" successfully" Mar 17 17:28:20.208434 containerd[1745]: time="2025-03-17T17:28:20.208419482Z" level=info msg="StopPodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" returns successfully" Mar 17 17:28:20.208911 containerd[1745]: time="2025-03-17T17:28:20.208890963Z" level=info msg="StopPodSandbox for 
\"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\"" Mar 17 17:28:20.210191 containerd[1745]: time="2025-03-17T17:28:20.210021645Z" level=info msg="TearDown network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" successfully" Mar 17 17:28:20.210191 containerd[1745]: time="2025-03-17T17:28:20.210042606Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.212061810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:2,}" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.213117252Z" level=info msg="StopPodSandbox for \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\"" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.213407412Z" level=info msg="Ensure that sandbox 2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad in task-service has been cleanup successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.213742333Z" level=info msg="TearDown network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\" successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.213808653Z" level=info msg="StopPodSandbox for \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.214275094Z" level=info msg="StopPodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\"" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.214391054Z" level=info msg="TearDown network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" successfully" Mar 17 17:28:20.224298 containerd[1745]: 
time="2025-03-17T17:28:20.214404374Z" level=info msg="StopPodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.215081655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:2,}" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.215381016Z" level=info msg="StopPodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\"" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.215508016Z" level=info msg="Ensure that sandbox e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5 in task-service has been cleanup successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.215724337Z" level=info msg="TearDown network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.215737377Z" level=info msg="StopPodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.216118778Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\"" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.216229218Z" level=info msg="TearDown network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.216240218Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.217210180Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:2,}" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.217862621Z" level=info msg="StopPodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\"" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.218102141Z" level=info msg="Ensure that sandbox 1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020 in task-service has been cleanup successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.218498942Z" level=info msg="TearDown network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.218512822Z" level=info msg="StopPodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.218694783Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\"" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.218784663Z" level=info msg="TearDown network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.218795383Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.219161343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.219828425Z" level=info msg="StopPodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\"" 
Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.219980945Z" level=info msg="Ensure that sandbox fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d in task-service has been cleanup successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.220099585Z" level=info msg="TearDown network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.220109985Z" level=info msg="StopPodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.220380226Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\"" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.220448506Z" level=info msg="TearDown network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.220457226Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" returns successfully" Mar 17 17:28:20.224298 containerd[1745]: time="2025-03-17T17:28:20.220729227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:28:20.225101 kubelet[3318]: I0317 17:28:20.212611 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad" Mar 17 17:28:20.225101 kubelet[3318]: I0317 17:28:20.214408 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5" Mar 17 17:28:20.225101 kubelet[3318]: I0317 17:28:20.217402 3318 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020" Mar 17 17:28:20.225101 kubelet[3318]: I0317 17:28:20.219387 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d" Mar 17 17:28:20.480845 systemd[1]: run-netns-cni\x2d4603c6b3\x2de63b\x2d91fe\x2d1459\x2d2461d8ee8a9b.mount: Deactivated successfully. Mar 17 17:28:20.481323 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020-shm.mount: Deactivated successfully. Mar 17 17:28:20.481508 systemd[1]: run-netns-cni\x2d56bdc36f\x2dcd6a\x2d44a3\x2d1f6e\x2dabb3b4c1de73.mount: Deactivated successfully. Mar 17 17:28:20.481916 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5-shm.mount: Deactivated successfully. Mar 17 17:28:20.482068 systemd[1]: run-netns-cni\x2de1d0077d\x2da47a\x2de2f1\x2ddff7\x2dfe694e69c85e.mount: Deactivated successfully. Mar 17 17:28:20.482419 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07-shm.mount: Deactivated successfully. Mar 17 17:28:20.482474 systemd[1]: run-netns-cni\x2dc0c31d10\x2db863\x2df0cb\x2d4a31\x2d3d353e623e66.mount: Deactivated successfully. Mar 17 17:28:20.482521 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d-shm.mount: Deactivated successfully. Mar 17 17:28:20.482567 systemd[1]: run-netns-cni\x2d323f9ed5\x2da798\x2d5699\x2d7a30\x2d67dd35bf669d.mount: Deactivated successfully. Mar 17 17:28:20.482611 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18-shm.mount: Deactivated successfully. 
Mar 17 17:28:20.482656 systemd[1]: run-netns-cni\x2d53feb356\x2d3174\x2dfcf0\x2d9de6\x2d058b7dc95281.mount: Deactivated successfully. Mar 17 17:28:20.482697 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad-shm.mount: Deactivated successfully. Mar 17 17:28:22.023838 containerd[1745]: time="2025-03-17T17:28:22.022062893Z" level=error msg="Failed to destroy network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.024640 containerd[1745]: time="2025-03-17T17:28:22.024453337Z" level=error msg="encountered an error cleaning up failed sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.024640 containerd[1745]: time="2025-03-17T17:28:22.024542658Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.024972 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806-shm.mount: Deactivated successfully. 
Mar 17 17:28:22.025666 kubelet[3318]: E0317 17:28:22.025534 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.025666 kubelet[3318]: E0317 17:28:22.025623 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:22.025666 kubelet[3318]: E0317 17:28:22.025644 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:22.026056 kubelet[3318]: E0317 17:28:22.025691 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:22.226737 kubelet[3318]: I0317 17:28:22.226348 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806" Mar 17 17:28:22.227111 containerd[1745]: time="2025-03-17T17:28:22.227075896Z" level=info msg="StopPodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\"" Mar 17 17:28:22.227305 containerd[1745]: time="2025-03-17T17:28:22.227272537Z" level=info msg="Ensure that sandbox 5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806 in task-service has been cleanup successfully" Mar 17 17:28:22.228045 containerd[1745]: time="2025-03-17T17:28:22.227987498Z" level=info msg="TearDown network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" successfully" Mar 17 17:28:22.228045 containerd[1745]: time="2025-03-17T17:28:22.228012178Z" level=info msg="StopPodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" returns successfully" Mar 17 17:28:22.228472 containerd[1745]: time="2025-03-17T17:28:22.228299739Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\"" Mar 17 17:28:22.228472 containerd[1745]: time="2025-03-17T17:28:22.228384659Z" level=info msg="TearDown network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" successfully" Mar 17 17:28:22.228472 containerd[1745]: time="2025-03-17T17:28:22.228397619Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" returns successfully" Mar 17 17:28:22.229223 containerd[1745]: time="2025-03-17T17:28:22.228683699Z" level=info msg="StopPodSandbox for 
\"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\"" Mar 17 17:28:22.229223 containerd[1745]: time="2025-03-17T17:28:22.228853300Z" level=info msg="TearDown network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" successfully" Mar 17 17:28:22.229223 containerd[1745]: time="2025-03-17T17:28:22.228877980Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" returns successfully" Mar 17 17:28:22.229956 containerd[1745]: time="2025-03-17T17:28:22.229610301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:3,}" Mar 17 17:28:22.264981 containerd[1745]: time="2025-03-17T17:28:22.264853331Z" level=error msg="Failed to destroy network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.265506 containerd[1745]: time="2025-03-17T17:28:22.265342772Z" level=error msg="encountered an error cleaning up failed sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.265506 containerd[1745]: time="2025-03-17T17:28:22.265409212Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.266089 kubelet[3318]: E0317 17:28:22.265663 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.266089 kubelet[3318]: E0317 17:28:22.265780 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb" Mar 17 17:28:22.266089 kubelet[3318]: E0317 17:28:22.265817 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb" Mar 17 17:28:22.266231 kubelet[3318]: E0317 17:28:22.265864 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d87wb" podUID="6743fb2a-96da-4c19-b66f-02242ba2b410" Mar 17 17:28:22.374693 containerd[1745]: time="2025-03-17T17:28:22.374643307Z" level=error msg="Failed to destroy network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.375073 containerd[1745]: time="2025-03-17T17:28:22.375013507Z" level=error msg="encountered an error cleaning up failed sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.375143 containerd[1745]: time="2025-03-17T17:28:22.375108748Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.375381 kubelet[3318]: E0317 17:28:22.375342 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.375447 kubelet[3318]: E0317 17:28:22.375403 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq" Mar 17 17:28:22.375447 kubelet[3318]: E0317 17:28:22.375423 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq" Mar 17 17:28:22.375649 kubelet[3318]: E0317 17:28:22.375470 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-4tzzq" podUID="0186ea99-0088-4544-b92c-7659bf548a6e" Mar 17 17:28:22.463645 containerd[1745]: 
time="2025-03-17T17:28:22.463581562Z" level=error msg="Failed to destroy network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.464237 containerd[1745]: time="2025-03-17T17:28:22.464075563Z" level=error msg="encountered an error cleaning up failed sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.464237 containerd[1745]: time="2025-03-17T17:28:22.464141363Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.464426 kubelet[3318]: E0317 17:28:22.464378 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.464488 kubelet[3318]: E0317 17:28:22.464432 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:22.464488 kubelet[3318]: E0317 17:28:22.464450 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:22.464546 kubelet[3318]: E0317 17:28:22.464491 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" podUID="1e5b40ef-0b7d-4233-9b50-e552e3a1bd38" Mar 17 17:28:22.510328 containerd[1745]: time="2025-03-17T17:28:22.510251214Z" level=error msg="Failed to destroy network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.510684 containerd[1745]: time="2025-03-17T17:28:22.510641414Z" level=error msg="encountered an error cleaning up failed sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.510768 containerd[1745]: time="2025-03-17T17:28:22.510731775Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.511118 kubelet[3318]: E0317 17:28:22.511077 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.511167 kubelet[3318]: E0317 17:28:22.511139 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:22.511167 kubelet[3318]: E0317 17:28:22.511160 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:22.511218 kubelet[3318]: E0317 17:28:22.511198 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" podUID="d007819c-c75a-48ce-80a6-dcf89e240e01" Mar 17 17:28:22.539209 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979-shm.mount: Deactivated successfully. Mar 17 17:28:22.539301 systemd[1]: run-netns-cni\x2dc5c6121a\x2d82e3\x2d6802\x2d420b\x2de1a1cedd6c5d.mount: Deactivated successfully. 
Mar 17 17:28:22.711008 containerd[1745]: time="2025-03-17T17:28:22.710855129Z" level=error msg="Failed to destroy network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.711223 containerd[1745]: time="2025-03-17T17:28:22.711190329Z" level=error msg="encountered an error cleaning up failed sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.711285 containerd[1745]: time="2025-03-17T17:28:22.711252489Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.714467 kubelet[3318]: E0317 17:28:22.711472 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:22.714467 kubelet[3318]: E0317 17:28:22.711526 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:22.714467 kubelet[3318]: E0317 17:28:22.711546 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:22.714185 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226-shm.mount: Deactivated successfully. 
Mar 17 17:28:22.714669 kubelet[3318]: E0317 17:28:22.711589 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" podUID="213ca9a6-a07c-4c72-a108-6e622fdd0452" Mar 17 17:28:23.210049 containerd[1745]: time="2025-03-17T17:28:23.209889031Z" level=error msg="Failed to destroy network for sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:23.215540 containerd[1745]: time="2025-03-17T17:28:23.210505392Z" level=error msg="encountered an error cleaning up failed sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:23.215540 containerd[1745]: time="2025-03-17T17:28:23.210605752Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox 
\"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:23.212519 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26-shm.mount: Deactivated successfully. Mar 17 17:28:23.215762 kubelet[3318]: E0317 17:28:23.210891 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:23.215762 kubelet[3318]: E0317 17:28:23.210953 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:23.215762 kubelet[3318]: E0317 17:28:23.210974 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:23.216284 kubelet[3318]: E0317 17:28:23.211011 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:23.231411 kubelet[3318]: I0317 17:28:23.231236 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156" Mar 17 17:28:23.234170 containerd[1745]: time="2025-03-17T17:28:23.234077639Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\"" Mar 17 17:28:23.234581 containerd[1745]: time="2025-03-17T17:28:23.234451279Z" level=info msg="Ensure that sandbox c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156 in task-service has been cleanup successfully" Mar 17 17:28:23.235470 containerd[1745]: time="2025-03-17T17:28:23.235315841Z" level=info msg="TearDown network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" successfully" Mar 17 17:28:23.235470 containerd[1745]: time="2025-03-17T17:28:23.235354521Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" returns successfully" Mar 17 17:28:23.237578 containerd[1745]: time="2025-03-17T17:28:23.236911804Z" level=info msg="StopPodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\"" Mar 17 17:28:23.237578 containerd[1745]: time="2025-03-17T17:28:23.237200205Z" level=info msg="TearDown network 
for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" successfully" Mar 17 17:28:23.237578 containerd[1745]: time="2025-03-17T17:28:23.237217805Z" level=info msg="StopPodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" returns successfully" Mar 17 17:28:23.238769 containerd[1745]: time="2025-03-17T17:28:23.238740928Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\"" Mar 17 17:28:23.239103 containerd[1745]: time="2025-03-17T17:28:23.238871328Z" level=info msg="TearDown network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" successfully" Mar 17 17:28:23.239103 containerd[1745]: time="2025-03-17T17:28:23.238882368Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" returns successfully" Mar 17 17:28:23.239467 kubelet[3318]: I0317 17:28:23.239163 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979" Mar 17 17:28:23.239648 containerd[1745]: time="2025-03-17T17:28:23.239616329Z" level=info msg="StopPodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\"" Mar 17 17:28:23.239794 containerd[1745]: time="2025-03-17T17:28:23.239765330Z" level=info msg="Ensure that sandbox 8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979 in task-service has been cleanup successfully" Mar 17 17:28:23.241650 containerd[1745]: time="2025-03-17T17:28:23.240949892Z" level=info msg="TearDown network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" successfully" Mar 17 17:28:23.241650 containerd[1745]: time="2025-03-17T17:28:23.240974252Z" level=info msg="StopPodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" returns successfully" Mar 17 17:28:23.241650 containerd[1745]: 
time="2025-03-17T17:28:23.241164533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:28:23.253200 containerd[1745]: time="2025-03-17T17:28:23.253163356Z" level=info msg="StopPodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\"" Mar 17 17:28:23.253623 containerd[1745]: time="2025-03-17T17:28:23.253422557Z" level=info msg="TearDown network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" successfully" Mar 17 17:28:23.253623 containerd[1745]: time="2025-03-17T17:28:23.253439317Z" level=info msg="StopPodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" returns successfully" Mar 17 17:28:23.254989 containerd[1745]: time="2025-03-17T17:28:23.254062958Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\"" Mar 17 17:28:23.254989 containerd[1745]: time="2025-03-17T17:28:23.254196878Z" level=info msg="TearDown network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" successfully" Mar 17 17:28:23.254989 containerd[1745]: time="2025-03-17T17:28:23.254209758Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" returns successfully" Mar 17 17:28:23.258929 containerd[1745]: time="2025-03-17T17:28:23.258897087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:3,}" Mar 17 17:28:23.259571 kubelet[3318]: I0317 17:28:23.259539 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90" Mar 17 17:28:23.260983 containerd[1745]: time="2025-03-17T17:28:23.260690891Z" level=info msg="StopPodSandbox for 
\"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\"" Mar 17 17:28:23.261673 containerd[1745]: time="2025-03-17T17:28:23.261066412Z" level=info msg="Ensure that sandbox 3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90 in task-service has been cleanup successfully" Mar 17 17:28:23.261673 containerd[1745]: time="2025-03-17T17:28:23.261494933Z" level=info msg="TearDown network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" successfully" Mar 17 17:28:23.261673 containerd[1745]: time="2025-03-17T17:28:23.261554693Z" level=info msg="StopPodSandbox for \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" returns successfully" Mar 17 17:28:23.262495 containerd[1745]: time="2025-03-17T17:28:23.262120014Z" level=info msg="StopPodSandbox for \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\"" Mar 17 17:28:23.262495 containerd[1745]: time="2025-03-17T17:28:23.262311054Z" level=info msg="TearDown network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\" successfully" Mar 17 17:28:23.262495 containerd[1745]: time="2025-03-17T17:28:23.262324294Z" level=info msg="StopPodSandbox for \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\" returns successfully" Mar 17 17:28:23.263713 containerd[1745]: time="2025-03-17T17:28:23.263103096Z" level=info msg="StopPodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\"" Mar 17 17:28:23.263713 containerd[1745]: time="2025-03-17T17:28:23.263180416Z" level=info msg="TearDown network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" successfully" Mar 17 17:28:23.263713 containerd[1745]: time="2025-03-17T17:28:23.263189736Z" level=info msg="StopPodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" returns successfully" Mar 17 17:28:23.264345 containerd[1745]: time="2025-03-17T17:28:23.264315138Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:3,}" Mar 17 17:28:23.265964 kubelet[3318]: I0317 17:28:23.265384 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226" Mar 17 17:28:23.267672 containerd[1745]: time="2025-03-17T17:28:23.267169784Z" level=info msg="StopPodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\"" Mar 17 17:28:23.268527 containerd[1745]: time="2025-03-17T17:28:23.268418506Z" level=info msg="Ensure that sandbox d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226 in task-service has been cleanup successfully" Mar 17 17:28:23.269403 containerd[1745]: time="2025-03-17T17:28:23.269105188Z" level=info msg="TearDown network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" successfully" Mar 17 17:28:23.269403 containerd[1745]: time="2025-03-17T17:28:23.269129388Z" level=info msg="StopPodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" returns successfully" Mar 17 17:28:23.271497 containerd[1745]: time="2025-03-17T17:28:23.271317672Z" level=info msg="StopPodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\"" Mar 17 17:28:23.271497 containerd[1745]: time="2025-03-17T17:28:23.271447672Z" level=info msg="TearDown network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" successfully" Mar 17 17:28:23.271497 containerd[1745]: time="2025-03-17T17:28:23.271458992Z" level=info msg="StopPodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" returns successfully" Mar 17 17:28:23.278842 containerd[1745]: time="2025-03-17T17:28:23.277673004Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\"" Mar 17 17:28:23.278842 
containerd[1745]: time="2025-03-17T17:28:23.278699246Z" level=info msg="TearDown network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" successfully" Mar 17 17:28:23.278842 containerd[1745]: time="2025-03-17T17:28:23.278713806Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" returns successfully" Mar 17 17:28:23.281787 containerd[1745]: time="2025-03-17T17:28:23.281264251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:28:23.284072 kubelet[3318]: I0317 17:28:23.284016 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26" Mar 17 17:28:23.285962 containerd[1745]: time="2025-03-17T17:28:23.285691620Z" level=info msg="StopPodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\"" Mar 17 17:28:23.287637 containerd[1745]: time="2025-03-17T17:28:23.287162983Z" level=info msg="Ensure that sandbox ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26 in task-service has been cleanup successfully" Mar 17 17:28:23.289977 containerd[1745]: time="2025-03-17T17:28:23.289790548Z" level=info msg="TearDown network for sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" successfully" Mar 17 17:28:23.289977 containerd[1745]: time="2025-03-17T17:28:23.289922508Z" level=info msg="StopPodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" returns successfully" Mar 17 17:28:23.290915 containerd[1745]: time="2025-03-17T17:28:23.290873230Z" level=info msg="StopPodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\"" Mar 17 17:28:23.291088 containerd[1745]: time="2025-03-17T17:28:23.290957831Z" level=info msg="TearDown network for sandbox 
\"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" successfully" Mar 17 17:28:23.291088 containerd[1745]: time="2025-03-17T17:28:23.290967151Z" level=info msg="StopPodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" returns successfully" Mar 17 17:28:23.292089 containerd[1745]: time="2025-03-17T17:28:23.291768552Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\"" Mar 17 17:28:23.292089 containerd[1745]: time="2025-03-17T17:28:23.291935952Z" level=info msg="TearDown network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" successfully" Mar 17 17:28:23.292089 containerd[1745]: time="2025-03-17T17:28:23.291970153Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" returns successfully" Mar 17 17:28:23.293288 containerd[1745]: time="2025-03-17T17:28:23.293144915Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\"" Mar 17 17:28:23.293408 containerd[1745]: time="2025-03-17T17:28:23.293390075Z" level=info msg="TearDown network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" successfully" Mar 17 17:28:23.293481 containerd[1745]: time="2025-03-17T17:28:23.293469035Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" returns successfully" Mar 17 17:28:23.293831 kubelet[3318]: I0317 17:28:23.293731 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93" Mar 17 17:28:23.294734 containerd[1745]: time="2025-03-17T17:28:23.294142597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:4,}" Mar 17 17:28:23.294734 containerd[1745]: 
time="2025-03-17T17:28:23.294719438Z" level=info msg="StopPodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\"" Mar 17 17:28:23.295113 containerd[1745]: time="2025-03-17T17:28:23.294886598Z" level=info msg="Ensure that sandbox 898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93 in task-service has been cleanup successfully" Mar 17 17:28:23.295737 containerd[1745]: time="2025-03-17T17:28:23.295693040Z" level=info msg="TearDown network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" successfully" Mar 17 17:28:23.296017 containerd[1745]: time="2025-03-17T17:28:23.295864520Z" level=info msg="StopPodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" returns successfully" Mar 17 17:28:23.296512 containerd[1745]: time="2025-03-17T17:28:23.296472921Z" level=info msg="StopPodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\"" Mar 17 17:28:23.297060 containerd[1745]: time="2025-03-17T17:28:23.296781842Z" level=info msg="TearDown network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" successfully" Mar 17 17:28:23.297562 containerd[1745]: time="2025-03-17T17:28:23.296797242Z" level=info msg="StopPodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" returns successfully" Mar 17 17:28:23.298223 containerd[1745]: time="2025-03-17T17:28:23.298110045Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\"" Mar 17 17:28:23.298323 containerd[1745]: time="2025-03-17T17:28:23.298308885Z" level=info msg="TearDown network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" successfully" Mar 17 17:28:23.298399 containerd[1745]: time="2025-03-17T17:28:23.298386725Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" returns successfully" Mar 17 17:28:23.299545 
containerd[1745]: time="2025-03-17T17:28:23.299275807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:3,}" Mar 17 17:28:23.538274 systemd[1]: run-netns-cni\x2d0ce44572\x2d0a39\x2d544d\x2d1b06\x2d3c7f23e98415.mount: Deactivated successfully. Mar 17 17:28:23.538931 systemd[1]: run-netns-cni\x2d0115b9b6\x2d8a83\x2d2f5a\x2d9783\x2d9af1fd9c712b.mount: Deactivated successfully. Mar 17 17:28:23.539006 systemd[1]: run-netns-cni\x2d879e55f2\x2de304\x2dd6d6\x2d6a0b\x2d2315081c2a0a.mount: Deactivated successfully. Mar 17 17:28:23.539050 systemd[1]: run-netns-cni\x2d508c1c87\x2db735\x2da2e2\x2d8830\x2d3da778ee1727.mount: Deactivated successfully. Mar 17 17:28:23.539094 systemd[1]: run-netns-cni\x2d5887893f\x2d9848\x2d9c08\x2d8128\x2d28a5acd8e4ae.mount: Deactivated successfully. Mar 17 17:28:23.539135 systemd[1]: run-netns-cni\x2d763c318e\x2ddeb8\x2d17c6\x2d17be\x2df79e3f0c01c0.mount: Deactivated successfully. 
Mar 17 17:28:30.034546 containerd[1745]: time="2025-03-17T17:28:30.034366340Z" level=info msg="StopPodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\"" Mar 17 17:28:30.034546 containerd[1745]: time="2025-03-17T17:28:30.034475860Z" level=info msg="TearDown network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" successfully" Mar 17 17:28:30.034546 containerd[1745]: time="2025-03-17T17:28:30.034486060Z" level=info msg="StopPodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" returns successfully" Mar 17 17:28:30.035513 containerd[1745]: time="2025-03-17T17:28:30.035266781Z" level=info msg="RemovePodSandbox for \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\"" Mar 17 17:28:30.035513 containerd[1745]: time="2025-03-17T17:28:30.035304181Z" level=info msg="Forcibly stopping sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\"" Mar 17 17:28:30.035513 containerd[1745]: time="2025-03-17T17:28:30.035360861Z" level=info msg="TearDown network for sandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" successfully" Mar 17 17:28:31.591370 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1709678162.mount: Deactivated successfully. Mar 17 17:28:36.270631 containerd[1745]: time="2025-03-17T17:28:36.270573646Z" level=error msg="Failed to destroy network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.272964 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf-shm.mount: Deactivated successfully. 
Mar 17 17:28:36.274396 kubelet[3318]: E0317 17:28:36.273519 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.274396 kubelet[3318]: E0317 17:28:36.273580 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:36.274396 kubelet[3318]: E0317 17:28:36.273599 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:36.276088 containerd[1745]: time="2025-03-17T17:28:36.273162251Z" level=error msg="encountered an error cleaning up failed sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.276088 containerd[1745]: time="2025-03-17T17:28:36.273260451Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.276190 kubelet[3318]: E0317 17:28:36.273638 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" podUID="d007819c-c75a-48ce-80a6-dcf89e240e01" Mar 17 17:28:36.323622 kubelet[3318]: I0317 17:28:36.323591 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf" Mar 17 17:28:36.325140 containerd[1745]: time="2025-03-17T17:28:36.324782669Z" level=info msg="StopPodSandbox for \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\"" Mar 17 17:28:36.327715 containerd[1745]: time="2025-03-17T17:28:36.327115033Z" level=info msg="Ensure that sandbox 3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf in task-service has been cleanup successfully" Mar 17 17:28:36.328390 containerd[1745]: 
time="2025-03-17T17:28:36.328351236Z" level=info msg="TearDown network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" successfully" Mar 17 17:28:36.328528 containerd[1745]: time="2025-03-17T17:28:36.328511756Z" level=info msg="StopPodSandbox for \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" returns successfully" Mar 17 17:28:36.329945 containerd[1745]: time="2025-03-17T17:28:36.329914719Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\"" Mar 17 17:28:36.330229 containerd[1745]: time="2025-03-17T17:28:36.330190759Z" level=info msg="TearDown network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" successfully" Mar 17 17:28:36.330229 containerd[1745]: time="2025-03-17T17:28:36.330206519Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" returns successfully" Mar 17 17:28:36.331043 containerd[1745]: time="2025-03-17T17:28:36.330976281Z" level=info msg="StopPodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\"" Mar 17 17:28:36.331260 containerd[1745]: time="2025-03-17T17:28:36.331178241Z" level=info msg="TearDown network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" successfully" Mar 17 17:28:36.331260 containerd[1745]: time="2025-03-17T17:28:36.331204681Z" level=info msg="StopPodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" returns successfully" Mar 17 17:28:36.331727 containerd[1745]: time="2025-03-17T17:28:36.331665842Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\"" Mar 17 17:28:36.331884 containerd[1745]: time="2025-03-17T17:28:36.331857162Z" level=info msg="TearDown network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" successfully" Mar 17 17:28:36.332258 containerd[1745]: 
time="2025-03-17T17:28:36.331959842Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" returns successfully" Mar 17 17:28:36.332727 containerd[1745]: time="2025-03-17T17:28:36.332462243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:28:36.365443 containerd[1745]: time="2025-03-17T17:28:36.365398706Z" level=error msg="Failed to destroy network for sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.365900 containerd[1745]: time="2025-03-17T17:28:36.365871586Z" level=error msg="encountered an error cleaning up failed sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.366069 containerd[1745]: time="2025-03-17T17:28:36.366019227Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.366543 kubelet[3318]: E0317 17:28:36.366389 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.366543 kubelet[3318]: E0317 17:28:36.366466 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb" Mar 17 17:28:36.366543 kubelet[3318]: E0317 17:28:36.366485 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb" Mar 17 17:28:36.366890 kubelet[3318]: E0317 17:28:36.366706 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d87wb" 
podUID="6743fb2a-96da-4c19-b66f-02242ba2b410" Mar 17 17:28:36.379393 containerd[1745]: time="2025-03-17T17:28:36.379190572Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:28:36.379393 containerd[1745]: time="2025-03-17T17:28:36.379260932Z" level=info msg="RemovePodSandbox \"d4cc4d775291c4ac1b6c2cd3678da25bd275df824a74e9053d14befae2801455\" returns successfully" Mar 17 17:28:36.379854 containerd[1745]: time="2025-03-17T17:28:36.379821893Z" level=info msg="StopPodSandbox for \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\"" Mar 17 17:28:36.379947 containerd[1745]: time="2025-03-17T17:28:36.379926933Z" level=info msg="TearDown network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\" successfully" Mar 17 17:28:36.379947 containerd[1745]: time="2025-03-17T17:28:36.379941973Z" level=info msg="StopPodSandbox for \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\" returns successfully" Mar 17 17:28:36.380580 containerd[1745]: time="2025-03-17T17:28:36.380432094Z" level=info msg="RemovePodSandbox for \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\"" Mar 17 17:28:36.380580 containerd[1745]: time="2025-03-17T17:28:36.380480294Z" level=info msg="Forcibly stopping sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\"" Mar 17 17:28:36.380878 containerd[1745]: time="2025-03-17T17:28:36.380708975Z" level=info msg="TearDown network for sandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\" successfully" Mar 17 17:28:36.722975 containerd[1745]: time="2025-03-17T17:28:36.722869941Z" level=error msg="Failed to destroy network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\"" error="plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.723547 containerd[1745]: time="2025-03-17T17:28:36.723390382Z" level=error msg="encountered an error cleaning up failed sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.723547 containerd[1745]: time="2025-03-17T17:28:36.723463582Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.724249 kubelet[3318]: E0317 17:28:36.723852 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.724249 kubelet[3318]: E0317 17:28:36.723915 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq" Mar 17 17:28:36.724249 kubelet[3318]: E0317 17:28:36.723933 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq" Mar 17 17:28:36.724468 kubelet[3318]: E0317 17:28:36.723969 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-4tzzq" podUID="0186ea99-0088-4544-b92c-7659bf548a6e" Mar 17 17:28:36.772369 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269-shm.mount: Deactivated successfully. Mar 17 17:28:36.772457 systemd[1]: run-netns-cni\x2deed6fedd\x2dc4ae\x2d5ac1\x2d8345\x2d21ac877add24.mount: Deactivated successfully. 
Mar 17 17:28:36.862517 containerd[1745]: time="2025-03-17T17:28:36.862338165Z" level=error msg="Failed to destroy network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.864705 containerd[1745]: time="2025-03-17T17:28:36.863252887Z" level=error msg="encountered an error cleaning up failed sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.864705 containerd[1745]: time="2025-03-17T17:28:36.863339727Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.864758 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f-shm.mount: Deactivated successfully. 
Mar 17 17:28:36.865289 kubelet[3318]: E0317 17:28:36.864991 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.865289 kubelet[3318]: E0317 17:28:36.865072 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:36.865289 kubelet[3318]: E0317 17:28:36.865093 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:36.865420 kubelet[3318]: E0317 17:28:36.865159 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" podUID="213ca9a6-a07c-4c72-a108-6e622fdd0452" Mar 17 17:28:36.911599 containerd[1745]: time="2025-03-17T17:28:36.911541458Z" level=error msg="Failed to destroy network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.911943 containerd[1745]: time="2025-03-17T17:28:36.911910139Z" level=error msg="encountered an error cleaning up failed sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.912004 containerd[1745]: time="2025-03-17T17:28:36.911975899Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.912227 kubelet[3318]: E0317 17:28:36.912184 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.912298 kubelet[3318]: E0317 17:28:36.912248 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:36.912298 kubelet[3318]: E0317 17:28:36.912271 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:36.912343 kubelet[3318]: E0317 17:28:36.912308 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:36.915059 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054-shm.mount: Deactivated successfully. Mar 17 17:28:36.962668 containerd[1745]: time="2025-03-17T17:28:36.962599434Z" level=error msg="Failed to destroy network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.964656 containerd[1745]: time="2025-03-17T17:28:36.962976155Z" level=error msg="encountered an error cleaning up failed sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.964656 containerd[1745]: time="2025-03-17T17:28:36.963046915Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:36.966144 kubelet[3318]: E0317 17:28:36.963303 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 
17:28:36.966144 kubelet[3318]: E0317 17:28:36.963358 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:36.966144 kubelet[3318]: E0317 17:28:36.963376 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:36.965643 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78-shm.mount: Deactivated successfully. 
Mar 17 17:28:36.966357 kubelet[3318]: E0317 17:28:36.963411 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" podUID="1e5b40ef-0b7d-4233-9b50-e552e3a1bd38" Mar 17 17:28:37.072768 containerd[1745]: time="2025-03-17T17:28:37.072513002Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:37.072768 containerd[1745]: time="2025-03-17T17:28:37.072711643Z" level=info msg="RemovePodSandbox \"2d09c2134690d4f4dfe16c672976aec848271f27adab36a57d1fb643779d1bad\" returns successfully" Mar 17 17:28:37.073711 containerd[1745]: time="2025-03-17T17:28:37.073441964Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\"" Mar 17 17:28:37.073711 containerd[1745]: time="2025-03-17T17:28:37.073549564Z" level=info msg="TearDown network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" successfully" Mar 17 17:28:37.073711 containerd[1745]: time="2025-03-17T17:28:37.073560444Z" level=info msg="StopPodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" returns successfully" Mar 17 17:28:37.074033 containerd[1745]: time="2025-03-17T17:28:37.073902005Z" level=info msg="RemovePodSandbox for \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\"" Mar 17 17:28:37.074033 containerd[1745]: time="2025-03-17T17:28:37.073936045Z" level=info msg="Forcibly stopping sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\"" Mar 17 17:28:37.074090 containerd[1745]: time="2025-03-17T17:28:37.074030405Z" level=info msg="TearDown network for sandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" successfully" Mar 17 17:28:37.118725 containerd[1745]: time="2025-03-17T17:28:37.118627089Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:37.221619 containerd[1745]: time="2025-03-17T17:28:37.221551204Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 17 17:28:37.325959 containerd[1745]: time="2025-03-17T17:28:37.325672561Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID 
\"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:28:37.325959 containerd[1745]: time="2025-03-17T17:28:37.325759001Z" level=info msg="RemovePodSandbox \"58d3ba789b91496142f6fbff1171a3775ed2a756e51fefcbca4d007e44d85969\" returns successfully" Mar 17 17:28:37.326775 containerd[1745]: time="2025-03-17T17:28:37.326628642Z" level=info msg="StopPodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\"" Mar 17 17:28:37.326775 containerd[1745]: time="2025-03-17T17:28:37.326730363Z" level=info msg="TearDown network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" successfully" Mar 17 17:28:37.326775 containerd[1745]: time="2025-03-17T17:28:37.326740603Z" level=info msg="StopPodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" returns successfully" Mar 17 17:28:37.327744 containerd[1745]: time="2025-03-17T17:28:37.327208764Z" level=info msg="RemovePodSandbox for \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\"" Mar 17 17:28:37.327744 containerd[1745]: time="2025-03-17T17:28:37.327232844Z" level=info msg="Forcibly stopping sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\"" Mar 17 17:28:37.327744 containerd[1745]: time="2025-03-17T17:28:37.327300684Z" level=info msg="TearDown network for sandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" successfully" Mar 17 17:28:37.328624 kubelet[3318]: I0317 17:28:37.328495 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f" Mar 17 17:28:37.330369 containerd[1745]: time="2025-03-17T17:28:37.330137289Z" level=info msg="StopPodSandbox for \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\"" Mar 17 17:28:37.331463 containerd[1745]: 
time="2025-03-17T17:28:37.331210611Z" level=info msg="Ensure that sandbox 38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f in task-service has been cleanup successfully" Mar 17 17:28:37.333332 systemd[1]: run-netns-cni\x2d2f59a086\x2dc060\x2d3476\x2d94cb\x2da956c22bf1e1.mount: Deactivated successfully. Mar 17 17:28:37.333877 containerd[1745]: time="2025-03-17T17:28:37.333447455Z" level=info msg="TearDown network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\" successfully" Mar 17 17:28:37.333877 containerd[1745]: time="2025-03-17T17:28:37.333473855Z" level=info msg="StopPodSandbox for \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\" returns successfully" Mar 17 17:28:37.334768 containerd[1745]: time="2025-03-17T17:28:37.334720218Z" level=info msg="StopPodSandbox for \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\"" Mar 17 17:28:37.335606 containerd[1745]: time="2025-03-17T17:28:37.334847218Z" level=info msg="TearDown network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" successfully" Mar 17 17:28:37.335606 containerd[1745]: time="2025-03-17T17:28:37.334859138Z" level=info msg="StopPodSandbox for \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" returns successfully" Mar 17 17:28:37.336922 containerd[1745]: time="2025-03-17T17:28:37.336611461Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:4,}" Mar 17 17:28:37.337024 kubelet[3318]: I0317 17:28:37.336719 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f" Mar 17 17:28:37.337718 containerd[1745]: time="2025-03-17T17:28:37.337455623Z" level=info msg="StopPodSandbox for \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\"" Mar 17 17:28:37.337718 
containerd[1745]: time="2025-03-17T17:28:37.337594943Z" level=info msg="Ensure that sandbox 037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f in task-service has been cleanup successfully" Mar 17 17:28:37.338094 containerd[1745]: time="2025-03-17T17:28:37.338074864Z" level=info msg="TearDown network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\" successfully" Mar 17 17:28:37.338484 containerd[1745]: time="2025-03-17T17:28:37.338464025Z" level=info msg="StopPodSandbox for \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\" returns successfully" Mar 17 17:28:37.338988 containerd[1745]: time="2025-03-17T17:28:37.338968946Z" level=info msg="StopPodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\"" Mar 17 17:28:37.339371 containerd[1745]: time="2025-03-17T17:28:37.339133626Z" level=info msg="TearDown network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" successfully" Mar 17 17:28:37.339371 containerd[1745]: time="2025-03-17T17:28:37.339147866Z" level=info msg="StopPodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" returns successfully" Mar 17 17:28:37.339775 containerd[1745]: time="2025-03-17T17:28:37.339662947Z" level=info msg="StopPodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\"" Mar 17 17:28:37.339775 containerd[1745]: time="2025-03-17T17:28:37.339726347Z" level=info msg="TearDown network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" successfully" Mar 17 17:28:37.339775 containerd[1745]: time="2025-03-17T17:28:37.339735427Z" level=info msg="StopPodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" returns successfully" Mar 17 17:28:37.340512 containerd[1745]: time="2025-03-17T17:28:37.340348108Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\"" Mar 17 
17:28:37.340512 containerd[1745]: time="2025-03-17T17:28:37.340445869Z" level=info msg="TearDown network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" successfully" Mar 17 17:28:37.340512 containerd[1745]: time="2025-03-17T17:28:37.340456469Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" returns successfully" Mar 17 17:28:37.341449 containerd[1745]: time="2025-03-17T17:28:37.341094430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:28:37.341685 kubelet[3318]: I0317 17:28:37.341662 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054" Mar 17 17:28:37.342342 containerd[1745]: time="2025-03-17T17:28:37.342082792Z" level=info msg="StopPodSandbox for \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\"" Mar 17 17:28:37.342342 containerd[1745]: time="2025-03-17T17:28:37.342221752Z" level=info msg="Ensure that sandbox 22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054 in task-service has been cleanup successfully" Mar 17 17:28:37.342580 containerd[1745]: time="2025-03-17T17:28:37.342565313Z" level=info msg="TearDown network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\" successfully" Mar 17 17:28:37.342640 containerd[1745]: time="2025-03-17T17:28:37.342629273Z" level=info msg="StopPodSandbox for \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\" returns successfully" Mar 17 17:28:37.343205 containerd[1745]: time="2025-03-17T17:28:37.343185754Z" level=info msg="StopPodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\"" Mar 17 17:28:37.343519 containerd[1745]: time="2025-03-17T17:28:37.343502954Z" level=info msg="TearDown network for 
sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" successfully" Mar 17 17:28:37.343652 containerd[1745]: time="2025-03-17T17:28:37.343588594Z" level=info msg="StopPodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" returns successfully" Mar 17 17:28:37.344097 containerd[1745]: time="2025-03-17T17:28:37.344071355Z" level=info msg="StopPodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\"" Mar 17 17:28:37.344760 containerd[1745]: time="2025-03-17T17:28:37.344510996Z" level=info msg="TearDown network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" successfully" Mar 17 17:28:37.344760 containerd[1745]: time="2025-03-17T17:28:37.344532116Z" level=info msg="StopPodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" returns successfully" Mar 17 17:28:37.344889 containerd[1745]: time="2025-03-17T17:28:37.344871797Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\"" Mar 17 17:28:37.344967 containerd[1745]: time="2025-03-17T17:28:37.344940877Z" level=info msg="TearDown network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" successfully" Mar 17 17:28:37.344967 containerd[1745]: time="2025-03-17T17:28:37.344957357Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" returns successfully" Mar 17 17:28:37.346303 containerd[1745]: time="2025-03-17T17:28:37.346282720Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\"" Mar 17 17:28:37.346549 containerd[1745]: time="2025-03-17T17:28:37.346466480Z" level=info msg="TearDown network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" successfully" Mar 17 17:28:37.346549 containerd[1745]: time="2025-03-17T17:28:37.346483680Z" level=info msg="StopPodSandbox for 
\"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" returns successfully" Mar 17 17:28:37.347101 containerd[1745]: time="2025-03-17T17:28:37.347001921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:5,}" Mar 17 17:28:37.347345 kubelet[3318]: I0317 17:28:37.347321 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78" Mar 17 17:28:37.348606 containerd[1745]: time="2025-03-17T17:28:37.348582084Z" level=info msg="StopPodSandbox for \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\"" Mar 17 17:28:37.348750 containerd[1745]: time="2025-03-17T17:28:37.348727644Z" level=info msg="Ensure that sandbox ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78 in task-service has been cleanup successfully" Mar 17 17:28:37.349434 containerd[1745]: time="2025-03-17T17:28:37.349407245Z" level=info msg="TearDown network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\" successfully" Mar 17 17:28:37.349434 containerd[1745]: time="2025-03-17T17:28:37.349428486Z" level=info msg="StopPodSandbox for \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\" returns successfully" Mar 17 17:28:37.349938 containerd[1745]: time="2025-03-17T17:28:37.349816886Z" level=info msg="StopPodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\"" Mar 17 17:28:37.349938 containerd[1745]: time="2025-03-17T17:28:37.349896206Z" level=info msg="TearDown network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" successfully" Mar 17 17:28:37.349938 containerd[1745]: time="2025-03-17T17:28:37.349906406Z" level=info msg="StopPodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" returns successfully" Mar 17 17:28:37.350358 
containerd[1745]: time="2025-03-17T17:28:37.350333607Z" level=info msg="StopPodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\"" Mar 17 17:28:37.350634 containerd[1745]: time="2025-03-17T17:28:37.350576008Z" level=info msg="TearDown network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" successfully" Mar 17 17:28:37.350634 containerd[1745]: time="2025-03-17T17:28:37.350591248Z" level=info msg="StopPodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" returns successfully" Mar 17 17:28:37.350773 kubelet[3318]: I0317 17:28:37.350597 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269" Mar 17 17:28:37.351090 containerd[1745]: time="2025-03-17T17:28:37.350950968Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\"" Mar 17 17:28:37.351090 containerd[1745]: time="2025-03-17T17:28:37.351036009Z" level=info msg="TearDown network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" successfully" Mar 17 17:28:37.351090 containerd[1745]: time="2025-03-17T17:28:37.351045449Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" returns successfully" Mar 17 17:28:37.352016 containerd[1745]: time="2025-03-17T17:28:37.351879650Z" level=info msg="StopPodSandbox for \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\"" Mar 17 17:28:37.352016 containerd[1745]: time="2025-03-17T17:28:37.351971850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:4,}" Mar 17 17:28:37.352415 containerd[1745]: time="2025-03-17T17:28:37.352243091Z" level=info msg="Ensure that sandbox 
09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269 in task-service has been cleanup successfully" Mar 17 17:28:37.352780 containerd[1745]: time="2025-03-17T17:28:37.352751812Z" level=info msg="TearDown network for sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\" successfully" Mar 17 17:28:37.353078 containerd[1745]: time="2025-03-17T17:28:37.352987572Z" level=info msg="StopPodSandbox for \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\" returns successfully" Mar 17 17:28:37.353676 containerd[1745]: time="2025-03-17T17:28:37.353563653Z" level=info msg="StopPodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\"" Mar 17 17:28:37.353676 containerd[1745]: time="2025-03-17T17:28:37.353654334Z" level=info msg="TearDown network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" successfully" Mar 17 17:28:37.353826 containerd[1745]: time="2025-03-17T17:28:37.353663614Z" level=info msg="StopPodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" returns successfully" Mar 17 17:28:37.355209 containerd[1745]: time="2025-03-17T17:28:37.355123176Z" level=info msg="StopPodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\"" Mar 17 17:28:37.355274 containerd[1745]: time="2025-03-17T17:28:37.355208576Z" level=info msg="TearDown network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" successfully" Mar 17 17:28:37.355274 containerd[1745]: time="2025-03-17T17:28:37.355220776Z" level=info msg="StopPodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" returns successfully" Mar 17 17:28:37.355879 containerd[1745]: time="2025-03-17T17:28:37.355611017Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\"" Mar 17 17:28:37.355879 containerd[1745]: time="2025-03-17T17:28:37.355742017Z" level=info 
msg="TearDown network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" successfully" Mar 17 17:28:37.355879 containerd[1745]: time="2025-03-17T17:28:37.355754617Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" returns successfully" Mar 17 17:28:37.357123 containerd[1745]: time="2025-03-17T17:28:37.356933740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:4,}" Mar 17 17:28:37.371844 containerd[1745]: time="2025-03-17T17:28:37.371084846Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:37.770618 systemd[1]: run-netns-cni\x2d64aaf510\x2d6352\x2df2a9\x2d9397\x2d30fcfd83c3d0.mount: Deactivated successfully. Mar 17 17:28:37.770703 systemd[1]: run-netns-cni\x2d7ff24acf\x2d37fa\x2d453e\x2d423a\x2d6121ba758149.mount: Deactivated successfully. Mar 17 17:28:37.770748 systemd[1]: run-netns-cni\x2df2ca029b\x2de849\x2d0045\x2d857d\x2d35defb619a9c.mount: Deactivated successfully. Mar 17 17:28:37.770792 systemd[1]: run-netns-cni\x2d9aaa7f3c\x2d0406\x2d4c8b\x2d90f5\x2da7fa25e06fa9.mount: Deactivated successfully. Mar 17 17:28:37.830792 containerd[1745]: time="2025-03-17T17:28:37.830674395Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:37.830986 containerd[1745]: time="2025-03-17T17:28:37.830752475Z" level=info msg="RemovePodSandbox \"1bd4c2546eff398b08fe1929eb72c09ecf9666dbfd350f9ca04f59ec91114020\" returns successfully" Mar 17 17:28:37.831706 containerd[1745]: time="2025-03-17T17:28:37.831530797Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\"" Mar 17 17:28:37.831706 containerd[1745]: time="2025-03-17T17:28:37.831626717Z" level=info msg="TearDown network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" successfully" Mar 17 17:28:37.831706 containerd[1745]: time="2025-03-17T17:28:37.831637357Z" level=info msg="StopPodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" returns successfully" Mar 17 17:28:37.832050 containerd[1745]: time="2025-03-17T17:28:37.832024838Z" level=info msg="RemovePodSandbox for \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\"" Mar 17 17:28:37.832104 containerd[1745]: time="2025-03-17T17:28:37.832052878Z" level=info msg="Forcibly stopping sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\"" Mar 17 17:28:37.832127 containerd[1745]: time="2025-03-17T17:28:37.832112278Z" level=info msg="TearDown network for sandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" successfully" Mar 17 17:28:37.872204 containerd[1745]: time="2025-03-17T17:28:37.871531992Z" level=error msg="Failed to destroy network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:37.872972 containerd[1745]: time="2025-03-17T17:28:37.872635474Z" level=error msg="encountered an error cleaning up failed sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\", marking 
sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:37.872972 containerd[1745]: time="2025-03-17T17:28:37.872925475Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:37.873482 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66-shm.mount: Deactivated successfully. Mar 17 17:28:37.875870 kubelet[3318]: E0317 17:28:37.874286 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:37.875870 kubelet[3318]: E0317 17:28:37.874344 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:37.875870 kubelet[3318]: E0317 17:28:37.874364 3318 
kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:37.876042 kubelet[3318]: E0317 17:28:37.874408 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" podUID="d007819c-c75a-48ce-80a6-dcf89e240e01" Mar 17 17:28:38.067274 containerd[1745]: time="2025-03-17T17:28:38.067135482Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:38.067876 containerd[1745]: time="2025-03-17T17:28:38.067845603Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 
18.878102326s" Mar 17 17:28:38.067931 containerd[1745]: time="2025-03-17T17:28:38.067878524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 17 17:28:38.326947 containerd[1745]: time="2025-03-17T17:28:38.325264810Z" level=info msg="CreateContainer within sandbox \"95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 17:28:38.355261 kubelet[3318]: I0317 17:28:38.355127 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66" Mar 17 17:28:38.356009 containerd[1745]: time="2025-03-17T17:28:38.355972108Z" level=info msg="StopPodSandbox for \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\"" Mar 17 17:28:38.356203 containerd[1745]: time="2025-03-17T17:28:38.356154268Z" level=info msg="Ensure that sandbox e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66 in task-service has been cleanup successfully" Mar 17 17:28:38.359060 containerd[1745]: time="2025-03-17T17:28:38.357714791Z" level=info msg="TearDown network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\" successfully" Mar 17 17:28:38.359060 containerd[1745]: time="2025-03-17T17:28:38.357752751Z" level=info msg="StopPodSandbox for \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\" returns successfully" Mar 17 17:28:38.359893 systemd[1]: run-netns-cni\x2de1f6ae0a\x2decdf\x2df8b1\x2d6dad\x2d23253be3ba94.mount: Deactivated successfully. 
Mar 17 17:28:38.360784 containerd[1745]: time="2025-03-17T17:28:38.360286196Z" level=info msg="StopPodSandbox for \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\"" Mar 17 17:28:38.360784 containerd[1745]: time="2025-03-17T17:28:38.360392476Z" level=info msg="TearDown network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" successfully" Mar 17 17:28:38.360784 containerd[1745]: time="2025-03-17T17:28:38.360402836Z" level=info msg="StopPodSandbox for \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" returns successfully" Mar 17 17:28:38.361759 containerd[1745]: time="2025-03-17T17:28:38.361640519Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\"" Mar 17 17:28:38.361977 containerd[1745]: time="2025-03-17T17:28:38.361857239Z" level=info msg="TearDown network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" successfully" Mar 17 17:28:38.361977 containerd[1745]: time="2025-03-17T17:28:38.361874039Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" returns successfully" Mar 17 17:28:38.362919 containerd[1745]: time="2025-03-17T17:28:38.362898281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:5,}" Mar 17 17:28:39.639127 containerd[1745]: time="2025-03-17T17:28:39.639019853Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.639127 containerd[1745]: time="2025-03-17T17:28:39.639083293Z" level=info msg="RemovePodSandbox \"ace5c12ad367bc4c9bca90e3bb19d2dc6b7c629c756543f90ed8dffcb00b13bb\" returns successfully" Mar 17 17:28:39.639548 containerd[1745]: time="2025-03-17T17:28:39.639512814Z" level=info msg="StopPodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\"" Mar 17 17:28:39.639646 containerd[1745]: time="2025-03-17T17:28:39.639598894Z" level=info msg="TearDown network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" successfully" Mar 17 17:28:39.639646 containerd[1745]: time="2025-03-17T17:28:39.639629294Z" level=info msg="StopPodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" returns successfully" Mar 17 17:28:39.640044 containerd[1745]: time="2025-03-17T17:28:39.640019335Z" level=info msg="RemovePodSandbox for \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\"" Mar 17 17:28:39.640088 containerd[1745]: time="2025-03-17T17:28:39.640045855Z" level=info msg="Forcibly stopping sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\"" Mar 17 17:28:39.640125 containerd[1745]: time="2025-03-17T17:28:39.640108215Z" level=info msg="TearDown network for sandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" successfully" Mar 17 17:28:39.740262 containerd[1745]: time="2025-03-17T17:28:39.740181124Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.740262 containerd[1745]: time="2025-03-17T17:28:39.740247685Z" level=info msg="RemovePodSandbox \"e9a2c387ae88c84195f99733a15396e2ba755bc6d289ccee330ee09723f40ff5\" returns successfully" Mar 17 17:28:39.740885 containerd[1745]: time="2025-03-17T17:28:39.740863406Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\"" Mar 17 17:28:39.741120 containerd[1745]: time="2025-03-17T17:28:39.741034126Z" level=info msg="TearDown network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" successfully" Mar 17 17:28:39.741120 containerd[1745]: time="2025-03-17T17:28:39.741050006Z" level=info msg="StopPodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" returns successfully" Mar 17 17:28:39.742544 containerd[1745]: time="2025-03-17T17:28:39.741589567Z" level=info msg="RemovePodSandbox for \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\"" Mar 17 17:28:39.742544 containerd[1745]: time="2025-03-17T17:28:39.741615687Z" level=info msg="Forcibly stopping sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\"" Mar 17 17:28:39.742544 containerd[1745]: time="2025-03-17T17:28:39.741673727Z" level=info msg="TearDown network for sandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" successfully" Mar 17 17:28:39.781175 containerd[1745]: time="2025-03-17T17:28:39.781123442Z" level=error msg="Failed to destroy network for sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.781641 containerd[1745]: time="2025-03-17T17:28:39.781615323Z" level=error msg="encountered an error cleaning up failed sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\", marking 
sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.781787 containerd[1745]: time="2025-03-17T17:28:39.781764643Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.782755 kubelet[3318]: E0317 17:28:39.782233 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.782755 kubelet[3318]: E0317 17:28:39.782287 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb" Mar 17 17:28:39.782755 kubelet[3318]: E0317 17:28:39.782322 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d87wb" Mar 17 17:28:39.784441 kubelet[3318]: E0317 17:28:39.782366 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d87wb_kube-system(6743fb2a-96da-4c19-b66f-02242ba2b410)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d87wb" podUID="6743fb2a-96da-4c19-b66f-02242ba2b410" Mar 17 17:28:39.815737 containerd[1745]: time="2025-03-17T17:28:39.815667787Z" level=info msg="CreateContainer within sandbox \"95e1b53ae2405324d8d733788d61c4a22aee2172140ca3981acda8be92b1b446\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"00c396aa578cae056c045c32c69b2e07a852f22985f12f0a9b7516f5454ff8ae\"" Mar 17 17:28:39.818814 containerd[1745]: time="2025-03-17T17:28:39.818707833Z" level=info msg="StartContainer for \"00c396aa578cae056c045c32c69b2e07a852f22985f12f0a9b7516f5454ff8ae\"" Mar 17 17:28:39.821985 containerd[1745]: time="2025-03-17T17:28:39.821041717Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.821985 containerd[1745]: time="2025-03-17T17:28:39.821126677Z" level=info msg="RemovePodSandbox \"76052be68beaebe81b2653d012d4457e8fe984f11a808823f67a1585348344d8\" returns successfully" Mar 17 17:28:39.823309 containerd[1745]: time="2025-03-17T17:28:39.823282121Z" level=info msg="StopPodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\"" Mar 17 17:28:39.823791 containerd[1745]: time="2025-03-17T17:28:39.823761442Z" level=info msg="TearDown network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" successfully" Mar 17 17:28:39.824502 containerd[1745]: time="2025-03-17T17:28:39.824473804Z" level=info msg="StopPodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" returns successfully" Mar 17 17:28:39.826108 containerd[1745]: time="2025-03-17T17:28:39.826077407Z" level=info msg="RemovePodSandbox for \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\"" Mar 17 17:28:39.826196 containerd[1745]: time="2025-03-17T17:28:39.826112247Z" level=info msg="Forcibly stopping sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\"" Mar 17 17:28:39.826196 containerd[1745]: time="2025-03-17T17:28:39.826190007Z" level=info msg="TearDown network for sandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" successfully" Mar 17 17:28:39.853092 containerd[1745]: time="2025-03-17T17:28:39.853040618Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.853413 containerd[1745]: time="2025-03-17T17:28:39.853114378Z" level=info msg="RemovePodSandbox \"57a8754906faa2f57e41781a549a071aa46ba1d5061820ae654f6034ea7dec18\" returns successfully" Mar 17 17:28:39.853689 containerd[1745]: time="2025-03-17T17:28:39.853665539Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\"" Mar 17 17:28:39.854087 containerd[1745]: time="2025-03-17T17:28:39.853862179Z" level=info msg="TearDown network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" successfully" Mar 17 17:28:39.854087 containerd[1745]: time="2025-03-17T17:28:39.853880779Z" level=info msg="StopPodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" returns successfully" Mar 17 17:28:39.854507 containerd[1745]: time="2025-03-17T17:28:39.854421940Z" level=info msg="RemovePodSandbox for \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\"" Mar 17 17:28:39.854507 containerd[1745]: time="2025-03-17T17:28:39.854446540Z" level=info msg="Forcibly stopping sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\"" Mar 17 17:28:39.854684 containerd[1745]: time="2025-03-17T17:28:39.854666141Z" level=info msg="TearDown network for sandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" successfully" Mar 17 17:28:39.878747 containerd[1745]: time="2025-03-17T17:28:39.878703506Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.879063 containerd[1745]: time="2025-03-17T17:28:39.878905027Z" level=info msg="RemovePodSandbox \"c51b838063484263110f52948873ce6a07fc11a070d87dab51237e708120a0e3\" returns successfully" Mar 17 17:28:39.879694 containerd[1745]: time="2025-03-17T17:28:39.879514188Z" level=info msg="StopPodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\"" Mar 17 17:28:39.879694 containerd[1745]: time="2025-03-17T17:28:39.879615428Z" level=info msg="TearDown network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" successfully" Mar 17 17:28:39.879694 containerd[1745]: time="2025-03-17T17:28:39.879625628Z" level=info msg="StopPodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" returns successfully" Mar 17 17:28:39.880697 containerd[1745]: time="2025-03-17T17:28:39.880458430Z" level=info msg="RemovePodSandbox for \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\"" Mar 17 17:28:39.880780 containerd[1745]: time="2025-03-17T17:28:39.880758030Z" level=info msg="Forcibly stopping sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\"" Mar 17 17:28:39.881627 containerd[1745]: time="2025-03-17T17:28:39.881592272Z" level=info msg="TearDown network for sandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" successfully" Mar 17 17:28:39.886881 containerd[1745]: time="2025-03-17T17:28:39.886784801Z" level=error msg="Failed to destroy network for sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.888914 containerd[1745]: time="2025-03-17T17:28:39.888309684Z" level=error msg="encountered an error cleaning up failed sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\", marking 
sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.888914 containerd[1745]: time="2025-03-17T17:28:39.888475445Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.889057 kubelet[3318]: E0317 17:28:39.888694 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.889057 kubelet[3318]: E0317 17:28:39.888746 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq" Mar 17 17:28:39.889057 kubelet[3318]: E0317 17:28:39.888772 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4tzzq" Mar 17 17:28:39.889592 kubelet[3318]: E0317 17:28:39.889312 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-4tzzq_kube-system(0186ea99-0088-4544-b92c-7659bf548a6e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-4tzzq" podUID="0186ea99-0088-4544-b92c-7659bf548a6e" Mar 17 17:28:39.899012 systemd[1]: Started cri-containerd-00c396aa578cae056c045c32c69b2e07a852f22985f12f0a9b7516f5454ff8ae.scope - libcontainer container 00c396aa578cae056c045c32c69b2e07a852f22985f12f0a9b7516f5454ff8ae. Mar 17 17:28:39.900541 containerd[1745]: time="2025-03-17T17:28:39.900410547Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.900541 containerd[1745]: time="2025-03-17T17:28:39.900468867Z" level=info msg="RemovePodSandbox \"fcf3794bb523b0c2145761bd24ba5baa270f67c5813ec82d3e19069770d5807d\" returns successfully" Mar 17 17:28:39.902057 containerd[1745]: time="2025-03-17T17:28:39.901567309Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\"" Mar 17 17:28:39.903090 containerd[1745]: time="2025-03-17T17:28:39.903034032Z" level=info msg="TearDown network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" successfully" Mar 17 17:28:39.904920 containerd[1745]: time="2025-03-17T17:28:39.903238513Z" level=info msg="StopPodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" returns successfully" Mar 17 17:28:39.905391 containerd[1745]: time="2025-03-17T17:28:39.905368877Z" level=info msg="RemovePodSandbox for \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\"" Mar 17 17:28:39.905505 containerd[1745]: time="2025-03-17T17:28:39.905491197Z" level=info msg="Forcibly stopping sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\"" Mar 17 17:28:39.905714 containerd[1745]: time="2025-03-17T17:28:39.905698997Z" level=info msg="TearDown network for sandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" successfully" Mar 17 17:28:39.920817 containerd[1745]: time="2025-03-17T17:28:39.920732146Z" level=error msg="Failed to destroy network for sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.921355 containerd[1745]: time="2025-03-17T17:28:39.921317147Z" level=error msg="encountered an error cleaning up failed sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\", marking 
sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.921429 containerd[1745]: time="2025-03-17T17:28:39.921399987Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.922434 kubelet[3318]: E0317 17:28:39.922381 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.922597 kubelet[3318]: E0317 17:28:39.922579 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:39.922824 kubelet[3318]: E0317 17:28:39.922668 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" Mar 17 17:28:39.922824 kubelet[3318]: E0317 17:28:39.922748 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-wssz4_calico-apiserver(213ca9a6-a07c-4c72-a108-6e622fdd0452)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" podUID="213ca9a6-a07c-4c72-a108-6e622fdd0452" Mar 17 17:28:39.927782 containerd[1745]: time="2025-03-17T17:28:39.927617039Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.927782 containerd[1745]: time="2025-03-17T17:28:39.927680719Z" level=info msg="RemovePodSandbox \"8beaa8c528cd2d6aef087b3430f5ec3f509ac51c512765e7d01f97b24cece954\" returns successfully" Mar 17 17:28:39.928711 containerd[1745]: time="2025-03-17T17:28:39.928440160Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\"" Mar 17 17:28:39.928711 containerd[1745]: time="2025-03-17T17:28:39.928549880Z" level=info msg="TearDown network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" successfully" Mar 17 17:28:39.928711 containerd[1745]: time="2025-03-17T17:28:39.928560360Z" level=info msg="StopPodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" returns successfully" Mar 17 17:28:39.933091 containerd[1745]: time="2025-03-17T17:28:39.932001727Z" level=info msg="RemovePodSandbox for \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\"" Mar 17 17:28:39.933091 containerd[1745]: time="2025-03-17T17:28:39.932045967Z" level=info msg="Forcibly stopping sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\"" Mar 17 17:28:39.933091 containerd[1745]: time="2025-03-17T17:28:39.932118007Z" level=info msg="TearDown network for sandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" successfully" Mar 17 17:28:39.952249 containerd[1745]: time="2025-03-17T17:28:39.950533602Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.952471 containerd[1745]: time="2025-03-17T17:28:39.952440926Z" level=info msg="RemovePodSandbox \"ee14d7aae11e48d6a5eed72409b03f0a8665e4517dd70fd3ee7ecb4ff7932b07\" returns successfully" Mar 17 17:28:39.953486 containerd[1745]: time="2025-03-17T17:28:39.953390167Z" level=info msg="StopPodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\"" Mar 17 17:28:39.953680 containerd[1745]: time="2025-03-17T17:28:39.953663408Z" level=info msg="TearDown network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" successfully" Mar 17 17:28:39.953757 containerd[1745]: time="2025-03-17T17:28:39.953744608Z" level=info msg="StopPodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" returns successfully" Mar 17 17:28:39.954192 containerd[1745]: time="2025-03-17T17:28:39.954158849Z" level=info msg="RemovePodSandbox for \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\"" Mar 17 17:28:39.954192 containerd[1745]: time="2025-03-17T17:28:39.954189529Z" level=info msg="Forcibly stopping sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\"" Mar 17 17:28:39.954288 containerd[1745]: time="2025-03-17T17:28:39.954247129Z" level=info msg="TearDown network for sandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" successfully" Mar 17 17:28:39.965311 containerd[1745]: time="2025-03-17T17:28:39.965267550Z" level=info msg="StartContainer for \"00c396aa578cae056c045c32c69b2e07a852f22985f12f0a9b7516f5454ff8ae\" returns successfully" Mar 17 17:28:39.968997 containerd[1745]: time="2025-03-17T17:28:39.968952877Z" level=error msg="Failed to destroy network for sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 
17:28:39.969457 containerd[1745]: time="2025-03-17T17:28:39.969428838Z" level=error msg="encountered an error cleaning up failed sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.969586 containerd[1745]: time="2025-03-17T17:28:39.969567078Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.969971 kubelet[3318]: E0317 17:28:39.969939 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.970158 kubelet[3318]: E0317 17:28:39.970136 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:39.970272 kubelet[3318]: E0317 
17:28:39.970252 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" Mar 17 17:28:39.971236 kubelet[3318]: E0317 17:28:39.970490 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6fcc8c87fb-p2fc6_calico-system(1e5b40ef-0b7d-4233-9b50-e552e3a1bd38)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" podUID="1e5b40ef-0b7d-4233-9b50-e552e3a1bd38" Mar 17 17:28:39.973040 containerd[1745]: time="2025-03-17T17:28:39.972994884Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:28:39.973110 containerd[1745]: time="2025-03-17T17:28:39.973069125Z" level=info msg="RemovePodSandbox \"5fbcef337b363be95572707b128940d0c303ecec15ba043ae00c0c84e00b3806\" returns successfully" Mar 17 17:28:39.986120 containerd[1745]: time="2025-03-17T17:28:39.986068229Z" level=error msg="Failed to destroy network for sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.987190 containerd[1745]: time="2025-03-17T17:28:39.986649830Z" level=error msg="encountered an error cleaning up failed sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.987190 containerd[1745]: time="2025-03-17T17:28:39.986979711Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:39.987911 kubelet[3318]: E0317 17:28:39.987871 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Mar 17 17:28:39.990825 kubelet[3318]: E0317 17:28:39.988046 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:39.990825 kubelet[3318]: E0317 17:28:39.988071 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zckwm" Mar 17 17:28:39.991229 kubelet[3318]: E0317 17:28:39.991005 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zckwm_calico-system(f6c1365d-cb18-415b-8a89-1e9f3710a559)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zckwm" podUID="f6c1365d-cb18-415b-8a89-1e9f3710a559" Mar 17 17:28:39.999966 containerd[1745]: time="2025-03-17T17:28:39.999914055Z" level=error msg="Failed to destroy network for sandbox 
\"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:40.000266 containerd[1745]: time="2025-03-17T17:28:40.000231456Z" level=error msg="encountered an error cleaning up failed sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:40.000312 containerd[1745]: time="2025-03-17T17:28:40.000298016Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:40.000844 kubelet[3318]: E0317 17:28:40.000488 3318 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:28:40.000844 kubelet[3318]: E0317 17:28:40.000560 3318 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:40.000844 kubelet[3318]: E0317 17:28:40.000579 3318 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" Mar 17 17:28:40.000998 kubelet[3318]: E0317 17:28:40.000616 3318 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b557bfbcb-krcfl_calico-apiserver(d007819c-c75a-48ce-80a6-dcf89e240e01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" podUID="d007819c-c75a-48ce-80a6-dcf89e240e01" Mar 17 17:28:40.348169 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 17:28:40.348481 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Mar 17 17:28:40.363761 kubelet[3318]: I0317 17:28:40.363312 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e" Mar 17 17:28:40.365580 containerd[1745]: time="2025-03-17T17:28:40.365126546Z" level=info msg="StopPodSandbox for \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\"" Mar 17 17:28:40.365580 containerd[1745]: time="2025-03-17T17:28:40.365314346Z" level=info msg="Ensure that sandbox 966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e in task-service has been cleanup successfully" Mar 17 17:28:40.365812 containerd[1745]: time="2025-03-17T17:28:40.365750947Z" level=info msg="TearDown network for sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\" successfully" Mar 17 17:28:40.365948 containerd[1745]: time="2025-03-17T17:28:40.365907787Z" level=info msg="StopPodSandbox for \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\" returns successfully" Mar 17 17:28:40.366466 containerd[1745]: time="2025-03-17T17:28:40.366262868Z" level=info msg="StopPodSandbox for \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\"" Mar 17 17:28:40.366466 containerd[1745]: time="2025-03-17T17:28:40.366372308Z" level=info msg="TearDown network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\" successfully" Mar 17 17:28:40.366466 containerd[1745]: time="2025-03-17T17:28:40.366384988Z" level=info msg="StopPodSandbox for \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\" returns successfully" Mar 17 17:28:40.367286 containerd[1745]: time="2025-03-17T17:28:40.367117869Z" level=info msg="StopPodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\"" Mar 17 17:28:40.368196 containerd[1745]: time="2025-03-17T17:28:40.367859871Z" level=info msg="TearDown network for sandbox 
\"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" successfully" Mar 17 17:28:40.368196 containerd[1745]: time="2025-03-17T17:28:40.367887911Z" level=info msg="StopPodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" returns successfully" Mar 17 17:28:40.369512 containerd[1745]: time="2025-03-17T17:28:40.369446314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:6,}" Mar 17 17:28:40.374601 kubelet[3318]: I0317 17:28:40.374416 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0" Mar 17 17:28:40.376707 containerd[1745]: time="2025-03-17T17:28:40.376290167Z" level=info msg="StopPodSandbox for \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\"" Mar 17 17:28:40.380083 containerd[1745]: time="2025-03-17T17:28:40.379962334Z" level=info msg="Ensure that sandbox 82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0 in task-service has been cleanup successfully" Mar 17 17:28:40.381911 containerd[1745]: time="2025-03-17T17:28:40.381735737Z" level=info msg="TearDown network for sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\" successfully" Mar 17 17:28:40.381911 containerd[1745]: time="2025-03-17T17:28:40.381861857Z" level=info msg="StopPodSandbox for \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\" returns successfully" Mar 17 17:28:40.386240 containerd[1745]: time="2025-03-17T17:28:40.386186225Z" level=info msg="StopPodSandbox for \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\"" Mar 17 17:28:40.386339 containerd[1745]: time="2025-03-17T17:28:40.386280466Z" level=info msg="TearDown network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\" successfully" Mar 17 17:28:40.386339 containerd[1745]: 
time="2025-03-17T17:28:40.386290026Z" level=info msg="StopPodSandbox for \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\" returns successfully" Mar 17 17:28:40.397296 containerd[1745]: time="2025-03-17T17:28:40.395717203Z" level=info msg="StopPodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\"" Mar 17 17:28:40.397296 containerd[1745]: time="2025-03-17T17:28:40.395837124Z" level=info msg="TearDown network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" successfully" Mar 17 17:28:40.397296 containerd[1745]: time="2025-03-17T17:28:40.395849164Z" level=info msg="StopPodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" returns successfully" Mar 17 17:28:40.397624 containerd[1745]: time="2025-03-17T17:28:40.397539047Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:5,}" Mar 17 17:28:40.399402 kubelet[3318]: I0317 17:28:40.399334 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4" Mar 17 17:28:40.405840 containerd[1745]: time="2025-03-17T17:28:40.404976421Z" level=info msg="StopPodSandbox for \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\"" Mar 17 17:28:40.405840 containerd[1745]: time="2025-03-17T17:28:40.405159941Z" level=info msg="Ensure that sandbox 891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4 in task-service has been cleanup successfully" Mar 17 17:28:40.409483 kubelet[3318]: I0317 17:28:40.409177 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-x9tvk" podStartSLOduration=3.919623268 podStartE2EDuration="56.409139749s" podCreationTimestamp="2025-03-17 17:27:44 +0000 UTC" firstStartedPulling="2025-03-17 17:27:45.579941325 +0000 UTC 
m=+15.653320157" lastFinishedPulling="2025-03-17 17:28:38.069457806 +0000 UTC m=+68.142836638" observedRunningTime="2025-03-17 17:28:40.406203823 +0000 UTC m=+70.479582655" watchObservedRunningTime="2025-03-17 17:28:40.409139749 +0000 UTC m=+70.482518541" Mar 17 17:28:40.410108 containerd[1745]: time="2025-03-17T17:28:40.409959710Z" level=info msg="TearDown network for sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\" successfully" Mar 17 17:28:40.411093 containerd[1745]: time="2025-03-17T17:28:40.410795512Z" level=info msg="StopPodSandbox for \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\" returns successfully" Mar 17 17:28:40.411874 containerd[1745]: time="2025-03-17T17:28:40.411849554Z" level=info msg="StopPodSandbox for \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\"" Mar 17 17:28:40.412237 containerd[1745]: time="2025-03-17T17:28:40.412008354Z" level=info msg="TearDown network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\" successfully" Mar 17 17:28:40.412237 containerd[1745]: time="2025-03-17T17:28:40.412021994Z" level=info msg="StopPodSandbox for \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\" returns successfully" Mar 17 17:28:40.413424 containerd[1745]: time="2025-03-17T17:28:40.413391397Z" level=info msg="StopPodSandbox for \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\"" Mar 17 17:28:40.413925 kubelet[3318]: I0317 17:28:40.413846 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89" Mar 17 17:28:40.414207 containerd[1745]: time="2025-03-17T17:28:40.414155678Z" level=info msg="TearDown network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" successfully" Mar 17 17:28:40.414320 containerd[1745]: time="2025-03-17T17:28:40.414306439Z" level=info msg="StopPodSandbox for 
\"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" returns successfully" Mar 17 17:28:40.415354 containerd[1745]: time="2025-03-17T17:28:40.415273480Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\"" Mar 17 17:28:40.415663 containerd[1745]: time="2025-03-17T17:28:40.415629041Z" level=info msg="TearDown network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" successfully" Mar 17 17:28:40.416843 containerd[1745]: time="2025-03-17T17:28:40.416762603Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" returns successfully" Mar 17 17:28:40.419298 containerd[1745]: time="2025-03-17T17:28:40.416639923Z" level=info msg="StopPodSandbox for \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\"" Mar 17 17:28:40.422074 containerd[1745]: time="2025-03-17T17:28:40.422004053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:6,}" Mar 17 17:28:40.422832 containerd[1745]: time="2025-03-17T17:28:40.422791975Z" level=info msg="Ensure that sandbox e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89 in task-service has been cleanup successfully" Mar 17 17:28:40.425424 containerd[1745]: time="2025-03-17T17:28:40.425377659Z" level=info msg="TearDown network for sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\" successfully" Mar 17 17:28:40.425629 containerd[1745]: time="2025-03-17T17:28:40.425514300Z" level=info msg="StopPodSandbox for \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\" returns successfully" Mar 17 17:28:40.426954 kubelet[3318]: I0317 17:28:40.426932 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb" Mar 17 17:28:40.430629 
containerd[1745]: time="2025-03-17T17:28:40.430408589Z" level=info msg="StopPodSandbox for \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\"" Mar 17 17:28:40.431247 containerd[1745]: time="2025-03-17T17:28:40.431058950Z" level=info msg="StopPodSandbox for \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\"" Mar 17 17:28:40.432313 containerd[1745]: time="2025-03-17T17:28:40.432266953Z" level=info msg="Ensure that sandbox 37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb in task-service has been cleanup successfully" Mar 17 17:28:40.434145 containerd[1745]: time="2025-03-17T17:28:40.433938796Z" level=info msg="TearDown network for sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\" successfully" Mar 17 17:28:40.434145 containerd[1745]: time="2025-03-17T17:28:40.433963236Z" level=info msg="StopPodSandbox for \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\" returns successfully" Mar 17 17:28:40.435181 containerd[1745]: time="2025-03-17T17:28:40.434553197Z" level=info msg="TearDown network for sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\" successfully" Mar 17 17:28:40.435181 containerd[1745]: time="2025-03-17T17:28:40.434608037Z" level=info msg="StopPodSandbox for \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\" returns successfully" Mar 17 17:28:40.435181 containerd[1745]: time="2025-03-17T17:28:40.434941278Z" level=info msg="StopPodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\"" Mar 17 17:28:40.435181 containerd[1745]: time="2025-03-17T17:28:40.435033358Z" level=info msg="TearDown network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" successfully" Mar 17 17:28:40.435181 containerd[1745]: time="2025-03-17T17:28:40.435043438Z" level=info msg="StopPodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" returns successfully" Mar 17 
17:28:40.439179 containerd[1745]: time="2025-03-17T17:28:40.439141606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:5,}" Mar 17 17:28:40.440060 containerd[1745]: time="2025-03-17T17:28:40.440029607Z" level=info msg="StopPodSandbox for \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\"" Mar 17 17:28:40.440720 containerd[1745]: time="2025-03-17T17:28:40.440477968Z" level=info msg="TearDown network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\" successfully" Mar 17 17:28:40.440720 containerd[1745]: time="2025-03-17T17:28:40.440495448Z" level=info msg="StopPodSandbox for \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\" returns successfully" Mar 17 17:28:40.441051 containerd[1745]: time="2025-03-17T17:28:40.441022329Z" level=info msg="StopPodSandbox for \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\"" Mar 17 17:28:40.441667 containerd[1745]: time="2025-03-17T17:28:40.441104209Z" level=info msg="TearDown network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" successfully" Mar 17 17:28:40.441667 containerd[1745]: time="2025-03-17T17:28:40.441119449Z" level=info msg="StopPodSandbox for \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" returns successfully" Mar 17 17:28:40.442247 containerd[1745]: time="2025-03-17T17:28:40.441957571Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:5,}" Mar 17 17:28:40.442459 kubelet[3318]: I0317 17:28:40.442438 3318 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f" Mar 17 17:28:40.446283 containerd[1745]: time="2025-03-17T17:28:40.445975058Z" level=info msg="StopPodSandbox for 
\"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\"" Mar 17 17:28:40.447265 containerd[1745]: time="2025-03-17T17:28:40.447240821Z" level=info msg="Ensure that sandbox 2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f in task-service has been cleanup successfully" Mar 17 17:28:40.447972 containerd[1745]: time="2025-03-17T17:28:40.447946142Z" level=info msg="TearDown network for sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\" successfully" Mar 17 17:28:40.448074 containerd[1745]: time="2025-03-17T17:28:40.448060062Z" level=info msg="StopPodSandbox for \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\" returns successfully" Mar 17 17:28:40.448846 containerd[1745]: time="2025-03-17T17:28:40.448824064Z" level=info msg="StopPodSandbox for \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\"" Mar 17 17:28:40.449246 containerd[1745]: time="2025-03-17T17:28:40.449056144Z" level=info msg="TearDown network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\" successfully" Mar 17 17:28:40.449246 containerd[1745]: time="2025-03-17T17:28:40.449071904Z" level=info msg="StopPodSandbox for \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\" returns successfully" Mar 17 17:28:40.449934 containerd[1745]: time="2025-03-17T17:28:40.449764546Z" level=info msg="StopPodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\"" Mar 17 17:28:40.449934 containerd[1745]: time="2025-03-17T17:28:40.449857066Z" level=info msg="TearDown network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" successfully" Mar 17 17:28:40.449934 containerd[1745]: time="2025-03-17T17:28:40.449867866Z" level=info msg="StopPodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" returns successfully" Mar 17 17:28:40.450753 containerd[1745]: time="2025-03-17T17:28:40.450696627Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:5,}" Mar 17 17:28:40.665263 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3826837460.mount: Deactivated successfully. Mar 17 17:28:40.665365 systemd[1]: run-netns-cni\x2d62226418\x2d0c04\x2dc89b\x2d6e39\x2d20ddec3da302.mount: Deactivated successfully. Mar 17 17:28:40.665413 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb-shm.mount: Deactivated successfully. Mar 17 17:28:40.665470 systemd[1]: run-netns-cni\x2de44859c1\x2d8660\x2d8046\x2d6945\x2d98bb2820db99.mount: Deactivated successfully. Mar 17 17:28:40.665516 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89-shm.mount: Deactivated successfully. Mar 17 17:28:40.991295 systemd-networkd[1440]: calid655f78f0c9: Link UP Mar 17 17:28:40.992822 systemd-networkd[1440]: calid655f78f0c9: Gained carrier Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.841 [INFO][5253] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.854 [INFO][5253] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0 csi-node-driver- calico-system f6c1365d-cb18-415b-8a89-1e9f3710a559 585 0 2025-03-17 17:27:45 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:568c96974f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4152.2.2-a-f9f073f8c6 csi-node-driver-zckwm eth0 csi-node-driver [] [] [kns.calico-system 
ksa.calico-system.csi-node-driver] calid655f78f0c9 [] []}} ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Namespace="calico-system" Pod="csi-node-driver-zckwm" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.854 [INFO][5253] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Namespace="calico-system" Pod="csi-node-driver-zckwm" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.884 [INFO][5265] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" HandleID="k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.895 [INFO][5265] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" HandleID="k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000291110), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4152.2.2-a-f9f073f8c6", "pod":"csi-node-driver-zckwm", "timestamp":"2025-03-17 17:28:40.884656208 +0000 UTC"}, Hostname:"ci-4152.2.2-a-f9f073f8c6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.895 [INFO][5265] ipam/ipam_plugin.go 353: About to acquire host-wide 
IPAM lock. Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.895 [INFO][5265] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.895 [INFO][5265] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-f9f073f8c6' Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.898 [INFO][5265] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.902 [INFO][5265] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.908 [INFO][5265] ipam/ipam.go 489: Trying affinity for 192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.911 [INFO][5265] ipam/ipam.go 155: Attempting to load block cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.913 [INFO][5265] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.913 [INFO][5265] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.63.0/26 handle="k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.915 [INFO][5265] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3 Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.921 [INFO][5265] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.63.0/26 handle="k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" 
host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.938 [INFO][5265] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.63.1/26] block=192.168.63.0/26 handle="k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.938 [INFO][5265] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.63.1/26] handle="k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.938 [INFO][5265] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:28:41.010199 containerd[1745]: 2025-03-17 17:28:40.938 [INFO][5265] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.63.1/26] IPv6=[] ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" HandleID="k8s-pod-network.52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" Mar 17 17:28:41.011008 containerd[1745]: 2025-03-17 17:28:40.940 [INFO][5253] cni-plugin/k8s.go 386: Populated endpoint ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Namespace="calico-system" Pod="csi-node-driver-zckwm" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f6c1365d-cb18-415b-8a89-1e9f3710a559", ResourceVersion:"585", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"", Pod:"csi-node-driver-zckwm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.63.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid655f78f0c9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:41.011008 containerd[1745]: 2025-03-17 17:28:40.941 [INFO][5253] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.63.1/32] ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Namespace="calico-system" Pod="csi-node-driver-zckwm" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" Mar 17 17:28:41.011008 containerd[1745]: 2025-03-17 17:28:40.941 [INFO][5253] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid655f78f0c9 ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Namespace="calico-system" Pod="csi-node-driver-zckwm" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" Mar 17 17:28:41.011008 containerd[1745]: 2025-03-17 17:28:40.991 [INFO][5253] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Namespace="calico-system" Pod="csi-node-driver-zckwm" 
WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" Mar 17 17:28:41.011008 containerd[1745]: 2025-03-17 17:28:40.991 [INFO][5253] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Namespace="calico-system" Pod="csi-node-driver-zckwm" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f6c1365d-cb18-415b-8a89-1e9f3710a559", ResourceVersion:"585", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3", Pod:"csi-node-driver-zckwm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.63.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid655f78f0c9", MAC:"42:e6:bc:84:07:1b", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:41.011008 containerd[1745]: 2025-03-17 17:28:41.008 [INFO][5253] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3" Namespace="calico-system" Pod="csi-node-driver-zckwm" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-csi--node--driver--zckwm-eth0" Mar 17 17:28:41.482263 containerd[1745]: time="2025-03-17T17:28:41.482100777Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:28:41.482263 containerd[1745]: time="2025-03-17T17:28:41.482169697Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:28:41.482263 containerd[1745]: time="2025-03-17T17:28:41.482182817Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:41.483984 containerd[1745]: time="2025-03-17T17:28:41.483722980Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:41.506982 systemd[1]: Started cri-containerd-52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3.scope - libcontainer container 52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3. 
Mar 17 17:28:41.530585 containerd[1745]: time="2025-03-17T17:28:41.530395388Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zckwm,Uid:f6c1365d-cb18-415b-8a89-1e9f3710a559,Namespace:calico-system,Attempt:6,} returns sandbox id \"52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3\"" Mar 17 17:28:41.533167 containerd[1745]: time="2025-03-17T17:28:41.533126553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 17 17:28:41.840405 systemd-networkd[1440]: cali7dc9c2d9051: Link UP Mar 17 17:28:41.844450 systemd-networkd[1440]: cali7dc9c2d9051: Gained carrier Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.736 [INFO][5348] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.748 [INFO][5348] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0 calico-kube-controllers-6fcc8c87fb- calico-system 1e5b40ef-0b7d-4233-9b50-e552e3a1bd38 713 0 2025-03-17 17:27:45 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6fcc8c87fb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4152.2.2-a-f9f073f8c6 calico-kube-controllers-6fcc8c87fb-p2fc6 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali7dc9c2d9051 [] []}} ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Namespace="calico-system" Pod="calico-kube-controllers-6fcc8c87fb-p2fc6" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.748 [INFO][5348] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Namespace="calico-system" Pod="calico-kube-controllers-6fcc8c87fb-p2fc6" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.777 [INFO][5360] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" HandleID="k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.790 [INFO][5360] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" HandleID="k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000332010), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4152.2.2-a-f9f073f8c6", "pod":"calico-kube-controllers-6fcc8c87fb-p2fc6", "timestamp":"2025-03-17 17:28:41.777028734 +0000 UTC"}, Hostname:"ci-4152.2.2-a-f9f073f8c6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.791 [INFO][5360] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.791 [INFO][5360] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.791 [INFO][5360] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-f9f073f8c6' Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.795 [INFO][5360] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.800 [INFO][5360] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.807 [INFO][5360] ipam/ipam.go 489: Trying affinity for 192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.810 [INFO][5360] ipam/ipam.go 155: Attempting to load block cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.812 [INFO][5360] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.812 [INFO][5360] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.63.0/26 handle="k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.815 [INFO][5360] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9 Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.821 [INFO][5360] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.63.0/26 handle="k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.831 [INFO][5360] ipam/ipam.go 1216: Successfully 
claimed IPs: [192.168.63.2/26] block=192.168.63.0/26 handle="k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.831 [INFO][5360] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.63.2/26] handle="k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.831 [INFO][5360] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:28:41.865757 containerd[1745]: 2025-03-17 17:28:41.831 [INFO][5360] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.63.2/26] IPv6=[] ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" HandleID="k8s-pod-network.c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" Mar 17 17:28:41.869262 containerd[1745]: 2025-03-17 17:28:41.835 [INFO][5348] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Namespace="calico-system" Pod="calico-kube-controllers-6fcc8c87fb-p2fc6" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0", GenerateName:"calico-kube-controllers-6fcc8c87fb-", Namespace:"calico-system", SelfLink:"", UID:"1e5b40ef-0b7d-4233-9b50-e552e3a1bd38", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"6fcc8c87fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"", Pod:"calico-kube-controllers-6fcc8c87fb-p2fc6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.63.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7dc9c2d9051", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:41.869262 containerd[1745]: 2025-03-17 17:28:41.836 [INFO][5348] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.63.2/32] ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Namespace="calico-system" Pod="calico-kube-controllers-6fcc8c87fb-p2fc6" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" Mar 17 17:28:41.869262 containerd[1745]: 2025-03-17 17:28:41.836 [INFO][5348] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7dc9c2d9051 ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Namespace="calico-system" Pod="calico-kube-controllers-6fcc8c87fb-p2fc6" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" Mar 17 17:28:41.869262 containerd[1745]: 2025-03-17 17:28:41.838 [INFO][5348] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Namespace="calico-system" Pod="calico-kube-controllers-6fcc8c87fb-p2fc6" 
WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" Mar 17 17:28:41.869262 containerd[1745]: 2025-03-17 17:28:41.845 [INFO][5348] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Namespace="calico-system" Pod="calico-kube-controllers-6fcc8c87fb-p2fc6" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0", GenerateName:"calico-kube-controllers-6fcc8c87fb-", Namespace:"calico-system", SelfLink:"", UID:"1e5b40ef-0b7d-4233-9b50-e552e3a1bd38", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fcc8c87fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9", Pod:"calico-kube-controllers-6fcc8c87fb-p2fc6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.63.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, 
InterfaceName:"cali7dc9c2d9051", MAC:"12:46:fb:bc:4a:90", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:41.869262 containerd[1745]: 2025-03-17 17:28:41.862 [INFO][5348] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9" Namespace="calico-system" Pod="calico-kube-controllers-6fcc8c87fb-p2fc6" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--kube--controllers--6fcc8c87fb--p2fc6-eth0" Mar 17 17:28:42.125854 kernel: bpftool[5499]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 17 17:28:42.588600 systemd-networkd[1440]: vxlan.calico: Link UP Mar 17 17:28:42.588609 systemd-networkd[1440]: vxlan.calico: Gained carrier Mar 17 17:28:42.739953 systemd-networkd[1440]: calid655f78f0c9: Gained IPv6LL Mar 17 17:28:43.058984 systemd-networkd[1440]: cali7dc9c2d9051: Gained IPv6LL Mar 17 17:28:44.082990 systemd-networkd[1440]: vxlan.calico: Gained IPv6LL Mar 17 17:28:45.276895 containerd[1745]: time="2025-03-17T17:28:45.276730512Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:28:45.276895 containerd[1745]: time="2025-03-17T17:28:45.276790793Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:28:45.276895 containerd[1745]: time="2025-03-17T17:28:45.276845913Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:45.277475 containerd[1745]: time="2025-03-17T17:28:45.276943473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:45.304020 systemd[1]: Started cri-containerd-c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9.scope - libcontainer container c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9. Mar 17 17:28:45.342786 containerd[1745]: time="2025-03-17T17:28:45.342701914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fcc8c87fb-p2fc6,Uid:1e5b40ef-0b7d-4233-9b50-e552e3a1bd38,Namespace:calico-system,Attempt:5,} returns sandbox id \"c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9\"" Mar 17 17:28:45.619430 systemd-networkd[1440]: caliecaad3486c0: Link UP Mar 17 17:28:45.620755 systemd-networkd[1440]: caliecaad3486c0: Gained carrier Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.466 [INFO][5614] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0 coredns-6f6b679f8f- kube-system 6743fb2a-96da-4c19-b66f-02242ba2b410 714 0 2025-03-17 17:27:36 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4152.2.2-a-f9f073f8c6 coredns-6f6b679f8f-d87wb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliecaad3486c0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Namespace="kube-system" Pod="coredns-6f6b679f8f-d87wb" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.466 [INFO][5614] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Namespace="kube-system" Pod="coredns-6f6b679f8f-d87wb" 
WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.507 [INFO][5625] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" HandleID="k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.522 [INFO][5625] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" HandleID="k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000318ae0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4152.2.2-a-f9f073f8c6", "pod":"coredns-6f6b679f8f-d87wb", "timestamp":"2025-03-17 17:28:45.507072898 +0000 UTC"}, Hostname:"ci-4152.2.2-a-f9f073f8c6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.566 [INFO][5625] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.566 [INFO][5625] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.566 [INFO][5625] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-f9f073f8c6' Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.569 [INFO][5625] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.575 [INFO][5625] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.583 [INFO][5625] ipam/ipam.go 489: Trying affinity for 192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.585 [INFO][5625] ipam/ipam.go 155: Attempting to load block cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.588 [INFO][5625] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.588 [INFO][5625] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.63.0/26 handle="k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.590 [INFO][5625] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.602 [INFO][5625] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.63.0/26 handle="k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.613 [INFO][5625] ipam/ipam.go 1216: Successfully 
claimed IPs: [192.168.63.3/26] block=192.168.63.0/26 handle="k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.613 [INFO][5625] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.63.3/26] handle="k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.613 [INFO][5625] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:28:45.643637 containerd[1745]: 2025-03-17 17:28:45.613 [INFO][5625] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.63.3/26] IPv6=[] ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" HandleID="k8s-pod-network.977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" Mar 17 17:28:45.645143 containerd[1745]: 2025-03-17 17:28:45.616 [INFO][5614] cni-plugin/k8s.go 386: Populated endpoint ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Namespace="kube-system" Pod="coredns-6f6b679f8f-d87wb" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"6743fb2a-96da-4c19-b66f-02242ba2b410", ResourceVersion:"714", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"", Pod:"coredns-6f6b679f8f-d87wb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliecaad3486c0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:45.645143 containerd[1745]: 2025-03-17 17:28:45.616 [INFO][5614] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.63.3/32] ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Namespace="kube-system" Pod="coredns-6f6b679f8f-d87wb" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" Mar 17 17:28:45.645143 containerd[1745]: 2025-03-17 17:28:45.616 [INFO][5614] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliecaad3486c0 ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Namespace="kube-system" Pod="coredns-6f6b679f8f-d87wb" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" Mar 17 17:28:45.645143 containerd[1745]: 2025-03-17 17:28:45.620 [INFO][5614] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Namespace="kube-system" Pod="coredns-6f6b679f8f-d87wb" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" Mar 17 17:28:45.645143 containerd[1745]: 2025-03-17 17:28:45.622 [INFO][5614] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Namespace="kube-system" Pod="coredns-6f6b679f8f-d87wb" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"6743fb2a-96da-4c19-b66f-02242ba2b410", ResourceVersion:"714", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d", Pod:"coredns-6f6b679f8f-d87wb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliecaad3486c0", MAC:"52:d2:4f:d1:41:2a", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:45.645143 containerd[1745]: 2025-03-17 17:28:45.637 [INFO][5614] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d" Namespace="kube-system" Pod="coredns-6f6b679f8f-d87wb" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--d87wb-eth0" Mar 17 17:28:46.047598 containerd[1745]: time="2025-03-17T17:28:46.046745814Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:28:46.047598 containerd[1745]: time="2025-03-17T17:28:46.046884374Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:28:46.047598 containerd[1745]: time="2025-03-17T17:28:46.046903494Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:46.047598 containerd[1745]: time="2025-03-17T17:28:46.047021054Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:46.073993 systemd[1]: Started cri-containerd-977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d.scope - libcontainer container 977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d. 
Mar 17 17:28:46.087099 systemd-networkd[1440]: calie426b81909c: Link UP Mar 17 17:28:46.089337 systemd-networkd[1440]: calie426b81909c: Gained carrier Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:45.959 [INFO][5646] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0 calico-apiserver-b557bfbcb- calico-apiserver d007819c-c75a-48ce-80a6-dcf89e240e01 715 0 2025-03-17 17:27:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b557bfbcb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4152.2.2-a-f9f073f8c6 calico-apiserver-b557bfbcb-krcfl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie426b81909c [] []}} ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-krcfl" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:45.960 [INFO][5646] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-krcfl" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:45.995 [INFO][5658] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" HandleID="k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" Mar 17 17:28:46.116327 
containerd[1745]: 2025-03-17 17:28:46.009 [INFO][5658] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" HandleID="k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400030d310), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4152.2.2-a-f9f073f8c6", "pod":"calico-apiserver-b557bfbcb-krcfl", "timestamp":"2025-03-17 17:28:45.99590612 +0000 UTC"}, Hostname:"ci-4152.2.2-a-f9f073f8c6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.010 [INFO][5658] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.010 [INFO][5658] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.010 [INFO][5658] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-f9f073f8c6' Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.013 [INFO][5658] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.023 [INFO][5658] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.031 [INFO][5658] ipam/ipam.go 489: Trying affinity for 192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.035 [INFO][5658] ipam/ipam.go 155: Attempting to load block cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.040 [INFO][5658] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.040 [INFO][5658] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.63.0/26 handle="k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.045 [INFO][5658] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96 Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.056 [INFO][5658] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.63.0/26 handle="k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.075 [INFO][5658] ipam/ipam.go 1216: Successfully 
claimed IPs: [192.168.63.4/26] block=192.168.63.0/26 handle="k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.076 [INFO][5658] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.63.4/26] handle="k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.076 [INFO][5658] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:28:46.116327 containerd[1745]: 2025-03-17 17:28:46.076 [INFO][5658] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.63.4/26] IPv6=[] ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" HandleID="k8s-pod-network.c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" Mar 17 17:28:46.117428 containerd[1745]: 2025-03-17 17:28:46.080 [INFO][5646] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-krcfl" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0", GenerateName:"calico-apiserver-b557bfbcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"d007819c-c75a-48ce-80a6-dcf89e240e01", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"b557bfbcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"", Pod:"calico-apiserver-b557bfbcb-krcfl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie426b81909c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:46.117428 containerd[1745]: 2025-03-17 17:28:46.080 [INFO][5646] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.63.4/32] ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-krcfl" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" Mar 17 17:28:46.117428 containerd[1745]: 2025-03-17 17:28:46.080 [INFO][5646] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie426b81909c ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-krcfl" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" Mar 17 17:28:46.117428 containerd[1745]: 2025-03-17 17:28:46.091 [INFO][5646] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-krcfl" 
WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" Mar 17 17:28:46.117428 containerd[1745]: 2025-03-17 17:28:46.092 [INFO][5646] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-krcfl" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0", GenerateName:"calico-apiserver-b557bfbcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"d007819c-c75a-48ce-80a6-dcf89e240e01", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b557bfbcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96", Pod:"calico-apiserver-b557bfbcb-krcfl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie426b81909c", MAC:"ee:4e:a7:6d:c0:ae", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:46.117428 containerd[1745]: 2025-03-17 17:28:46.113 [INFO][5646] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-krcfl" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--krcfl-eth0" Mar 17 17:28:46.162686 containerd[1745]: time="2025-03-17T17:28:46.161113225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d87wb,Uid:6743fb2a-96da-4c19-b66f-02242ba2b410,Namespace:kube-system,Attempt:5,} returns sandbox id \"977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d\"" Mar 17 17:28:46.170113 containerd[1745]: time="2025-03-17T17:28:46.170062641Z" level=info msg="CreateContainer within sandbox \"977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:28:46.313298 systemd-networkd[1440]: cali5cefdb4b4df: Link UP Mar 17 17:28:46.313526 systemd-networkd[1440]: cali5cefdb4b4df: Gained carrier Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.194 [INFO][5704] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0 coredns-6f6b679f8f- kube-system 0186ea99-0088-4544-b92c-7659bf548a6e 706 0 2025-03-17 17:27:36 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4152.2.2-a-f9f073f8c6 coredns-6f6b679f8f-4tzzq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5cefdb4b4df [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" 
Namespace="kube-system" Pod="coredns-6f6b679f8f-4tzzq" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.195 [INFO][5704] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Namespace="kube-system" Pod="coredns-6f6b679f8f-4tzzq" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.228 [INFO][5731] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" HandleID="k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.245 [INFO][5731] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" HandleID="k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400031b440), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4152.2.2-a-f9f073f8c6", "pod":"coredns-6f6b679f8f-4tzzq", "timestamp":"2025-03-17 17:28:46.228510589 +0000 UTC"}, Hostname:"ci-4152.2.2-a-f9f073f8c6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.245 [INFO][5731] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.245 [INFO][5731] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.246 [INFO][5731] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-f9f073f8c6' Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.249 [INFO][5731] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.256 [INFO][5731] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.262 [INFO][5731] ipam/ipam.go 489: Trying affinity for 192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.265 [INFO][5731] ipam/ipam.go 155: Attempting to load block cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.268 [INFO][5731] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.268 [INFO][5731] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.63.0/26 handle="k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.273 [INFO][5731] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611 Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.292 [INFO][5731] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.63.0/26 handle="k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" 
host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.306 [INFO][5731] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.63.5/26] block=192.168.63.0/26 handle="k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.306 [INFO][5731] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.63.5/26] handle="k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.306 [INFO][5731] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:28:46.332258 containerd[1745]: 2025-03-17 17:28:46.306 [INFO][5731] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.63.5/26] IPv6=[] ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" HandleID="k8s-pod-network.f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" Mar 17 17:28:46.333837 containerd[1745]: 2025-03-17 17:28:46.308 [INFO][5704] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Namespace="kube-system" Pod="coredns-6f6b679f8f-4tzzq" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"0186ea99-0088-4544-b92c-7659bf548a6e", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"", Pod:"coredns-6f6b679f8f-4tzzq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5cefdb4b4df", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:46.333837 containerd[1745]: 2025-03-17 17:28:46.309 [INFO][5704] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.63.5/32] ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Namespace="kube-system" Pod="coredns-6f6b679f8f-4tzzq" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" Mar 17 17:28:46.333837 containerd[1745]: 2025-03-17 17:28:46.309 [INFO][5704] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5cefdb4b4df ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Namespace="kube-system" Pod="coredns-6f6b679f8f-4tzzq" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" Mar 17 
17:28:46.333837 containerd[1745]: 2025-03-17 17:28:46.312 [INFO][5704] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Namespace="kube-system" Pod="coredns-6f6b679f8f-4tzzq" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" Mar 17 17:28:46.333837 containerd[1745]: 2025-03-17 17:28:46.312 [INFO][5704] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Namespace="kube-system" Pod="coredns-6f6b679f8f-4tzzq" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"0186ea99-0088-4544-b92c-7659bf548a6e", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611", Pod:"coredns-6f6b679f8f-4tzzq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", 
"ksa.kube-system.coredns"}, InterfaceName:"cali5cefdb4b4df", MAC:"ae:4c:38:8f:bb:7f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:46.333837 containerd[1745]: 2025-03-17 17:28:46.330 [INFO][5704] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611" Namespace="kube-system" Pod="coredns-6f6b679f8f-4tzzq" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-coredns--6f6b679f8f--4tzzq-eth0" Mar 17 17:28:46.389780 containerd[1745]: time="2025-03-17T17:28:46.389513526Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:28:46.389780 containerd[1745]: time="2025-03-17T17:28:46.389599766Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:28:46.389780 containerd[1745]: time="2025-03-17T17:28:46.389618446Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:46.392166 containerd[1745]: time="2025-03-17T17:28:46.392076091Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:46.423087 systemd[1]: Started cri-containerd-c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96.scope - libcontainer container c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96. 
Mar 17 17:28:46.433126 systemd-networkd[1440]: cali700129341e1: Link UP Mar 17 17:28:46.434314 systemd-networkd[1440]: cali700129341e1: Gained carrier Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.295 [INFO][5737] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0 calico-apiserver-b557bfbcb- calico-apiserver 213ca9a6-a07c-4c72-a108-6e622fdd0452 710 0 2025-03-17 17:27:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b557bfbcb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4152.2.2-a-f9f073f8c6 calico-apiserver-b557bfbcb-wssz4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali700129341e1 [] []}} ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-wssz4" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.295 [INFO][5737] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-wssz4" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.351 [INFO][5750] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" HandleID="k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" Mar 17 17:28:46.465601 
containerd[1745]: 2025-03-17 17:28:46.364 [INFO][5750] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" HandleID="k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d540), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4152.2.2-a-f9f073f8c6", "pod":"calico-apiserver-b557bfbcb-wssz4", "timestamp":"2025-03-17 17:28:46.351958537 +0000 UTC"}, Hostname:"ci-4152.2.2-a-f9f073f8c6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.364 [INFO][5750] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.364 [INFO][5750] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.364 [INFO][5750] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-f9f073f8c6' Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.366 [INFO][5750] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.372 [INFO][5750] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.379 [INFO][5750] ipam/ipam.go 489: Trying affinity for 192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.382 [INFO][5750] ipam/ipam.go 155: Attempting to load block cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.385 [INFO][5750] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.63.0/26 host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.385 [INFO][5750] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.63.0/26 handle="k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.387 [INFO][5750] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52 Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.405 [INFO][5750] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.63.0/26 handle="k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.422 [INFO][5750] ipam/ipam.go 1216: Successfully 
claimed IPs: [192.168.63.6/26] block=192.168.63.0/26 handle="k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.422 [INFO][5750] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.63.6/26] handle="k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" host="ci-4152.2.2-a-f9f073f8c6" Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.422 [INFO][5750] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:28:46.465601 containerd[1745]: 2025-03-17 17:28:46.422 [INFO][5750] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.63.6/26] IPv6=[] ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" HandleID="k8s-pod-network.17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Workload="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" Mar 17 17:28:46.467768 containerd[1745]: 2025-03-17 17:28:46.426 [INFO][5737] cni-plugin/k8s.go 386: Populated endpoint ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-wssz4" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0", GenerateName:"calico-apiserver-b557bfbcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"213ca9a6-a07c-4c72-a108-6e622fdd0452", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"b557bfbcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"", Pod:"calico-apiserver-b557bfbcb-wssz4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali700129341e1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:46.467768 containerd[1745]: 2025-03-17 17:28:46.427 [INFO][5737] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.63.6/32] ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-wssz4" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" Mar 17 17:28:46.467768 containerd[1745]: 2025-03-17 17:28:46.427 [INFO][5737] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali700129341e1 ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-wssz4" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" Mar 17 17:28:46.467768 containerd[1745]: 2025-03-17 17:28:46.435 [INFO][5737] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-wssz4" 
WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" Mar 17 17:28:46.467768 containerd[1745]: 2025-03-17 17:28:46.437 [INFO][5737] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-wssz4" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0", GenerateName:"calico-apiserver-b557bfbcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"213ca9a6-a07c-4c72-a108-6e622fdd0452", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 27, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b557bfbcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-f9f073f8c6", ContainerID:"17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52", Pod:"calico-apiserver-b557bfbcb-wssz4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali700129341e1", MAC:"6e:0b:6b:0c:c2:d4", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:28:46.467768 containerd[1745]: 2025-03-17 17:28:46.455 [INFO][5737] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52" Namespace="calico-apiserver" Pod="calico-apiserver-b557bfbcb-wssz4" WorkloadEndpoint="ci--4152.2.2--a--f9f073f8c6-k8s-calico--apiserver--b557bfbcb--wssz4-eth0" Mar 17 17:28:46.478382 containerd[1745]: time="2025-03-17T17:28:46.478324490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-krcfl,Uid:d007819c-c75a-48ce-80a6-dcf89e240e01,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96\"" Mar 17 17:28:46.707019 systemd-networkd[1440]: caliecaad3486c0: Gained IPv6LL Mar 17 17:28:47.602922 systemd-networkd[1440]: calie426b81909c: Gained IPv6LL Mar 17 17:28:47.858969 systemd-networkd[1440]: cali700129341e1: Gained IPv6LL Mar 17 17:28:47.977067 containerd[1745]: time="2025-03-17T17:28:47.976707616Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:28:47.977856 containerd[1745]: time="2025-03-17T17:28:47.977667297Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:28:47.977856 containerd[1745]: time="2025-03-17T17:28:47.977696897Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:47.977856 containerd[1745]: time="2025-03-17T17:28:47.977797258Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:48.002009 systemd[1]: Started cri-containerd-f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611.scope - libcontainer container f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611. Mar 17 17:28:48.034078 containerd[1745]: time="2025-03-17T17:28:48.034042161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4tzzq,Uid:0186ea99-0088-4544-b92c-7659bf548a6e,Namespace:kube-system,Attempt:5,} returns sandbox id \"f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611\"" Mar 17 17:28:48.039005 containerd[1745]: time="2025-03-17T17:28:48.038959331Z" level=info msg="CreateContainer within sandbox \"f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:28:48.136999 containerd[1745]: time="2025-03-17T17:28:48.136644671Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:28:48.136999 containerd[1745]: time="2025-03-17T17:28:48.136706111Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:28:48.136999 containerd[1745]: time="2025-03-17T17:28:48.136717191Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:48.136999 containerd[1745]: time="2025-03-17T17:28:48.136797191Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:28:48.156869 systemd[1]: run-containerd-runc-k8s.io-17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52-runc.NHNvOr.mount: Deactivated successfully. 
Mar 17 17:28:48.166022 systemd[1]: Started cri-containerd-17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52.scope - libcontainer container 17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52. Mar 17 17:28:48.197768 containerd[1745]: time="2025-03-17T17:28:48.197717624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b557bfbcb-wssz4,Uid:213ca9a6-a07c-4c72-a108-6e622fdd0452,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52\"" Mar 17 17:28:48.370987 systemd-networkd[1440]: cali5cefdb4b4df: Gained IPv6LL Mar 17 17:28:48.728889 containerd[1745]: time="2025-03-17T17:28:48.728828164Z" level=info msg="CreateContainer within sandbox \"977b3934c1e5d682da56828ce2b06e4b0b7e4471f70b1618c9a116d46da8a07d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"88eee2bc61ae9531bc8670fc3f60b24a04b3190a469547ef8edb91e6d5fd7862\"" Mar 17 17:28:48.731447 containerd[1745]: time="2025-03-17T17:28:48.729903846Z" level=info msg="StartContainer for \"88eee2bc61ae9531bc8670fc3f60b24a04b3190a469547ef8edb91e6d5fd7862\"" Mar 17 17:28:48.766047 systemd[1]: Started cri-containerd-88eee2bc61ae9531bc8670fc3f60b24a04b3190a469547ef8edb91e6d5fd7862.scope - libcontainer container 88eee2bc61ae9531bc8670fc3f60b24a04b3190a469547ef8edb91e6d5fd7862. 
Mar 17 17:28:49.213486 containerd[1745]: time="2025-03-17T17:28:49.213434378Z" level=info msg="StartContainer for \"88eee2bc61ae9531bc8670fc3f60b24a04b3190a469547ef8edb91e6d5fd7862\" returns successfully" Mar 17 17:28:49.533880 kubelet[3318]: I0317 17:28:49.533366 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-d87wb" podStartSLOduration=73.533350329 podStartE2EDuration="1m13.533350329s" podCreationTimestamp="2025-03-17 17:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:28:49.512939611 +0000 UTC m=+79.586318443" watchObservedRunningTime="2025-03-17 17:28:49.533350329 +0000 UTC m=+79.606729161" Mar 17 17:28:50.069465 containerd[1745]: time="2025-03-17T17:28:50.069363638Z" level=info msg="CreateContainer within sandbox \"f44ee2d1516feb6c992c6db84b5dd0023350bb8995d4abbc09097558e19ba611\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"2cc0ac596b689e33061d420b4fe995b9d0207ce87aced5a387f1d8be4b0f4fd3\"" Mar 17 17:28:50.070288 containerd[1745]: time="2025-03-17T17:28:50.070063759Z" level=info msg="StartContainer for \"2cc0ac596b689e33061d420b4fe995b9d0207ce87aced5a387f1d8be4b0f4fd3\"" Mar 17 17:28:50.099982 systemd[1]: Started cri-containerd-2cc0ac596b689e33061d420b4fe995b9d0207ce87aced5a387f1d8be4b0f4fd3.scope - libcontainer container 2cc0ac596b689e33061d420b4fe995b9d0207ce87aced5a387f1d8be4b0f4fd3. 
Mar 17 17:28:50.180449 containerd[1745]: time="2025-03-17T17:28:50.180401123Z" level=info msg="StartContainer for \"2cc0ac596b689e33061d420b4fe995b9d0207ce87aced5a387f1d8be4b0f4fd3\" returns successfully" Mar 17 17:28:50.521375 kubelet[3318]: I0317 17:28:50.520824 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-4tzzq" podStartSLOduration=74.520794947 podStartE2EDuration="1m14.520794947s" podCreationTimestamp="2025-03-17 17:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:28:50.520041506 +0000 UTC m=+80.593420338" watchObservedRunningTime="2025-03-17 17:28:50.520794947 +0000 UTC m=+80.594173779" Mar 17 17:28:51.630747 containerd[1745]: time="2025-03-17T17:28:51.630312774Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:51.675457 containerd[1745]: time="2025-03-17T17:28:51.675383817Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 17 17:28:51.723414 containerd[1745]: time="2025-03-17T17:28:51.723339024Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:51.771937 containerd[1745]: time="2025-03-17T17:28:51.771842713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:51.773044 containerd[1745]: time="2025-03-17T17:28:51.772501994Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", 
repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 10.239331801s" Mar 17 17:28:51.773044 containerd[1745]: time="2025-03-17T17:28:51.772540954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 17 17:28:51.773923 containerd[1745]: time="2025-03-17T17:28:51.773891837Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 17 17:28:51.775279 containerd[1745]: time="2025-03-17T17:28:51.775244279Z" level=info msg="CreateContainer within sandbox \"52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 17:28:52.162095 containerd[1745]: time="2025-03-17T17:28:52.161964186Z" level=info msg="CreateContainer within sandbox \"52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"26e784c171da9ed516abe4d8d76fcdde73eed26457d44bee3a83a2fbd94bc70e\"" Mar 17 17:28:52.162715 containerd[1745]: time="2025-03-17T17:28:52.162686827Z" level=info msg="StartContainer for \"26e784c171da9ed516abe4d8d76fcdde73eed26457d44bee3a83a2fbd94bc70e\"" Mar 17 17:28:52.201020 systemd[1]: Started cri-containerd-26e784c171da9ed516abe4d8d76fcdde73eed26457d44bee3a83a2fbd94bc70e.scope - libcontainer container 26e784c171da9ed516abe4d8d76fcdde73eed26457d44bee3a83a2fbd94bc70e. 
Mar 17 17:28:52.273231 containerd[1745]: time="2025-03-17T17:28:52.273145829Z" level=info msg="StartContainer for \"26e784c171da9ed516abe4d8d76fcdde73eed26457d44bee3a83a2fbd94bc70e\" returns successfully" Mar 17 17:28:57.361925 containerd[1745]: time="2025-03-17T17:28:57.361860687Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:57.364836 containerd[1745]: time="2025-03-17T17:28:57.364675492Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 17 17:28:57.370271 containerd[1745]: time="2025-03-17T17:28:57.370224782Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:57.376722 containerd[1745]: time="2025-03-17T17:28:57.376654994Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:57.377454 containerd[1745]: time="2025-03-17T17:28:57.377320355Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 5.603396358s" Mar 17 17:28:57.377454 containerd[1745]: time="2025-03-17T17:28:57.377355275Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 17 17:28:57.379093 containerd[1745]: time="2025-03-17T17:28:57.378904918Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:28:57.405258 containerd[1745]: time="2025-03-17T17:28:57.405039926Z" level=info msg="CreateContainer within sandbox \"c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 17:28:57.462916 containerd[1745]: time="2025-03-17T17:28:57.462868391Z" level=info msg="CreateContainer within sandbox \"c9ea2b4177e6fa37d2204856a46e693dfadb77867d51918cc8e3f12fe740dae9\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a785b66caf28fe4fe06d3606c6fca291adaf57fc6320dfebec57893925c85d04\"" Mar 17 17:28:57.463883 containerd[1745]: time="2025-03-17T17:28:57.463845313Z" level=info msg="StartContainer for \"a785b66caf28fe4fe06d3606c6fca291adaf57fc6320dfebec57893925c85d04\"" Mar 17 17:28:57.497014 systemd[1]: Started cri-containerd-a785b66caf28fe4fe06d3606c6fca291adaf57fc6320dfebec57893925c85d04.scope - libcontainer container a785b66caf28fe4fe06d3606c6fca291adaf57fc6320dfebec57893925c85d04. 
Mar 17 17:28:57.542078 containerd[1745]: time="2025-03-17T17:28:57.542019176Z" level=info msg="StartContainer for \"a785b66caf28fe4fe06d3606c6fca291adaf57fc6320dfebec57893925c85d04\" returns successfully" Mar 17 17:28:58.561162 kubelet[3318]: I0317 17:28:58.560951 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6fcc8c87fb-p2fc6" podStartSLOduration=61.526778832 podStartE2EDuration="1m13.560929552s" podCreationTimestamp="2025-03-17 17:27:45 +0000 UTC" firstStartedPulling="2025-03-17 17:28:45.344099997 +0000 UTC m=+75.417478829" lastFinishedPulling="2025-03-17 17:28:57.378250717 +0000 UTC m=+87.451629549" observedRunningTime="2025-03-17 17:28:58.559178749 +0000 UTC m=+88.632557741" watchObservedRunningTime="2025-03-17 17:28:58.560929552 +0000 UTC m=+88.634308384" Mar 17 17:28:59.604109 containerd[1745]: time="2025-03-17T17:28:59.604055115Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:59.607443 containerd[1745]: time="2025-03-17T17:28:59.607262481Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 17 17:28:59.612662 containerd[1745]: time="2025-03-17T17:28:59.611553569Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:59.616644 containerd[1745]: time="2025-03-17T17:28:59.616604258Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:28:59.618035 containerd[1745]: time="2025-03-17T17:28:59.618003740Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id 
\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 2.239064742s" Mar 17 17:28:59.618085 containerd[1745]: time="2025-03-17T17:28:59.618040140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 17 17:28:59.628237 containerd[1745]: time="2025-03-17T17:28:59.628017958Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:28:59.635505 containerd[1745]: time="2025-03-17T17:28:59.635326892Z" level=info msg="CreateContainer within sandbox \"c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 17:28:59.682963 containerd[1745]: time="2025-03-17T17:28:59.682882977Z" level=info msg="CreateContainer within sandbox \"c40b579782430541ada843c7e1d93a268525afd4187ccde951f75f43364bfe96\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3550368dca6428a5157368de0a2303edc6959993fcf64a60dbc13834ff2b6f48\"" Mar 17 17:28:59.685220 containerd[1745]: time="2025-03-17T17:28:59.685160462Z" level=info msg="StartContainer for \"3550368dca6428a5157368de0a2303edc6959993fcf64a60dbc13834ff2b6f48\"" Mar 17 17:28:59.715007 systemd[1]: Started cri-containerd-3550368dca6428a5157368de0a2303edc6959993fcf64a60dbc13834ff2b6f48.scope - libcontainer container 3550368dca6428a5157368de0a2303edc6959993fcf64a60dbc13834ff2b6f48. 
Mar 17 17:28:59.760550 containerd[1745]: time="2025-03-17T17:28:59.760505838Z" level=info msg="StartContainer for \"3550368dca6428a5157368de0a2303edc6959993fcf64a60dbc13834ff2b6f48\" returns successfully" Mar 17 17:29:00.063344 containerd[1745]: time="2025-03-17T17:29:00.062537623Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:00.070132 containerd[1745]: time="2025-03-17T17:29:00.070075196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 17 17:29:00.072673 containerd[1745]: time="2025-03-17T17:29:00.072495641Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 444.439203ms" Mar 17 17:29:00.072673 containerd[1745]: time="2025-03-17T17:29:00.072528801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 17 17:29:00.074211 containerd[1745]: time="2025-03-17T17:29:00.073878443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 17 17:29:00.079785 containerd[1745]: time="2025-03-17T17:29:00.079612534Z" level=info msg="CreateContainer within sandbox \"17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 17:29:00.175071 containerd[1745]: time="2025-03-17T17:29:00.175015186Z" level=info msg="CreateContainer within sandbox \"17d728edae9cb1b4816bc53ada107edf73c1e7c55e9374e1ea1f61dfeda7ee52\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8a47a85fe9e6729dd12d6ad358f6df23372b4a1442ead91dc04bd158829c740d\"" Mar 17 17:29:00.177054 containerd[1745]: time="2025-03-17T17:29:00.176768709Z" level=info msg="StartContainer for \"8a47a85fe9e6729dd12d6ad358f6df23372b4a1442ead91dc04bd158829c740d\"" Mar 17 17:29:00.209851 systemd[1]: Started cri-containerd-8a47a85fe9e6729dd12d6ad358f6df23372b4a1442ead91dc04bd158829c740d.scope - libcontainer container 8a47a85fe9e6729dd12d6ad358f6df23372b4a1442ead91dc04bd158829c740d. Mar 17 17:29:01.075537 containerd[1745]: time="2025-03-17T17:29:01.075217211Z" level=info msg="StartContainer for \"8a47a85fe9e6729dd12d6ad358f6df23372b4a1442ead91dc04bd158829c740d\" returns successfully" Mar 17 17:29:01.177830 kubelet[3318]: I0317 17:29:01.176764 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b557bfbcb-krcfl" podStartSLOduration=65.032228895 podStartE2EDuration="1m18.176745834s" podCreationTimestamp="2025-03-17 17:27:43 +0000 UTC" firstStartedPulling="2025-03-17 17:28:46.482515898 +0000 UTC m=+76.555894730" lastFinishedPulling="2025-03-17 17:28:59.627032877 +0000 UTC m=+89.700411669" observedRunningTime="2025-03-17 17:29:01.137690564 +0000 UTC m=+91.211069516" watchObservedRunningTime="2025-03-17 17:29:01.176745834 +0000 UTC m=+91.250124666" Mar 17 17:29:01.931505 kubelet[3318]: I0317 17:29:01.931108 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b557bfbcb-wssz4" podStartSLOduration=67.057034421 podStartE2EDuration="1m18.931090276s" podCreationTimestamp="2025-03-17 17:27:43 +0000 UTC" firstStartedPulling="2025-03-17 17:28:48.199402507 +0000 UTC m=+78.272781339" lastFinishedPulling="2025-03-17 17:29:00.073458362 +0000 UTC m=+90.146837194" observedRunningTime="2025-03-17 17:29:01.178366597 +0000 UTC m=+91.251745429" watchObservedRunningTime="2025-03-17 17:29:01.931090276 +0000 UTC 
m=+92.004469108" Mar 17 17:29:03.224669 containerd[1745]: time="2025-03-17T17:29:03.224350290Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:03.227266 containerd[1745]: time="2025-03-17T17:29:03.227225975Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 17 17:29:03.231524 containerd[1745]: time="2025-03-17T17:29:03.231451023Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:03.238834 containerd[1745]: time="2025-03-17T17:29:03.237250434Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:03.239534 containerd[1745]: time="2025-03-17T17:29:03.239476638Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 3.165569955s" Mar 17 17:29:03.239534 containerd[1745]: time="2025-03-17T17:29:03.239525278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 17 17:29:03.243493 containerd[1745]: time="2025-03-17T17:29:03.243456805Z" level=info msg="CreateContainer within sandbox \"52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 17:29:03.287978 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2230775161.mount: Deactivated successfully. Mar 17 17:29:03.303577 containerd[1745]: time="2025-03-17T17:29:03.303531273Z" level=info msg="CreateContainer within sandbox \"52afe93a358b1242d2581959af81724f6f0e8cd33faa150c96efddd82a966ef3\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4e48eaaf382d819e1dcd15b6220cb445eb858e89b49b0ecf561d558230c335ad\"" Mar 17 17:29:03.304505 containerd[1745]: time="2025-03-17T17:29:03.304199674Z" level=info msg="StartContainer for \"4e48eaaf382d819e1dcd15b6220cb445eb858e89b49b0ecf561d558230c335ad\"" Mar 17 17:29:03.334412 systemd[1]: run-containerd-runc-k8s.io-4e48eaaf382d819e1dcd15b6220cb445eb858e89b49b0ecf561d558230c335ad-runc.wnt6zf.mount: Deactivated successfully. Mar 17 17:29:03.344020 systemd[1]: Started cri-containerd-4e48eaaf382d819e1dcd15b6220cb445eb858e89b49b0ecf561d558230c335ad.scope - libcontainer container 4e48eaaf382d819e1dcd15b6220cb445eb858e89b49b0ecf561d558230c335ad. 
Mar 17 17:29:03.379518 containerd[1745]: time="2025-03-17T17:29:03.379378810Z" level=info msg="StartContainer for \"4e48eaaf382d819e1dcd15b6220cb445eb858e89b49b0ecf561d558230c335ad\" returns successfully" Mar 17 17:29:04.145271 kubelet[3318]: I0317 17:29:04.145188 3318 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zckwm" podStartSLOduration=57.435862863 podStartE2EDuration="1m19.145171712s" podCreationTimestamp="2025-03-17 17:27:45 +0000 UTC" firstStartedPulling="2025-03-17 17:28:41.532815473 +0000 UTC m=+71.606194305" lastFinishedPulling="2025-03-17 17:29:03.242124322 +0000 UTC m=+93.315503154" observedRunningTime="2025-03-17 17:29:04.14360631 +0000 UTC m=+94.216985142" watchObservedRunningTime="2025-03-17 17:29:04.145171712 +0000 UTC m=+94.218550544" Mar 17 17:29:04.157059 kubelet[3318]: I0317 17:29:04.156887 3318 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 17:29:04.157059 kubelet[3318]: I0317 17:29:04.156946 3318 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 17:29:09.578210 systemd[1]: run-containerd-runc-k8s.io-a785b66caf28fe4fe06d3606c6fca291adaf57fc6320dfebec57893925c85d04-runc.psXLPX.mount: Deactivated successfully. 
Mar 17 17:29:39.976794 containerd[1745]: time="2025-03-17T17:29:39.976672071Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\"" Mar 17 17:29:39.976794 containerd[1745]: time="2025-03-17T17:29:39.976794711Z" level=info msg="TearDown network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" successfully" Mar 17 17:29:39.976794 containerd[1745]: time="2025-03-17T17:29:39.976825831Z" level=info msg="StopPodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" returns successfully" Mar 17 17:29:39.977725 containerd[1745]: time="2025-03-17T17:29:39.977659273Z" level=info msg="RemovePodSandbox for \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\"" Mar 17 17:29:39.977725 containerd[1745]: time="2025-03-17T17:29:39.977695833Z" level=info msg="Forcibly stopping sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\"" Mar 17 17:29:39.977906 containerd[1745]: time="2025-03-17T17:29:39.977767953Z" level=info msg="TearDown network for sandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" successfully" Mar 17 17:29:39.988231 containerd[1745]: time="2025-03-17T17:29:39.988185729Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:29:39.988332 containerd[1745]: time="2025-03-17T17:29:39.988265049Z" level=info msg="RemovePodSandbox \"c26b964de11dbec8777e93c0bba39e472a8934ec2792344a9ff2ca70bb0a0156\" returns successfully" Mar 17 17:29:39.988921 containerd[1745]: time="2025-03-17T17:29:39.988712449Z" level=info msg="StopPodSandbox for \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\"" Mar 17 17:29:39.988921 containerd[1745]: time="2025-03-17T17:29:39.988848570Z" level=info msg="TearDown network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" successfully" Mar 17 17:29:39.988921 containerd[1745]: time="2025-03-17T17:29:39.988860370Z" level=info msg="StopPodSandbox for \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" returns successfully" Mar 17 17:29:39.989154 containerd[1745]: time="2025-03-17T17:29:39.989123850Z" level=info msg="RemovePodSandbox for \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\"" Mar 17 17:29:39.989154 containerd[1745]: time="2025-03-17T17:29:39.989156690Z" level=info msg="Forcibly stopping sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\"" Mar 17 17:29:39.989237 containerd[1745]: time="2025-03-17T17:29:39.989219810Z" level=info msg="TearDown network for sandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" successfully" Mar 17 17:29:39.999141 containerd[1745]: time="2025-03-17T17:29:39.999052745Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:29:39.999325 containerd[1745]: time="2025-03-17T17:29:39.999150585Z" level=info msg="RemovePodSandbox \"3c97be9389e05e5164d6be35c5b29a04c5b885031e77a23e2dc20621de5596bf\" returns successfully" Mar 17 17:29:39.999894 containerd[1745]: time="2025-03-17T17:29:39.999659746Z" level=info msg="StopPodSandbox for \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\"" Mar 17 17:29:39.999894 containerd[1745]: time="2025-03-17T17:29:39.999768346Z" level=info msg="TearDown network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\" successfully" Mar 17 17:29:39.999894 containerd[1745]: time="2025-03-17T17:29:39.999777426Z" level=info msg="StopPodSandbox for \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\" returns successfully" Mar 17 17:29:40.000957 containerd[1745]: time="2025-03-17T17:29:40.000234027Z" level=info msg="RemovePodSandbox for \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\"" Mar 17 17:29:40.000957 containerd[1745]: time="2025-03-17T17:29:40.000268387Z" level=info msg="Forcibly stopping sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\"" Mar 17 17:29:40.000957 containerd[1745]: time="2025-03-17T17:29:40.000351507Z" level=info msg="TearDown network for sandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\" successfully" Mar 17 17:29:40.010850 containerd[1745]: time="2025-03-17T17:29:40.010791163Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:29:40.010945 containerd[1745]: time="2025-03-17T17:29:40.010886083Z" level=info msg="RemovePodSandbox \"e3c3b014614f6c15af6b83e6e34af9d4a492c895848cc457a752db9ccc880e66\" returns successfully" Mar 17 17:29:40.011486 containerd[1745]: time="2025-03-17T17:29:40.011340804Z" level=info msg="StopPodSandbox for \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\"" Mar 17 17:29:40.011486 containerd[1745]: time="2025-03-17T17:29:40.011434524Z" level=info msg="TearDown network for sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\" successfully" Mar 17 17:29:40.011486 containerd[1745]: time="2025-03-17T17:29:40.011444004Z" level=info msg="StopPodSandbox for \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\" returns successfully" Mar 17 17:29:40.012245 containerd[1745]: time="2025-03-17T17:29:40.012035485Z" level=info msg="RemovePodSandbox for \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\"" Mar 17 17:29:40.012245 containerd[1745]: time="2025-03-17T17:29:40.012101165Z" level=info msg="Forcibly stopping sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\"" Mar 17 17:29:40.012245 containerd[1745]: time="2025-03-17T17:29:40.012176245Z" level=info msg="TearDown network for sandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\" successfully" Mar 17 17:29:40.022205 containerd[1745]: time="2025-03-17T17:29:40.022149100Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:29:40.022512 containerd[1745]: time="2025-03-17T17:29:40.022221381Z" level=info msg="RemovePodSandbox \"891d369a0a1054659e0fc7c145f261e94614780d535aec4ff12d5ec8f1a168e4\" returns successfully" Mar 17 17:29:40.022737 containerd[1745]: time="2025-03-17T17:29:40.022713181Z" level=info msg="StopPodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\"" Mar 17 17:29:40.022858 containerd[1745]: time="2025-03-17T17:29:40.022841341Z" level=info msg="TearDown network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" successfully" Mar 17 17:29:40.022908 containerd[1745]: time="2025-03-17T17:29:40.022857542Z" level=info msg="StopPodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" returns successfully" Mar 17 17:29:40.023909 containerd[1745]: time="2025-03-17T17:29:40.023159662Z" level=info msg="RemovePodSandbox for \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\"" Mar 17 17:29:40.023909 containerd[1745]: time="2025-03-17T17:29:40.023185822Z" level=info msg="Forcibly stopping sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\"" Mar 17 17:29:40.023909 containerd[1745]: time="2025-03-17T17:29:40.023247022Z" level=info msg="TearDown network for sandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" successfully" Mar 17 17:29:40.033314 containerd[1745]: time="2025-03-17T17:29:40.033269397Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:29:40.033416 containerd[1745]: time="2025-03-17T17:29:40.033343638Z" level=info msg="RemovePodSandbox \"8ba9e5d31c84f5fe57490e1f150bbdc65a5da44adaf0e23269e45be85d17a979\" returns successfully" Mar 17 17:29:40.034092 containerd[1745]: time="2025-03-17T17:29:40.033937158Z" level=info msg="StopPodSandbox for \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\"" Mar 17 17:29:40.034092 containerd[1745]: time="2025-03-17T17:29:40.034038399Z" level=info msg="TearDown network for sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\" successfully" Mar 17 17:29:40.034092 containerd[1745]: time="2025-03-17T17:29:40.034049239Z" level=info msg="StopPodSandbox for \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\" returns successfully" Mar 17 17:29:40.034372 containerd[1745]: time="2025-03-17T17:29:40.034342879Z" level=info msg="RemovePodSandbox for \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\"" Mar 17 17:29:40.034420 containerd[1745]: time="2025-03-17T17:29:40.034375919Z" level=info msg="Forcibly stopping sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\"" Mar 17 17:29:40.034465 containerd[1745]: time="2025-03-17T17:29:40.034447119Z" level=info msg="TearDown network for sandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\" successfully" Mar 17 17:29:40.056406 containerd[1745]: time="2025-03-17T17:29:40.056300113Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:29:40.056406 containerd[1745]: time="2025-03-17T17:29:40.056373593Z" level=info msg="RemovePodSandbox \"09f72d67951024af5bda6f7585999f19e8e478a412b3ce15f40949f443044269\" returns successfully"
Mar 17 17:29:40.056941 containerd[1745]: time="2025-03-17T17:29:40.056891073Z" level=info msg="StopPodSandbox for \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\""
Mar 17 17:29:40.057148 containerd[1745]: time="2025-03-17T17:29:40.057128874Z" level=info msg="TearDown network for sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\" successfully"
Mar 17 17:29:40.057148 containerd[1745]: time="2025-03-17T17:29:40.057146954Z" level=info msg="StopPodSandbox for \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\" returns successfully"
Mar 17 17:29:40.057436 containerd[1745]: time="2025-03-17T17:29:40.057412394Z" level=info msg="RemovePodSandbox for \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\""
Mar 17 17:29:40.057502 containerd[1745]: time="2025-03-17T17:29:40.057439994Z" level=info msg="Forcibly stopping sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\""
Mar 17 17:29:40.057502 containerd[1745]: time="2025-03-17T17:29:40.057497554Z" level=info msg="TearDown network for sandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\" successfully"
Mar 17 17:29:40.071660 containerd[1745]: time="2025-03-17T17:29:40.071606696Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.071782 containerd[1745]: time="2025-03-17T17:29:40.071712496Z" level=info msg="RemovePodSandbox \"e340c4f0868fa7913e44ea9ca9d16c3170eedaed4d3328a7e11d80ecba071f89\" returns successfully"
Mar 17 17:29:40.072377 containerd[1745]: time="2025-03-17T17:29:40.072184897Z" level=info msg="StopPodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\""
Mar 17 17:29:40.072377 containerd[1745]: time="2025-03-17T17:29:40.072281697Z" level=info msg="TearDown network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" successfully"
Mar 17 17:29:40.072377 containerd[1745]: time="2025-03-17T17:29:40.072290817Z" level=info msg="StopPodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" returns successfully"
Mar 17 17:29:40.072917 containerd[1745]: time="2025-03-17T17:29:40.072743578Z" level=info msg="RemovePodSandbox for \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\""
Mar 17 17:29:40.072917 containerd[1745]: time="2025-03-17T17:29:40.072770098Z" level=info msg="Forcibly stopping sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\""
Mar 17 17:29:40.072917 containerd[1745]: time="2025-03-17T17:29:40.072874578Z" level=info msg="TearDown network for sandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" successfully"
Mar 17 17:29:40.092848 containerd[1745]: time="2025-03-17T17:29:40.092771288Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.093020 containerd[1745]: time="2025-03-17T17:29:40.092859048Z" level=info msg="RemovePodSandbox \"898221dd89b5eab635e5031ddde953ca83cd911a8c439b5f31b0a6f7f01bae93\" returns successfully"
Mar 17 17:29:40.093381 containerd[1745]: time="2025-03-17T17:29:40.093346529Z" level=info msg="StopPodSandbox for \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\""
Mar 17 17:29:40.093624 containerd[1745]: time="2025-03-17T17:29:40.093548049Z" level=info msg="TearDown network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\" successfully"
Mar 17 17:29:40.093624 containerd[1745]: time="2025-03-17T17:29:40.093565649Z" level=info msg="StopPodSandbox for \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\" returns successfully"
Mar 17 17:29:40.094655 containerd[1745]: time="2025-03-17T17:29:40.093904890Z" level=info msg="RemovePodSandbox for \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\""
Mar 17 17:29:40.094655 containerd[1745]: time="2025-03-17T17:29:40.093934490Z" level=info msg="Forcibly stopping sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\""
Mar 17 17:29:40.094655 containerd[1745]: time="2025-03-17T17:29:40.093993850Z" level=info msg="TearDown network for sandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\" successfully"
Mar 17 17:29:40.113162 containerd[1745]: time="2025-03-17T17:29:40.113035199Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.113162 containerd[1745]: time="2025-03-17T17:29:40.113118439Z" level=info msg="RemovePodSandbox \"ba9e4e98dc3f47642b47b11c94ae2d4d65716a06fe87625039dc796201290d78\" returns successfully"
Mar 17 17:29:40.114322 containerd[1745]: time="2025-03-17T17:29:40.113582680Z" level=info msg="StopPodSandbox for \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\""
Mar 17 17:29:40.114322 containerd[1745]: time="2025-03-17T17:29:40.113679560Z" level=info msg="TearDown network for sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\" successfully"
Mar 17 17:29:40.114322 containerd[1745]: time="2025-03-17T17:29:40.113689800Z" level=info msg="StopPodSandbox for \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\" returns successfully"
Mar 17 17:29:40.114858 containerd[1745]: time="2025-03-17T17:29:40.114484441Z" level=info msg="RemovePodSandbox for \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\""
Mar 17 17:29:40.114858 containerd[1745]: time="2025-03-17T17:29:40.114515841Z" level=info msg="Forcibly stopping sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\""
Mar 17 17:29:40.114858 containerd[1745]: time="2025-03-17T17:29:40.114625921Z" level=info msg="TearDown network for sandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\" successfully"
Mar 17 17:29:40.135962 containerd[1745]: time="2025-03-17T17:29:40.135900234Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.136174 containerd[1745]: time="2025-03-17T17:29:40.135976994Z" level=info msg="RemovePodSandbox \"82ec87fe1833a6221aa409409b3435e7bdef09cf52bc3c797a27a9872c2374d0\" returns successfully"
Mar 17 17:29:40.136597 containerd[1745]: time="2025-03-17T17:29:40.136355235Z" level=info msg="StopPodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\""
Mar 17 17:29:40.136597 containerd[1745]: time="2025-03-17T17:29:40.136449035Z" level=info msg="TearDown network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" successfully"
Mar 17 17:29:40.136597 containerd[1745]: time="2025-03-17T17:29:40.136485515Z" level=info msg="StopPodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" returns successfully"
Mar 17 17:29:40.137597 containerd[1745]: time="2025-03-17T17:29:40.136849755Z" level=info msg="RemovePodSandbox for \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\""
Mar 17 17:29:40.137597 containerd[1745]: time="2025-03-17T17:29:40.136902595Z" level=info msg="Forcibly stopping sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\""
Mar 17 17:29:40.137597 containerd[1745]: time="2025-03-17T17:29:40.136967716Z" level=info msg="TearDown network for sandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" successfully"
Mar 17 17:29:40.154080 containerd[1745]: time="2025-03-17T17:29:40.154034862Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.154210 containerd[1745]: time="2025-03-17T17:29:40.154113902Z" level=info msg="RemovePodSandbox \"d43e13da1799911d07af1bdc544260c1609de92c2c0076adf97ac2dd8346f226\" returns successfully"
Mar 17 17:29:40.155275 containerd[1745]: time="2025-03-17T17:29:40.154671143Z" level=info msg="StopPodSandbox for \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\""
Mar 17 17:29:40.155432 containerd[1745]: time="2025-03-17T17:29:40.155412744Z" level=info msg="TearDown network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\" successfully"
Mar 17 17:29:40.155571 containerd[1745]: time="2025-03-17T17:29:40.155473304Z" level=info msg="StopPodSandbox for \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\" returns successfully"
Mar 17 17:29:40.156851 containerd[1745]: time="2025-03-17T17:29:40.155961064Z" level=info msg="RemovePodSandbox for \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\""
Mar 17 17:29:40.156851 containerd[1745]: time="2025-03-17T17:29:40.155986905Z" level=info msg="Forcibly stopping sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\""
Mar 17 17:29:40.156851 containerd[1745]: time="2025-03-17T17:29:40.156057345Z" level=info msg="TearDown network for sandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\" successfully"
Mar 17 17:29:40.166652 containerd[1745]: time="2025-03-17T17:29:40.166611241Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.166889 containerd[1745]: time="2025-03-17T17:29:40.166869961Z" level=info msg="RemovePodSandbox \"037b11e5702c38917352a3b8edd4e578b093e45e63df3a69a8809605e162311f\" returns successfully"
Mar 17 17:29:40.167364 containerd[1745]: time="2025-03-17T17:29:40.167344242Z" level=info msg="StopPodSandbox for \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\""
Mar 17 17:29:40.167600 containerd[1745]: time="2025-03-17T17:29:40.167543242Z" level=info msg="TearDown network for sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\" successfully"
Mar 17 17:29:40.167600 containerd[1745]: time="2025-03-17T17:29:40.167558722Z" level=info msg="StopPodSandbox for \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\" returns successfully"
Mar 17 17:29:40.167992 containerd[1745]: time="2025-03-17T17:29:40.167973043Z" level=info msg="RemovePodSandbox for \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\""
Mar 17 17:29:40.168288 containerd[1745]: time="2025-03-17T17:29:40.168163603Z" level=info msg="Forcibly stopping sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\""
Mar 17 17:29:40.168288 containerd[1745]: time="2025-03-17T17:29:40.168234963Z" level=info msg="TearDown network for sandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\" successfully"
Mar 17 17:29:40.178204 containerd[1745]: time="2025-03-17T17:29:40.178153218Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.178344 containerd[1745]: time="2025-03-17T17:29:40.178228778Z" level=info msg="RemovePodSandbox \"2ea938c1a70b2f76288dfb4b5db88afd1a1a6076dd3f878d38ded03f6a69540f\" returns successfully"
Mar 17 17:29:40.178734 containerd[1745]: time="2025-03-17T17:29:40.178710539Z" level=info msg="StopPodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\""
Mar 17 17:29:40.178874 containerd[1745]: time="2025-03-17T17:29:40.178852459Z" level=info msg="TearDown network for sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" successfully"
Mar 17 17:29:40.178874 containerd[1745]: time="2025-03-17T17:29:40.178870099Z" level=info msg="StopPodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" returns successfully"
Mar 17 17:29:40.179206 containerd[1745]: time="2025-03-17T17:29:40.179184180Z" level=info msg="RemovePodSandbox for \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\""
Mar 17 17:29:40.179827 containerd[1745]: time="2025-03-17T17:29:40.179341900Z" level=info msg="Forcibly stopping sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\""
Mar 17 17:29:40.179827 containerd[1745]: time="2025-03-17T17:29:40.179408420Z" level=info msg="TearDown network for sandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" successfully"
Mar 17 17:29:40.187574 containerd[1745]: time="2025-03-17T17:29:40.187527393Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.187692 containerd[1745]: time="2025-03-17T17:29:40.187601753Z" level=info msg="RemovePodSandbox \"ed1684477eef9d28513a64629b3cd35371371aa375ca367b7b339965f23a2d26\" returns successfully"
Mar 17 17:29:40.188096 containerd[1745]: time="2025-03-17T17:29:40.188068793Z" level=info msg="StopPodSandbox for \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\""
Mar 17 17:29:40.188187 containerd[1745]: time="2025-03-17T17:29:40.188166634Z" level=info msg="TearDown network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\" successfully"
Mar 17 17:29:40.188187 containerd[1745]: time="2025-03-17T17:29:40.188182194Z" level=info msg="StopPodSandbox for \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\" returns successfully"
Mar 17 17:29:40.189115 containerd[1745]: time="2025-03-17T17:29:40.188541634Z" level=info msg="RemovePodSandbox for \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\""
Mar 17 17:29:40.189115 containerd[1745]: time="2025-03-17T17:29:40.188572234Z" level=info msg="Forcibly stopping sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\""
Mar 17 17:29:40.189115 containerd[1745]: time="2025-03-17T17:29:40.188633594Z" level=info msg="TearDown network for sandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\" successfully"
Mar 17 17:29:40.197307 containerd[1745]: time="2025-03-17T17:29:40.197259447Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.197413 containerd[1745]: time="2025-03-17T17:29:40.197333128Z" level=info msg="RemovePodSandbox \"22f6abfe86a439aa61139f72410213fab8b1761d60c367f25fa765ce76a64054\" returns successfully"
Mar 17 17:29:40.198085 containerd[1745]: time="2025-03-17T17:29:40.197918408Z" level=info msg="StopPodSandbox for \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\""
Mar 17 17:29:40.198085 containerd[1745]: time="2025-03-17T17:29:40.198015249Z" level=info msg="TearDown network for sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\" successfully"
Mar 17 17:29:40.198085 containerd[1745]: time="2025-03-17T17:29:40.198024249Z" level=info msg="StopPodSandbox for \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\" returns successfully"
Mar 17 17:29:40.198621 containerd[1745]: time="2025-03-17T17:29:40.198575609Z" level=info msg="RemovePodSandbox for \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\""
Mar 17 17:29:40.198621 containerd[1745]: time="2025-03-17T17:29:40.198601050Z" level=info msg="Forcibly stopping sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\""
Mar 17 17:29:40.199397 containerd[1745]: time="2025-03-17T17:29:40.198764770Z" level=info msg="TearDown network for sandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\" successfully"
Mar 17 17:29:40.210309 containerd[1745]: time="2025-03-17T17:29:40.210270467Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.210479 containerd[1745]: time="2025-03-17T17:29:40.210329307Z" level=info msg="RemovePodSandbox \"966f6c9ab9eb832e0c492bc1dcc97c1b2911c14c91cec471aeae39a89a83fd6e\" returns successfully"
Mar 17 17:29:40.210746 containerd[1745]: time="2025-03-17T17:29:40.210722788Z" level=info msg="StopPodSandbox for \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\""
Mar 17 17:29:40.210869 containerd[1745]: time="2025-03-17T17:29:40.210847268Z" level=info msg="TearDown network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" successfully"
Mar 17 17:29:40.210869 containerd[1745]: time="2025-03-17T17:29:40.210865908Z" level=info msg="StopPodSandbox for \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" returns successfully"
Mar 17 17:29:40.211751 containerd[1745]: time="2025-03-17T17:29:40.211166949Z" level=info msg="RemovePodSandbox for \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\""
Mar 17 17:29:40.211751 containerd[1745]: time="2025-03-17T17:29:40.211191989Z" level=info msg="Forcibly stopping sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\""
Mar 17 17:29:40.211751 containerd[1745]: time="2025-03-17T17:29:40.211253269Z" level=info msg="TearDown network for sandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" successfully"
Mar 17 17:29:40.221638 containerd[1745]: time="2025-03-17T17:29:40.221588925Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.221745 containerd[1745]: time="2025-03-17T17:29:40.221664645Z" level=info msg="RemovePodSandbox \"3fda14f08112040a4040e0f8b1eff024c067300dde1fb86868ea9f43c0b79f90\" returns successfully"
Mar 17 17:29:40.222281 containerd[1745]: time="2025-03-17T17:29:40.222120645Z" level=info msg="StopPodSandbox for \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\""
Mar 17 17:29:40.222281 containerd[1745]: time="2025-03-17T17:29:40.222226646Z" level=info msg="TearDown network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\" successfully"
Mar 17 17:29:40.222281 containerd[1745]: time="2025-03-17T17:29:40.222235606Z" level=info msg="StopPodSandbox for \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\" returns successfully"
Mar 17 17:29:40.222834 containerd[1745]: time="2025-03-17T17:29:40.222730246Z" level=info msg="RemovePodSandbox for \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\""
Mar 17 17:29:40.222834 containerd[1745]: time="2025-03-17T17:29:40.222766486Z" level=info msg="Forcibly stopping sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\""
Mar 17 17:29:40.223100 containerd[1745]: time="2025-03-17T17:29:40.222976007Z" level=info msg="TearDown network for sandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\" successfully"
Mar 17 17:29:40.237486 containerd[1745]: time="2025-03-17T17:29:40.237319909Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.237486 containerd[1745]: time="2025-03-17T17:29:40.237399349Z" level=info msg="RemovePodSandbox \"38cdfe844ee8a7e40ddf1c0d56af43782eedbb3dc19223c23e6662430043826f\" returns successfully"
Mar 17 17:29:40.238193 containerd[1745]: time="2025-03-17T17:29:40.238022310Z" level=info msg="StopPodSandbox for \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\""
Mar 17 17:29:40.238193 containerd[1745]: time="2025-03-17T17:29:40.238132350Z" level=info msg="TearDown network for sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\" successfully"
Mar 17 17:29:40.238193 containerd[1745]: time="2025-03-17T17:29:40.238142150Z" level=info msg="StopPodSandbox for \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\" returns successfully"
Mar 17 17:29:40.238631 containerd[1745]: time="2025-03-17T17:29:40.238551910Z" level=info msg="RemovePodSandbox for \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\""
Mar 17 17:29:40.238631 containerd[1745]: time="2025-03-17T17:29:40.238580230Z" level=info msg="Forcibly stopping sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\""
Mar 17 17:29:40.239036 containerd[1745]: time="2025-03-17T17:29:40.239013391Z" level=info msg="TearDown network for sandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\" successfully"
Mar 17 17:29:40.253905 containerd[1745]: time="2025-03-17T17:29:40.253857614Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:29:40.254052 containerd[1745]: time="2025-03-17T17:29:40.253932294Z" level=info msg="RemovePodSandbox \"37853bde7335ac820c2f4a382f915a6ba5d99bce75b3b834e74de94b7a7297eb\" returns successfully"
Mar 17 17:29:57.891608 systemd[1]: run-containerd-runc-k8s.io-00c396aa578cae056c045c32c69b2e07a852f22985f12f0a9b7516f5454ff8ae-runc.30ec74.mount: Deactivated successfully.
Mar 17 17:29:58.236086 systemd[1]: Started sshd@7-10.200.20.36:22-10.200.16.10:37832.service - OpenSSH per-connection server daemon (10.200.16.10:37832).
Mar 17 17:29:58.653821 sshd[6389]: Accepted publickey for core from 10.200.16.10 port 37832 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:29:58.655910 sshd-session[6389]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:29:58.660474 systemd-logind[1707]: New session 10 of user core.
Mar 17 17:29:58.667017 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 17 17:29:59.031945 sshd[6391]: Connection closed by 10.200.16.10 port 37832
Mar 17 17:29:59.031411 sshd-session[6389]: pam_unix(sshd:session): session closed for user core
Mar 17 17:29:59.034326 systemd[1]: sshd@7-10.200.20.36:22-10.200.16.10:37832.service: Deactivated successfully.
Mar 17 17:29:59.037610 systemd[1]: session-10.scope: Deactivated successfully.
Mar 17 17:29:59.039347 systemd-logind[1707]: Session 10 logged out. Waiting for processes to exit.
Mar 17 17:29:59.040301 systemd-logind[1707]: Removed session 10.
Mar 17 17:30:04.117565 systemd[1]: Started sshd@8-10.200.20.36:22-10.200.16.10:53366.service - OpenSSH per-connection server daemon (10.200.16.10:53366).
Mar 17 17:30:04.579419 sshd[6409]: Accepted publickey for core from 10.200.16.10 port 53366 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:04.580753 sshd-session[6409]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:04.585302 systemd-logind[1707]: New session 11 of user core.
Mar 17 17:30:04.591019 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 17 17:30:04.972126 sshd[6411]: Connection closed by 10.200.16.10 port 53366
Mar 17 17:30:04.972688 sshd-session[6409]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:04.975979 systemd[1]: sshd@8-10.200.20.36:22-10.200.16.10:53366.service: Deactivated successfully.
Mar 17 17:30:04.977932 systemd[1]: session-11.scope: Deactivated successfully.
Mar 17 17:30:04.978984 systemd-logind[1707]: Session 11 logged out. Waiting for processes to exit.
Mar 17 17:30:04.980436 systemd-logind[1707]: Removed session 11.
Mar 17 17:30:10.059541 systemd[1]: Started sshd@9-10.200.20.36:22-10.200.16.10:55828.service - OpenSSH per-connection server daemon (10.200.16.10:55828).
Mar 17 17:30:10.512134 sshd[6445]: Accepted publickey for core from 10.200.16.10 port 55828 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:10.513455 sshd-session[6445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:10.517336 systemd-logind[1707]: New session 12 of user core.
Mar 17 17:30:10.524949 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 17 17:30:10.898965 sshd[6447]: Connection closed by 10.200.16.10 port 55828
Mar 17 17:30:10.899729 sshd-session[6445]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:10.902383 systemd[1]: sshd@9-10.200.20.36:22-10.200.16.10:55828.service: Deactivated successfully.
Mar 17 17:30:10.904441 systemd[1]: session-12.scope: Deactivated successfully.
Mar 17 17:30:10.905982 systemd-logind[1707]: Session 12 logged out. Waiting for processes to exit.
Mar 17 17:30:10.907022 systemd-logind[1707]: Removed session 12.
Mar 17 17:30:15.986876 systemd[1]: Started sshd@10-10.200.20.36:22-10.200.16.10:55840.service - OpenSSH per-connection server daemon (10.200.16.10:55840).
Mar 17 17:30:16.476415 sshd[6479]: Accepted publickey for core from 10.200.16.10 port 55840 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:16.477884 sshd-session[6479]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:16.482023 systemd-logind[1707]: New session 13 of user core.
Mar 17 17:30:16.493992 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 17 17:30:16.903530 sshd[6481]: Connection closed by 10.200.16.10 port 55840
Mar 17 17:30:16.904038 sshd-session[6479]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:16.906887 systemd[1]: sshd@10-10.200.20.36:22-10.200.16.10:55840.service: Deactivated successfully.
Mar 17 17:30:16.908839 systemd[1]: session-13.scope: Deactivated successfully.
Mar 17 17:30:16.910100 systemd-logind[1707]: Session 13 logged out. Waiting for processes to exit.
Mar 17 17:30:16.911519 systemd-logind[1707]: Removed session 13.
Mar 17 17:30:16.991145 systemd[1]: Started sshd@11-10.200.20.36:22-10.200.16.10:55842.service - OpenSSH per-connection server daemon (10.200.16.10:55842).
Mar 17 17:30:17.481006 sshd[6497]: Accepted publickey for core from 10.200.16.10 port 55842 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:17.482372 sshd-session[6497]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:17.486810 systemd-logind[1707]: New session 14 of user core.
Mar 17 17:30:17.498003 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 17 17:30:17.951267 sshd[6500]: Connection closed by 10.200.16.10 port 55842
Mar 17 17:30:17.951708 sshd-session[6497]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:17.955681 systemd[1]: sshd@11-10.200.20.36:22-10.200.16.10:55842.service: Deactivated successfully.
Mar 17 17:30:17.957616 systemd[1]: session-14.scope: Deactivated successfully.
Mar 17 17:30:17.958352 systemd-logind[1707]: Session 14 logged out. Waiting for processes to exit.
Mar 17 17:30:17.959385 systemd-logind[1707]: Removed session 14.
Mar 17 17:30:18.033994 systemd[1]: Started sshd@12-10.200.20.36:22-10.200.16.10:55850.service - OpenSSH per-connection server daemon (10.200.16.10:55850).
Mar 17 17:30:18.489450 sshd[6510]: Accepted publickey for core from 10.200.16.10 port 55850 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:18.491352 sshd-session[6510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:18.495938 systemd-logind[1707]: New session 15 of user core.
Mar 17 17:30:18.508086 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 17 17:30:18.884832 sshd[6524]: Connection closed by 10.200.16.10 port 55850
Mar 17 17:30:18.885186 sshd-session[6510]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:18.889199 systemd-logind[1707]: Session 15 logged out. Waiting for processes to exit.
Mar 17 17:30:18.889464 systemd[1]: sshd@12-10.200.20.36:22-10.200.16.10:55850.service: Deactivated successfully.
Mar 17 17:30:18.891610 systemd[1]: session-15.scope: Deactivated successfully.
Mar 17 17:30:18.896462 systemd-logind[1707]: Removed session 15.
Mar 17 17:30:23.973907 systemd[1]: Started sshd@13-10.200.20.36:22-10.200.16.10:45140.service - OpenSSH per-connection server daemon (10.200.16.10:45140).
Mar 17 17:30:24.470875 sshd[6540]: Accepted publickey for core from 10.200.16.10 port 45140 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:24.472343 sshd-session[6540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:24.479932 systemd-logind[1707]: New session 16 of user core.
Mar 17 17:30:24.483972 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 17 17:30:24.900343 sshd[6545]: Connection closed by 10.200.16.10 port 45140
Mar 17 17:30:24.901005 sshd-session[6540]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:24.904396 systemd-logind[1707]: Session 16 logged out. Waiting for processes to exit.
Mar 17 17:30:24.905003 systemd[1]: sshd@13-10.200.20.36:22-10.200.16.10:45140.service: Deactivated successfully.
Mar 17 17:30:24.907008 systemd[1]: session-16.scope: Deactivated successfully.
Mar 17 17:30:24.908327 systemd-logind[1707]: Removed session 16.
Mar 17 17:30:29.980074 systemd[1]: Started sshd@14-10.200.20.36:22-10.200.16.10:44466.service - OpenSSH per-connection server daemon (10.200.16.10:44466).
Mar 17 17:30:30.399630 sshd[6578]: Accepted publickey for core from 10.200.16.10 port 44466 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:30.401061 sshd-session[6578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:30.405695 systemd-logind[1707]: New session 17 of user core.
Mar 17 17:30:30.413000 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 17 17:30:30.769797 sshd[6582]: Connection closed by 10.200.16.10 port 44466
Mar 17 17:30:30.769629 sshd-session[6578]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:30.774918 systemd[1]: sshd@14-10.200.20.36:22-10.200.16.10:44466.service: Deactivated successfully.
Mar 17 17:30:30.778502 systemd[1]: session-17.scope: Deactivated successfully.
Mar 17 17:30:30.780232 systemd-logind[1707]: Session 17 logged out. Waiting for processes to exit.
Mar 17 17:30:30.781750 systemd-logind[1707]: Removed session 17.
Mar 17 17:30:30.870106 systemd[1]: Started sshd@15-10.200.20.36:22-10.200.16.10:44468.service - OpenSSH per-connection server daemon (10.200.16.10:44468).
Mar 17 17:30:31.370008 sshd[6593]: Accepted publickey for core from 10.200.16.10 port 44468 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:31.370565 sshd-session[6593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:31.376206 systemd-logind[1707]: New session 18 of user core.
Mar 17 17:30:31.383017 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 17 17:30:31.891338 sshd[6595]: Connection closed by 10.200.16.10 port 44468
Mar 17 17:30:31.892145 sshd-session[6593]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:31.896168 systemd[1]: sshd@15-10.200.20.36:22-10.200.16.10:44468.service: Deactivated successfully.
Mar 17 17:30:31.898100 systemd[1]: session-18.scope: Deactivated successfully.
Mar 17 17:30:31.899036 systemd-logind[1707]: Session 18 logged out. Waiting for processes to exit.
Mar 17 17:30:31.900530 systemd-logind[1707]: Removed session 18.
Mar 17 17:30:31.972387 systemd[1]: Started sshd@16-10.200.20.36:22-10.200.16.10:44476.service - OpenSSH per-connection server daemon (10.200.16.10:44476).
Mar 17 17:30:32.436587 sshd[6604]: Accepted publickey for core from 10.200.16.10 port 44476 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:32.438941 sshd-session[6604]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:32.445096 systemd-logind[1707]: New session 19 of user core.
Mar 17 17:30:32.453247 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 17 17:30:34.646383 sshd[6607]: Connection closed by 10.200.16.10 port 44476
Mar 17 17:30:34.646288 sshd-session[6604]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:34.651255 systemd[1]: sshd@16-10.200.20.36:22-10.200.16.10:44476.service: Deactivated successfully.
Mar 17 17:30:34.654573 systemd[1]: session-19.scope: Deactivated successfully.
Mar 17 17:30:34.657092 systemd-logind[1707]: Session 19 logged out. Waiting for processes to exit.
Mar 17 17:30:34.658383 systemd-logind[1707]: Removed session 19.
Mar 17 17:30:34.726460 systemd[1]: Started sshd@17-10.200.20.36:22-10.200.16.10:44486.service - OpenSSH per-connection server daemon (10.200.16.10:44486).
Mar 17 17:30:35.179341 sshd[6626]: Accepted publickey for core from 10.200.16.10 port 44486 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:35.180774 sshd-session[6626]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:35.184656 systemd-logind[1707]: New session 20 of user core.
Mar 17 17:30:35.196988 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 17 17:30:35.685658 sshd[6628]: Connection closed by 10.200.16.10 port 44486
Mar 17 17:30:35.684695 sshd-session[6626]: pam_unix(sshd:session): session closed for user core
Mar 17 17:30:35.687518 systemd[1]: sshd@17-10.200.20.36:22-10.200.16.10:44486.service: Deactivated successfully.
Mar 17 17:30:35.689672 systemd[1]: session-20.scope: Deactivated successfully.
Mar 17 17:30:35.691729 systemd-logind[1707]: Session 20 logged out. Waiting for processes to exit.
Mar 17 17:30:35.694156 systemd-logind[1707]: Removed session 20.
Mar 17 17:30:35.765273 systemd[1]: Started sshd@18-10.200.20.36:22-10.200.16.10:44502.service - OpenSSH per-connection server daemon (10.200.16.10:44502).
Mar 17 17:30:36.218013 sshd[6636]: Accepted publickey for core from 10.200.16.10 port 44502 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:30:36.219418 sshd-session[6636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:30:36.223949 systemd-logind[1707]: New session 21 of user core.
Mar 17 17:30:36.231077 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 17 17:30:36.603418 sshd[6638]: Connection closed by 10.200.16.10 port 44502 Mar 17 17:30:36.604018 sshd-session[6636]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:36.607869 systemd-logind[1707]: Session 21 logged out. Waiting for processes to exit. Mar 17 17:30:36.608655 systemd[1]: sshd@18-10.200.20.36:22-10.200.16.10:44502.service: Deactivated successfully. Mar 17 17:30:36.611721 systemd[1]: session-21.scope: Deactivated successfully. Mar 17 17:30:36.612680 systemd-logind[1707]: Removed session 21. Mar 17 17:30:41.684587 systemd[1]: Started sshd@19-10.200.20.36:22-10.200.16.10:41994.service - OpenSSH per-connection server daemon (10.200.16.10:41994). Mar 17 17:30:42.155242 sshd[6658]: Accepted publickey for core from 10.200.16.10 port 41994 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:30:42.157591 sshd-session[6658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:42.166057 systemd-logind[1707]: New session 22 of user core. Mar 17 17:30:42.171041 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 17 17:30:42.604553 sshd[6660]: Connection closed by 10.200.16.10 port 41994 Mar 17 17:30:42.603948 sshd-session[6658]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:42.607766 systemd-logind[1707]: Session 22 logged out. Waiting for processes to exit. Mar 17 17:30:42.608477 systemd[1]: sshd@19-10.200.20.36:22-10.200.16.10:41994.service: Deactivated successfully. Mar 17 17:30:42.611455 systemd[1]: session-22.scope: Deactivated successfully. Mar 17 17:30:42.614014 systemd-logind[1707]: Removed session 22. Mar 17 17:30:47.697688 systemd[1]: Started sshd@20-10.200.20.36:22-10.200.16.10:42004.service - OpenSSH per-connection server daemon (10.200.16.10:42004). 
Mar 17 17:30:48.195146 sshd[6692]: Accepted publickey for core from 10.200.16.10 port 42004 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:30:48.197064 sshd-session[6692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:48.201506 systemd-logind[1707]: New session 23 of user core. Mar 17 17:30:48.205983 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 17 17:30:48.622131 sshd[6694]: Connection closed by 10.200.16.10 port 42004 Mar 17 17:30:48.621660 sshd-session[6692]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:48.625077 systemd-logind[1707]: Session 23 logged out. Waiting for processes to exit. Mar 17 17:30:48.625702 systemd[1]: sshd@20-10.200.20.36:22-10.200.16.10:42004.service: Deactivated successfully. Mar 17 17:30:48.628132 systemd[1]: session-23.scope: Deactivated successfully. Mar 17 17:30:48.629216 systemd-logind[1707]: Removed session 23. Mar 17 17:30:53.710276 systemd[1]: Started sshd@21-10.200.20.36:22-10.200.16.10:49446.service - OpenSSH per-connection server daemon (10.200.16.10:49446). Mar 17 17:30:54.200023 sshd[6704]: Accepted publickey for core from 10.200.16.10 port 49446 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:30:54.201370 sshd-session[6704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:54.205368 systemd-logind[1707]: New session 24 of user core. Mar 17 17:30:54.210967 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 17 17:30:54.621314 sshd[6706]: Connection closed by 10.200.16.10 port 49446 Mar 17 17:30:54.622171 sshd-session[6704]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:54.625708 systemd[1]: sshd@21-10.200.20.36:22-10.200.16.10:49446.service: Deactivated successfully. Mar 17 17:30:54.627518 systemd[1]: session-24.scope: Deactivated successfully. Mar 17 17:30:54.628161 systemd-logind[1707]: Session 24 logged out. 
Waiting for processes to exit. Mar 17 17:30:54.629399 systemd-logind[1707]: Removed session 24. Mar 17 17:30:59.693120 systemd[1]: Started sshd@22-10.200.20.36:22-10.200.16.10:43506.service - OpenSSH per-connection server daemon (10.200.16.10:43506). Mar 17 17:31:00.112618 sshd[6741]: Accepted publickey for core from 10.200.16.10 port 43506 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:31:00.114180 sshd-session[6741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:31:00.119175 systemd-logind[1707]: New session 25 of user core. Mar 17 17:31:00.122990 systemd[1]: Started session-25.scope - Session 25 of User core. Mar 17 17:31:00.475934 sshd[6743]: Connection closed by 10.200.16.10 port 43506 Mar 17 17:31:00.476851 sshd-session[6741]: pam_unix(sshd:session): session closed for user core Mar 17 17:31:00.480091 systemd[1]: sshd@22-10.200.20.36:22-10.200.16.10:43506.service: Deactivated successfully. Mar 17 17:31:00.483032 systemd[1]: session-25.scope: Deactivated successfully. Mar 17 17:31:00.484999 systemd-logind[1707]: Session 25 logged out. Waiting for processes to exit. Mar 17 17:31:00.486104 systemd-logind[1707]: Removed session 25. Mar 17 17:31:05.574108 systemd[1]: Started sshd@23-10.200.20.36:22-10.200.16.10:43522.service - OpenSSH per-connection server daemon (10.200.16.10:43522). Mar 17 17:31:06.024844 sshd[6754]: Accepted publickey for core from 10.200.16.10 port 43522 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:31:06.026260 sshd-session[6754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:31:06.030437 systemd-logind[1707]: New session 26 of user core. Mar 17 17:31:06.034971 systemd[1]: Started session-26.scope - Session 26 of User core. 
Mar 17 17:31:06.410840 sshd[6756]: Connection closed by 10.200.16.10 port 43522 Mar 17 17:31:06.411386 sshd-session[6754]: pam_unix(sshd:session): session closed for user core Mar 17 17:31:06.414649 systemd[1]: sshd@23-10.200.20.36:22-10.200.16.10:43522.service: Deactivated successfully. Mar 17 17:31:06.416486 systemd[1]: session-26.scope: Deactivated successfully. Mar 17 17:31:06.417493 systemd-logind[1707]: Session 26 logged out. Waiting for processes to exit. Mar 17 17:31:06.418638 systemd-logind[1707]: Removed session 26. Mar 17 17:31:11.485342 systemd[1]: Started sshd@24-10.200.20.36:22-10.200.16.10:54714.service - OpenSSH per-connection server daemon (10.200.16.10:54714). Mar 17 17:31:11.899713 sshd[6788]: Accepted publickey for core from 10.200.16.10 port 54714 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:31:11.901294 sshd-session[6788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:31:11.905040 systemd-logind[1707]: New session 27 of user core. Mar 17 17:31:11.916989 systemd[1]: Started session-27.scope - Session 27 of User core. Mar 17 17:31:12.263514 sshd[6790]: Connection closed by 10.200.16.10 port 54714 Mar 17 17:31:12.262995 sshd-session[6788]: pam_unix(sshd:session): session closed for user core Mar 17 17:31:12.265837 systemd-logind[1707]: Session 27 logged out. Waiting for processes to exit. Mar 17 17:31:12.267721 systemd[1]: sshd@24-10.200.20.36:22-10.200.16.10:54714.service: Deactivated successfully. Mar 17 17:31:12.270119 systemd[1]: session-27.scope: Deactivated successfully. Mar 17 17:31:12.271001 systemd-logind[1707]: Removed session 27.