Mar 17 17:27:42.319572 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 17 17:27:42.319594 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Mon Mar 17 16:05:23 -00 2025
Mar 17 17:27:42.319602 kernel: KASLR enabled
Mar 17 17:27:42.319608 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Mar 17 17:27:42.319615 kernel: printk: bootconsole [pl11] enabled
Mar 17 17:27:42.319621 kernel: efi: EFI v2.7 by EDK II
Mar 17 17:27:42.319628 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f214018 RNG=0x3fd5f998 MEMRESERVE=0x3e423d98
Mar 17 17:27:42.319635 kernel: random: crng init done
Mar 17 17:27:42.319641 kernel: secureboot: Secure boot disabled
Mar 17 17:27:42.319647 kernel: ACPI: Early table checksum verification disabled
Mar 17 17:27:42.319653 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Mar 17 17:27:42.319659 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319665 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319673 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Mar 17 17:27:42.319680 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319687 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319693 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319701 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319707 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319714 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319720 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Mar 17 17:27:42.319727 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 17:27:42.319733 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Mar 17 17:27:42.319739 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
Mar 17 17:27:42.319746 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff]
Mar 17 17:27:42.319752 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff]
Mar 17 17:27:42.319759 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff]
Mar 17 17:27:42.319765 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff]
Mar 17 17:27:42.319773 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff]
Mar 17 17:27:42.319780 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff]
Mar 17 17:27:42.319786 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff]
Mar 17 17:27:42.319792 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff]
Mar 17 17:27:42.319799 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff]
Mar 17 17:27:42.319805 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff]
Mar 17 17:27:42.319812 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff]
Mar 17 17:27:42.319818 kernel: NUMA: NODE_DATA [mem 0x1bf7ef800-0x1bf7f4fff]
Mar 17 17:27:42.319825 kernel: Zone ranges:
Mar 17 17:27:42.319831 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Mar 17 17:27:42.319837 kernel: DMA32 empty
Mar 17 17:27:42.319844 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Mar 17 17:27:42.319855 kernel: Movable zone start for each node
Mar 17 17:27:42.319863 kernel: Early memory node ranges
Mar 17 17:27:42.319870 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Mar 17 17:27:42.319877 kernel: node 0: [mem 0x0000000000824000-0x000000003e54ffff]
Mar 17 17:27:42.319884 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Mar 17 17:27:42.319892 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Mar 17 17:27:42.319899 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Mar 17 17:27:42.319906 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Mar 17 17:27:42.319912 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Mar 17 17:27:42.319919 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Mar 17 17:27:42.319926 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Mar 17 17:27:42.319933 kernel: psci: probing for conduit method from ACPI.
Mar 17 17:27:42.319940 kernel: psci: PSCIv1.1 detected in firmware.
Mar 17 17:27:42.319947 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 17 17:27:42.319953 kernel: psci: MIGRATE_INFO_TYPE not supported.
Mar 17 17:27:42.319961 kernel: psci: SMC Calling Convention v1.4
Mar 17 17:27:42.319967 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 17 17:27:42.319976 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 17 17:27:42.319983 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976
Mar 17 17:27:42.319989 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096
Mar 17 17:27:42.319996 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 17 17:27:42.320003 kernel: Detected PIPT I-cache on CPU0
Mar 17 17:27:42.320010 kernel: CPU features: detected: GIC system register CPU interface
Mar 17 17:27:42.320017 kernel: CPU features: detected: Hardware dirty bit management
Mar 17 17:27:42.320024 kernel: CPU features: detected: Spectre-BHB
Mar 17 17:27:42.320031 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 17 17:27:42.320038 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 17 17:27:42.320045 kernel: CPU features: detected: ARM erratum 1418040
Mar 17 17:27:42.320054 kernel: CPU features: detected: ARM erratum 1542419 (kernel portion)
Mar 17 17:27:42.320061 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 17 17:27:42.320068 kernel: alternatives: applying boot alternatives
Mar 17 17:27:42.320077 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405
Mar 17 17:27:42.320084 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 17 17:27:42.320091 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 17 17:27:42.320098 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 17 17:27:42.320105 kernel: Fallback order for Node 0: 0
Mar 17 17:27:42.320113 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1032156
Mar 17 17:27:42.320120 kernel: Policy zone: Normal
Mar 17 17:27:42.320126 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 17 17:27:42.320134 kernel: software IO TLB: area num 2.
Mar 17 17:27:42.320141 kernel: software IO TLB: mapped [mem 0x0000000036620000-0x000000003a620000] (64MB)
Mar 17 17:27:42.320149 kernel: Memory: 3982376K/4194160K available (10240K kernel code, 2186K rwdata, 8100K rodata, 39744K init, 897K bss, 211784K reserved, 0K cma-reserved)
Mar 17 17:27:42.320156 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 17 17:27:42.320162 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 17 17:27:42.320170 kernel: rcu: RCU event tracing is enabled.
Mar 17 17:27:42.320177 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 17 17:27:42.320184 kernel: Trampoline variant of Tasks RCU enabled.
Mar 17 17:27:42.320191 kernel: Tracing variant of Tasks RCU enabled.
Mar 17 17:27:42.320198 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 17 17:27:42.320205 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 17 17:27:42.320213 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 17 17:27:42.320219 kernel: GICv3: 960 SPIs implemented
Mar 17 17:27:42.320226 kernel: GICv3: 0 Extended SPIs implemented
Mar 17 17:27:42.320242 kernel: Root IRQ handler: gic_handle_irq
Mar 17 17:27:42.320251 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 17 17:27:42.320258 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Mar 17 17:27:42.320265 kernel: ITS: No ITS available, not enabling LPIs
Mar 17 17:27:42.320277 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 17 17:27:42.320284 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 17 17:27:42.320291 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 17 17:27:42.320298 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 17 17:27:42.320305 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 17 17:27:42.320313 kernel: Console: colour dummy device 80x25
Mar 17 17:27:42.320321 kernel: printk: console [tty1] enabled
Mar 17 17:27:42.320328 kernel: ACPI: Core revision 20230628
Mar 17 17:27:42.320335 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 17 17:27:42.320342 kernel: pid_max: default: 32768 minimum: 301
Mar 17 17:27:42.320349 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 17 17:27:42.320356 kernel: landlock: Up and running.
Mar 17 17:27:42.320363 kernel: SELinux: Initializing.
Mar 17 17:27:42.320370 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 17 17:27:42.320379 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 17 17:27:42.320386 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 17 17:27:42.320394 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 17 17:27:42.320401 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3a8030, hints 0xe, misc 0x31e1
Mar 17 17:27:42.320408 kernel: Hyper-V: Host Build 10.0.22477.1619-1-0
Mar 17 17:27:42.320415 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 17 17:27:42.320422 kernel: rcu: Hierarchical SRCU implementation.
Mar 17 17:27:42.320437 kernel: rcu: Max phase no-delay instances is 400.
Mar 17 17:27:42.320444 kernel: Remapping and enabling EFI services.
Mar 17 17:27:42.320452 kernel: smp: Bringing up secondary CPUs ...
Mar 17 17:27:42.320459 kernel: Detected PIPT I-cache on CPU1
Mar 17 17:27:42.320467 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Mar 17 17:27:42.320475 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 17 17:27:42.320483 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 17 17:27:42.320490 kernel: smp: Brought up 1 node, 2 CPUs
Mar 17 17:27:42.320498 kernel: SMP: Total of 2 processors activated.
Mar 17 17:27:42.320505 kernel: CPU features: detected: 32-bit EL0 Support
Mar 17 17:27:42.320514 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Mar 17 17:27:42.320522 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 17 17:27:42.320529 kernel: CPU features: detected: CRC32 instructions
Mar 17 17:27:42.320536 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 17 17:27:42.320544 kernel: CPU features: detected: LSE atomic instructions
Mar 17 17:27:42.320551 kernel: CPU features: detected: Privileged Access Never
Mar 17 17:27:42.320559 kernel: CPU: All CPU(s) started at EL1
Mar 17 17:27:42.320566 kernel: alternatives: applying system-wide alternatives
Mar 17 17:27:42.320573 kernel: devtmpfs: initialized
Mar 17 17:27:42.320582 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 17 17:27:42.320590 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 17 17:27:42.320597 kernel: pinctrl core: initialized pinctrl subsystem
Mar 17 17:27:42.320605 kernel: SMBIOS 3.1.0 present.
Mar 17 17:27:42.320612 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Mar 17 17:27:42.320620 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 17 17:27:42.320627 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 17 17:27:42.320635 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 17 17:27:42.320644 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 17 17:27:42.320651 kernel: audit: initializing netlink subsys (disabled)
Mar 17 17:27:42.320659 kernel: audit: type=2000 audit(0.047:1): state=initialized audit_enabled=0 res=1
Mar 17 17:27:42.320666 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 17 17:27:42.320674 kernel: cpuidle: using governor menu
Mar 17 17:27:42.320681 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 17 17:27:42.320688 kernel: ASID allocator initialised with 32768 entries
Mar 17 17:27:42.320696 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 17 17:27:42.320703 kernel: Serial: AMBA PL011 UART driver
Mar 17 17:27:42.320712 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 17 17:27:42.320720 kernel: Modules: 0 pages in range for non-PLT usage
Mar 17 17:27:42.320727 kernel: Modules: 508944 pages in range for PLT usage
Mar 17 17:27:42.320734 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 17 17:27:42.320742 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 17 17:27:42.320750 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 17 17:27:42.320757 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 17 17:27:42.320764 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 17 17:27:42.320772 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 17 17:27:42.320780 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 17 17:27:42.320788 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 17 17:27:42.320795 kernel: ACPI: Added _OSI(Module Device)
Mar 17 17:27:42.320803 kernel: ACPI: Added _OSI(Processor Device)
Mar 17 17:27:42.320810 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 17 17:27:42.320817 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 17 17:27:42.320825 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 17 17:27:42.320832 kernel: ACPI: Interpreter enabled
Mar 17 17:27:42.320839 kernel: ACPI: Using GIC for interrupt routing
Mar 17 17:27:42.320847 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Mar 17 17:27:42.320856 kernel: printk: console [ttyAMA0] enabled
Mar 17 17:27:42.320864 kernel: printk: bootconsole [pl11] disabled
Mar 17 17:27:42.320872 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Mar 17 17:27:42.320879 kernel: iommu: Default domain type: Translated
Mar 17 17:27:42.320887 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 17 17:27:42.320894 kernel: efivars: Registered efivars operations
Mar 17 17:27:42.320902 kernel: vgaarb: loaded
Mar 17 17:27:42.320909 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 17 17:27:42.320916 kernel: VFS: Disk quotas dquot_6.6.0
Mar 17 17:27:42.320926 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 17 17:27:42.320934 kernel: pnp: PnP ACPI init
Mar 17 17:27:42.320941 kernel: pnp: PnP ACPI: found 0 devices
Mar 17 17:27:42.320948 kernel: NET: Registered PF_INET protocol family
Mar 17 17:27:42.320956 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 17 17:27:42.320963 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 17 17:27:42.320971 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 17 17:27:42.320979 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 17 17:27:42.320987 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 17 17:27:42.320995 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 17 17:27:42.321002 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 17 17:27:42.321010 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 17 17:27:42.321018 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 17 17:27:42.321025 kernel: PCI: CLS 0 bytes, default 64
Mar 17 17:27:42.321032 kernel: kvm [1]: HYP mode not available
Mar 17 17:27:42.321040 kernel: Initialise system trusted keyrings
Mar 17 17:27:42.321047 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 17 17:27:42.321057 kernel: Key type asymmetric registered
Mar 17 17:27:42.321064 kernel: Asymmetric key parser 'x509' registered
Mar 17 17:27:42.321072 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 17 17:27:42.321079 kernel: io scheduler mq-deadline registered
Mar 17 17:27:42.321086 kernel: io scheduler kyber registered
Mar 17 17:27:42.321094 kernel: io scheduler bfq registered
Mar 17 17:27:42.321102 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 17 17:27:42.321109 kernel: thunder_xcv, ver 1.0
Mar 17 17:27:42.321116 kernel: thunder_bgx, ver 1.0
Mar 17 17:27:42.321124 kernel: nicpf, ver 1.0
Mar 17 17:27:42.321133 kernel: nicvf, ver 1.0
Mar 17 17:27:42.321264 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 17 17:27:42.321337 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-17T17:27:41 UTC (1742232461)
Mar 17 17:27:42.321348 kernel: efifb: probing for efifb
Mar 17 17:27:42.321355 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 17 17:27:42.321363 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 17 17:27:42.321370 kernel: efifb: scrolling: redraw
Mar 17 17:27:42.321380 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 17 17:27:42.321388 kernel: Console: switching to colour frame buffer device 128x48
Mar 17 17:27:42.321395 kernel: fb0: EFI VGA frame buffer device
Mar 17 17:27:42.321402 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Mar 17 17:27:42.321410 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 17 17:27:42.321418 kernel: No ACPI PMU IRQ for CPU0
Mar 17 17:27:42.321426 kernel: No ACPI PMU IRQ for CPU1
Mar 17 17:27:42.321433 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 1 counters available
Mar 17 17:27:42.321440 kernel: watchdog: Delayed init of the lockup detector failed: -19
Mar 17 17:27:42.321449 kernel: watchdog: Hard watchdog permanently disabled
Mar 17 17:27:42.321457 kernel: NET: Registered PF_INET6 protocol family
Mar 17 17:27:42.321464 kernel: Segment Routing with IPv6
Mar 17 17:27:42.321472 kernel: In-situ OAM (IOAM) with IPv6
Mar 17 17:27:42.321479 kernel: NET: Registered PF_PACKET protocol family
Mar 17 17:27:42.321487 kernel: Key type dns_resolver registered
Mar 17 17:27:42.321494 kernel: registered taskstats version 1
Mar 17 17:27:42.321502 kernel: Loading compiled-in X.509 certificates
Mar 17 17:27:42.321509 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 74c9b4f5dfad711856d7363c976664fc02c1e24c'
Mar 17 17:27:42.321517 kernel: Key type .fscrypt registered
Mar 17 17:27:42.321526 kernel: Key type fscrypt-provisioning registered
Mar 17 17:27:42.321533 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 17 17:27:42.321541 kernel: ima: Allocated hash algorithm: sha1
Mar 17 17:27:42.321548 kernel: ima: No architecture policies found
Mar 17 17:27:42.321556 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 17 17:27:42.321564 kernel: clk: Disabling unused clocks
Mar 17 17:27:42.321571 kernel: Freeing unused kernel memory: 39744K
Mar 17 17:27:42.321579 kernel: Run /init as init process
Mar 17 17:27:42.321587 kernel: with arguments:
Mar 17 17:27:42.321595 kernel: /init
Mar 17 17:27:42.321602 kernel: with environment:
Mar 17 17:27:42.321610 kernel: HOME=/
Mar 17 17:27:42.321618 kernel: TERM=linux
Mar 17 17:27:42.321625 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 17 17:27:42.321634 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 17 17:27:42.321644 systemd[1]: Detected virtualization microsoft.
Mar 17 17:27:42.321653 systemd[1]: Detected architecture arm64.
Mar 17 17:27:42.321661 systemd[1]: Running in initrd.
Mar 17 17:27:42.321669 systemd[1]: No hostname configured, using default hostname.
Mar 17 17:27:42.321677 systemd[1]: Hostname set to .
Mar 17 17:27:42.321686 systemd[1]: Initializing machine ID from random generator.
Mar 17 17:27:42.321694 systemd[1]: Queued start job for default target initrd.target.
Mar 17 17:27:42.321702 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:27:42.321710 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:27:42.321720 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 17 17:27:42.321728 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 17:27:42.321736 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 17 17:27:42.321745 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 17 17:27:42.321754 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 17 17:27:42.321762 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 17 17:27:42.321770 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:27:42.321780 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:27:42.321788 systemd[1]: Reached target paths.target - Path Units.
Mar 17 17:27:42.321796 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 17:27:42.321804 systemd[1]: Reached target swap.target - Swaps.
Mar 17 17:27:42.321812 systemd[1]: Reached target timers.target - Timer Units.
Mar 17 17:27:42.321820 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:27:42.321828 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:27:42.321837 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 17 17:27:42.321846 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Mar 17 17:27:42.321854 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:27:42.321862 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:27:42.321870 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:27:42.321878 systemd[1]: Reached target sockets.target - Socket Units.
Mar 17 17:27:42.321887 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 17 17:27:42.321895 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 17 17:27:42.321903 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 17 17:27:42.321911 systemd[1]: Starting systemd-fsck-usr.service...
Mar 17 17:27:42.321920 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 17 17:27:42.321928 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 17 17:27:42.321949 systemd-journald[217]: Collecting audit messages is disabled.
Mar 17 17:27:42.321969 systemd-journald[217]: Journal started
Mar 17 17:27:42.321990 systemd-journald[217]: Runtime Journal (/run/log/journal/fbd10789bbf84b96aee41a602ead7456) is 8.0M, max 78.5M, 70.5M free.
Mar 17 17:27:42.333666 systemd-modules-load[218]: Inserted module 'overlay'
Mar 17 17:27:42.339239 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:27:42.367245 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 17 17:27:42.367305 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 17 17:27:42.376151 systemd-modules-load[218]: Inserted module 'br_netfilter'
Mar 17 17:27:42.381574 kernel: Bridge firewalling registered
Mar 17 17:27:42.376606 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 17 17:27:42.387802 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:27:42.402249 systemd[1]: Finished systemd-fsck-usr.service.
Mar 17 17:27:42.411216 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:27:42.421670 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:27:42.441504 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:27:42.449553 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 17 17:27:42.474395 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 17 17:27:42.492386 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 17 17:27:42.502259 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:27:42.516256 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:27:42.538208 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 17 17:27:42.545635 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:27:42.572449 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 17 17:27:42.581422 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 17 17:27:42.602253 dracut-cmdline[252]: dracut-dracut-053
Mar 17 17:27:42.602253 dracut-cmdline[252]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405
Mar 17 17:27:42.648471 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 17 17:27:42.661467 systemd-resolved[255]: Positive Trust Anchors:
Mar 17 17:27:42.661478 systemd-resolved[255]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 17:27:42.661508 systemd-resolved[255]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 17:27:42.665300 systemd-resolved[255]: Defaulting to hostname 'linux'.
Mar 17 17:27:42.669529 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 17:27:42.688866 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:27:42.734576 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:27:42.795281 kernel: SCSI subsystem initialized
Mar 17 17:27:42.803254 kernel: Loading iSCSI transport class v2.0-870.
Mar 17 17:27:42.815932 kernel: iscsi: registered transport (tcp)
Mar 17 17:27:42.833015 kernel: iscsi: registered transport (qla4xxx)
Mar 17 17:27:42.833069 kernel: QLogic iSCSI HBA Driver
Mar 17 17:27:42.871034 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:27:42.887357 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 17 17:27:42.922031 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 17 17:27:42.922099 kernel: device-mapper: uevent: version 1.0.3
Mar 17 17:27:42.928962 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 17 17:27:42.976261 kernel: raid6: neonx8 gen() 15767 MB/s
Mar 17 17:27:42.996247 kernel: raid6: neonx4 gen() 15665 MB/s
Mar 17 17:27:43.016244 kernel: raid6: neonx2 gen() 13281 MB/s
Mar 17 17:27:43.037245 kernel: raid6: neonx1 gen() 10486 MB/s
Mar 17 17:27:43.057244 kernel: raid6: int64x8 gen() 6974 MB/s
Mar 17 17:27:43.077244 kernel: raid6: int64x4 gen() 7344 MB/s
Mar 17 17:27:43.098246 kernel: raid6: int64x2 gen() 6131 MB/s
Mar 17 17:27:43.121708 kernel: raid6: int64x1 gen() 5061 MB/s
Mar 17 17:27:43.121737 kernel: raid6: using algorithm neonx8 gen() 15767 MB/s
Mar 17 17:27:43.146634 kernel: raid6: .... xor() 11941 MB/s, rmw enabled
Mar 17 17:27:43.146650 kernel: raid6: using neon recovery algorithm
Mar 17 17:27:43.157956 kernel: xor: measuring software checksum speed
Mar 17 17:27:43.157971 kernel: 8regs : 19735 MB/sec
Mar 17 17:27:43.165081 kernel: 32regs : 18769 MB/sec
Mar 17 17:27:43.165093 kernel: arm64_neon : 27096 MB/sec
Mar 17 17:27:43.169337 kernel: xor: using function: arm64_neon (27096 MB/sec)
Mar 17 17:27:43.219255 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 17 17:27:43.228620 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:27:43.251391 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:27:43.273843 systemd-udevd[440]: Using default interface naming scheme 'v255'.
Mar 17 17:27:43.279584 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:27:43.310500 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 17 17:27:43.327569 dracut-pre-trigger[452]: rd.md=0: removing MD RAID activation
Mar 17 17:27:43.355289 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:27:43.369444 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:27:43.410198 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:27:43.429441 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 17 17:27:43.456186 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:27:43.470201 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:27:43.485863 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:27:43.499699 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:27:43.516422 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 17 17:27:43.533322 kernel: hv_vmbus: Vmbus version:5.3
Mar 17 17:27:43.544012 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:27:43.573944 kernel: hv_vmbus: registering driver hyperv_keyboard
Mar 17 17:27:43.573984 kernel: hv_vmbus: registering driver hid_hyperv
Mar 17 17:27:43.573996 kernel: hv_vmbus: registering driver hv_netvsc
Mar 17 17:27:43.574006 kernel: hv_vmbus: registering driver hv_storvsc
Mar 17 17:27:43.574016 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 17 17:27:43.573785 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:27:43.607277 kernel: scsi host1: storvsc_host_t
Mar 17 17:27:43.607459 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Mar 17 17:27:43.607471 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0
Mar 17 17:27:43.607481 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1
Mar 17 17:27:43.573927 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:27:43.660454 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Mar 17 17:27:43.660616 kernel: scsi host0: storvsc_host_t
Mar 17 17:27:43.660708 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Mar 17 17:27:43.639512 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:27:43.653519 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:27:43.687419 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 0
Mar 17 17:27:43.653772 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:27:43.708423 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: VF slot 1 added
Mar 17 17:27:43.660641 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:27:43.682579 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:27:43.723263 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:27:43.753282 kernel: hv_vmbus: registering driver hv_pci
Mar 17 17:27:43.753322 kernel: PTP clock support registered
Mar 17 17:27:43.753433 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:27:43.777458 kernel: hv_utils: Registering HyperV Utility Driver
Mar 17 17:27:43.777481 kernel: hv_vmbus: registering driver hv_utils
Mar 17 17:27:43.795948 kernel: hv_utils: Heartbeat IC version 3.0
Mar 17 17:27:43.795989 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Mar 17 17:27:43.920523 kernel: hv_utils: Shutdown IC version 3.2
Mar 17 17:27:43.920540 kernel: hv_utils: TimeSync IC version 4.0
Mar 17 17:27:43.920550 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 17 17:27:43.920560 kernel: hv_pci 8cfe3a2d-f5b5-43e2-965d-57ce49de7108: PCI VMBus probing: Using version 0x10004
Mar 17 17:27:44.016304 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Mar 17 17:27:44.016446 kernel: hv_pci 8cfe3a2d-f5b5-43e2-965d-57ce49de7108: PCI host bridge to bus f5b5:00
Mar 17 17:27:44.016539 kernel: pci_bus f5b5:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Mar 17 17:27:44.016636 kernel: pci_bus f5b5:00: No busn resource found for root bus, will use [bus 00-ff]
Mar 17 17:27:44.016712 kernel: pci f5b5:00:02.0: [15b3:1018] type 00 class 0x020000
Mar 17 17:27:44.016806 kernel: pci f5b5:00:02.0: reg 0x10: [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 17 17:27:44.016887 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Mar 17 17:27:44.027936 kernel: pci f5b5:00:02.0: enabling Extended Tags
Mar 17 17:27:44.028072 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 17 17:27:44.028171 kernel: pci f5b5:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at f5b5:00:02.0 (capable of 126.016 Gb/s with 8.0 GT/s PCIe x16 link)
Mar 17 17:27:44.028779 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 17 17:27:44.028879 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Mar 17 17:27:44.028961 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Mar 17 17:27:44.029044 kernel: pci_bus f5b5:00: busn_res: [bus 00-ff] end is updated to 00
Mar 17 17:27:44.029127 kernel: pci f5b5:00:02.0: BAR 0: assigned [mem 0xfc0000000-0xfc00fffff 64bit pref]
Mar 17 17:27:44.029216 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:27:44.029242 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 17 17:27:43.904330 systemd-resolved[255]: Clock change detected. Flushing caches.
Mar 17 17:27:43.950588 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:27:44.082085 kernel: mlx5_core f5b5:00:02.0: enabling device (0000 -> 0002)
Mar 17 17:27:44.300954 kernel: mlx5_core f5b5:00:02.0: firmware version: 16.30.1284
Mar 17 17:27:44.301085 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: VF registering: eth1
Mar 17 17:27:44.301177 kernel: mlx5_core f5b5:00:02.0 eth1: joined to eth0
Mar 17 17:27:44.301297 kernel: mlx5_core f5b5:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Mar 17 17:27:44.308246 kernel: mlx5_core f5b5:00:02.0 enP62901s1: renamed from eth1
Mar 17 17:27:44.731522 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Mar 17 17:27:45.083967 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Mar 17 17:27:45.099536 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (497)
Mar 17 17:27:45.109572 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 17 17:27:45.133384 kernel: BTRFS: device fsid c0c482e3-6885-4a4e-b31c-6bc8f8c403e7 devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (486)
Mar 17 17:27:45.144676 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Mar 17 17:27:45.152165 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Mar 17 17:27:45.181396 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 17 17:27:45.207251 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:27:45.215247 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:27:46.225405 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 17:27:46.226710 disk-uuid[602]: The operation has completed successfully.
Mar 17 17:27:46.284809 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 17 17:27:46.284905 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 17 17:27:46.308373 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 17 17:27:46.321627 sh[688]: Success
Mar 17 17:27:46.341263 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 17 17:27:46.572075 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 17 17:27:46.578627 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 17 17:27:46.601366 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 17 17:27:46.630908 kernel: BTRFS info (device dm-0): first mount of filesystem c0c482e3-6885-4a4e-b31c-6bc8f8c403e7
Mar 17 17:27:46.630950 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:27:46.638033 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 17 17:27:46.643198 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 17 17:27:46.647747 kernel: BTRFS info (device dm-0): using free space tree
Mar 17 17:27:47.015882 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 17 17:27:47.021876 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 17 17:27:47.043488 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 17 17:27:47.050834 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 17 17:27:47.094822 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:27:47.094844 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:27:47.094854 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:27:47.111324 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:27:47.125783 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 17 17:27:47.130515 kernel: BTRFS info (device sda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:27:47.137955 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 17 17:27:47.152499 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 17 17:27:47.210706 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:27:47.229358 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:27:47.255493 systemd-networkd[872]: lo: Link UP
Mar 17 17:27:47.255504 systemd-networkd[872]: lo: Gained carrier
Mar 17 17:27:47.257439 systemd-networkd[872]: Enumeration completed
Mar 17 17:27:47.259084 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:27:47.265624 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:27:47.265628 systemd-networkd[872]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:27:47.266345 systemd[1]: Reached target network.target - Network.
Mar 17 17:27:47.358259 kernel: mlx5_core f5b5:00:02.0 enP62901s1: Link up
Mar 17 17:27:47.397274 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: Data path switched to VF: enP62901s1
Mar 17 17:27:47.396889 systemd-networkd[872]: enP62901s1: Link UP
Mar 17 17:27:47.396976 systemd-networkd[872]: eth0: Link UP
Mar 17 17:27:47.397135 systemd-networkd[872]: eth0: Gained carrier
Mar 17 17:27:47.397144 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:27:47.409716 systemd-networkd[872]: enP62901s1: Gained carrier
Mar 17 17:27:47.431274 systemd-networkd[872]: eth0: DHCPv4 address 10.200.20.22/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 17 17:27:48.099605 ignition[794]: Ignition 2.20.0
Mar 17 17:27:48.099617 ignition[794]: Stage: fetch-offline
Mar 17 17:27:48.103863 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:27:48.099649 ignition[794]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:27:48.099658 ignition[794]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:27:48.099748 ignition[794]: parsed url from cmdline: ""
Mar 17 17:27:48.099751 ignition[794]: no config URL provided
Mar 17 17:27:48.099755 ignition[794]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 17:27:48.134490 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 17 17:27:48.099763 ignition[794]: no config at "/usr/lib/ignition/user.ign"
Mar 17 17:27:48.099768 ignition[794]: failed to fetch config: resource requires networking
Mar 17 17:27:48.099932 ignition[794]: Ignition finished successfully
Mar 17 17:27:48.160212 ignition[882]: Ignition 2.20.0
Mar 17 17:27:48.160219 ignition[882]: Stage: fetch
Mar 17 17:27:48.160424 ignition[882]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:27:48.160434 ignition[882]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:27:48.160528 ignition[882]: parsed url from cmdline: ""
Mar 17 17:27:48.160531 ignition[882]: no config URL provided
Mar 17 17:27:48.160535 ignition[882]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 17:27:48.160546 ignition[882]: no config at "/usr/lib/ignition/user.ign"
Mar 17 17:27:48.160573 ignition[882]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Mar 17 17:27:48.258056 ignition[882]: GET result: OK
Mar 17 17:27:48.258125 ignition[882]: config has been read from IMDS userdata
Mar 17 17:27:48.258172 ignition[882]: parsing config with SHA512: 8f975b6895a36971b4ecc3dd0851a631055163f6a1cf9e5648d9323ef95dcfd1da82756cc152b28fb89291371b4629b4519d8ccd19d2c44f7fbcb766f9e02644
Mar 17 17:27:48.262744 unknown[882]: fetched base config from "system"
Mar 17 17:27:48.263129 ignition[882]: fetch: fetch complete
Mar 17 17:27:48.262751 unknown[882]: fetched base config from "system"
Mar 17 17:27:48.263133 ignition[882]: fetch: fetch passed
Mar 17 17:27:48.262756 unknown[882]: fetched user config from "azure"
Mar 17 17:27:48.263174 ignition[882]: Ignition finished successfully
Mar 17 17:27:48.268040 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 17 17:27:48.297084 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 17 17:27:48.317412 ignition[888]: Ignition 2.20.0
Mar 17 17:27:48.317424 ignition[888]: Stage: kargs
Mar 17 17:27:48.321844 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 17 17:27:48.317585 ignition[888]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:27:48.317594 ignition[888]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:27:48.318470 ignition[888]: kargs: kargs passed
Mar 17 17:27:48.349444 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 17 17:27:48.318515 ignition[888]: Ignition finished successfully
Mar 17 17:27:48.371470 ignition[894]: Ignition 2.20.0
Mar 17 17:27:48.371480 ignition[894]: Stage: disks
Mar 17 17:27:48.378585 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 17 17:27:48.371642 ignition[894]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:27:48.384533 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 17 17:27:48.371650 ignition[894]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:27:48.395940 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 17 17:27:48.372488 ignition[894]: disks: disks passed
Mar 17 17:27:48.407809 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:27:48.372526 ignition[894]: Ignition finished successfully
Mar 17 17:27:48.419588 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 17:27:48.431289 systemd[1]: Reached target basic.target - Basic System.
Mar 17 17:27:48.456470 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 17 17:27:48.526485 systemd-fsck[902]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Mar 17 17:27:48.532260 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 17 17:27:48.551434 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 17 17:27:48.611254 kernel: EXT4-fs (sda9): mounted filesystem 6b579bf2-7716-4d59-98eb-b92ea668693e r/w with ordered data mode. Quota mode: none.
Mar 17 17:27:48.611486 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 17 17:27:48.616671 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:27:48.669302 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:27:48.680199 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 17 17:27:48.689400 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 17 17:27:48.696771 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 17 17:27:48.736366 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (913)
Mar 17 17:27:48.696805 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:27:48.761054 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:27:48.761081 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:27:48.761092 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:27:48.724897 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 17 17:27:48.771019 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:27:48.771472 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 17 17:27:48.785510 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:27:49.035396 systemd-networkd[872]: eth0: Gained IPv6LL
Mar 17 17:27:49.227386 systemd-networkd[872]: enP62901s1: Gained IPv6LL
Mar 17 17:27:49.396872 coreos-metadata[915]: Mar 17 17:27:49.396 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 17 17:27:49.406997 coreos-metadata[915]: Mar 17 17:27:49.406 INFO Fetch successful
Mar 17 17:27:49.412518 coreos-metadata[915]: Mar 17 17:27:49.412 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Mar 17 17:27:49.434379 coreos-metadata[915]: Mar 17 17:27:49.434 INFO Fetch successful
Mar 17 17:27:49.451058 coreos-metadata[915]: Mar 17 17:27:49.451 INFO wrote hostname ci-4152.2.2-a-6c46d54d7c to /sysroot/etc/hostname
Mar 17 17:27:49.460627 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 17 17:27:49.829184 initrd-setup-root[943]: cut: /sysroot/etc/passwd: No such file or directory
Mar 17 17:27:50.124907 initrd-setup-root[950]: cut: /sysroot/etc/group: No such file or directory
Mar 17 17:27:50.150451 initrd-setup-root[957]: cut: /sysroot/etc/shadow: No such file or directory
Mar 17 17:27:50.170886 initrd-setup-root[964]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 17 17:27:50.998153 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 17 17:27:51.015449 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 17 17:27:51.027406 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 17 17:27:51.047423 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 17 17:27:51.053251 kernel: BTRFS info (device sda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:27:51.067762 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 17 17:27:51.085400 ignition[1034]: INFO : Ignition 2.20.0
Mar 17 17:27:51.085400 ignition[1034]: INFO : Stage: mount
Mar 17 17:27:51.095125 ignition[1034]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:27:51.095125 ignition[1034]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:27:51.095125 ignition[1034]: INFO : mount: mount passed
Mar 17 17:27:51.095125 ignition[1034]: INFO : Ignition finished successfully
Mar 17 17:27:51.095150 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 17 17:27:51.124437 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 17 17:27:51.143360 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:27:51.165582 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1044)
Mar 17 17:27:51.183506 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f
Mar 17 17:27:51.183560 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:27:51.183574 kernel: BTRFS info (device sda6): using free space tree
Mar 17 17:27:51.190260 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 17:27:51.191041 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:27:51.219238 ignition[1061]: INFO : Ignition 2.20.0
Mar 17 17:27:51.219238 ignition[1061]: INFO : Stage: files
Mar 17 17:27:51.219238 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:27:51.219238 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:27:51.239851 ignition[1061]: DEBUG : files: compiled without relabeling support, skipping
Mar 17 17:27:51.239851 ignition[1061]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 17 17:27:51.239851 ignition[1061]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 17 17:27:51.313631 ignition[1061]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 17 17:27:51.321443 ignition[1061]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 17 17:27:51.321443 ignition[1061]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 17 17:27:51.315697 unknown[1061]: wrote ssh authorized keys file for user: core
Mar 17 17:27:51.360649 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:27:51.372055 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Mar 17 17:27:51.406614 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 17 17:27:51.592556 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:27:51.592556 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1
Mar 17 17:27:51.963912 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 17 17:27:52.329243 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 17 17:27:52.329243 ignition[1061]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 17 17:27:52.367262 ignition[1061]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:27:52.379419 ignition[1061]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:27:52.379419 ignition[1061]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 17 17:27:52.379419 ignition[1061]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 17 17:27:52.379419 ignition[1061]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 17 17:27:52.379419 ignition[1061]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:27:52.379419 ignition[1061]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:27:52.379419 ignition[1061]: INFO : files: files passed
Mar 17 17:27:52.379419 ignition[1061]: INFO : Ignition finished successfully
Mar 17 17:27:52.380283 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 17 17:27:52.421538 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 17 17:27:52.440420 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 17 17:27:52.499346 initrd-setup-root-after-ignition[1089]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:27:52.499346 initrd-setup-root-after-ignition[1089]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:27:52.465395 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 17 17:27:52.535459 initrd-setup-root-after-ignition[1094]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:27:52.465760 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 17 17:27:52.493841 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:27:52.501534 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 17 17:27:52.527458 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 17 17:27:52.568023 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 17 17:27:52.568136 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 17 17:27:52.579159 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 17 17:27:52.591823 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 17 17:27:52.603412 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 17 17:27:52.631495 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 17 17:27:52.642569 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:27:52.667367 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 17 17:27:52.686562 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 17 17:27:52.686703 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 17 17:27:52.700247 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:27:52.713110 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:27:52.726174 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:27:52.737896 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:27:52.737957 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:27:52.756273 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:27:52.763591 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:27:52.775835 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:27:52.787684 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:27:52.799349 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 17 17:27:52.811947 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 17 17:27:52.823933 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:27:52.836639 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 17 17:27:52.848492 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 17 17:27:52.861125 systemd[1]: Stopped target swap.target - Swaps.
Mar 17 17:27:52.871445 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 17 17:27:52.871519 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:27:52.887000 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:27:52.898612 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:27:52.911688 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 17 17:27:52.911728 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:27:52.924767 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 17 17:27:52.924834 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:27:52.942514 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 17 17:27:52.942563 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:27:52.957551 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 17 17:27:52.957590 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 17 17:27:52.968703 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 17 17:27:52.968753 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 17 17:27:53.000413 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 17 17:27:53.044007 ignition[1115]: INFO : Ignition 2.20.0
Mar 17 17:27:53.044007 ignition[1115]: INFO : Stage: umount
Mar 17 17:27:53.044007 ignition[1115]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:27:53.044007 ignition[1115]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:27:53.044007 ignition[1115]: INFO : umount: umount passed
Mar 17 17:27:53.044007 ignition[1115]: INFO : Ignition finished successfully
Mar 17 17:27:53.025306 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 17 17:27:53.031078 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 17 17:27:53.031163 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:27:53.038514 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 17 17:27:53.038561 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:27:53.054611 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 17 17:27:53.054695 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 17 17:27:53.066414 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 17 17:27:53.066516 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 17 17:27:53.077809 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 17 17:27:53.077869 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 17 17:27:53.091245 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 17 17:27:53.091285 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 17 17:27:53.101364 systemd[1]: Stopped target network.target - Network.
Mar 17 17:27:53.111714 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 17 17:27:53.111771 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:27:53.123392 systemd[1]: Stopped target paths.target - Path Units.
Mar 17 17:27:53.133908 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 17 17:27:53.139942 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:27:53.147340 systemd[1]: Stopped target slices.target - Slice Units.
Mar 17 17:27:53.158250 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 17 17:27:53.169918 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 17 17:27:53.169967 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:27:53.181145 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 17 17:27:53.181183 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:27:53.192468 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 17 17:27:53.192523 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 17 17:27:53.203757 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 17 17:27:53.203799 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 17 17:27:53.215015 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 17 17:27:53.225716 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 17 17:27:53.236445 systemd-networkd[872]: eth0: DHCPv6 lease lost Mar 17 17:27:53.243470 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 17:27:53.243994 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 17:27:53.244089 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 17 17:27:53.256408 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 17:27:53.479438 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: Data path switched from VF: enP62901s1 Mar 17 17:27:53.256523 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 17 17:27:53.270346 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 17:27:53.270408 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:27:53.307429 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 17 17:27:53.318425 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 17:27:53.318496 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:27:53.330570 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 17 17:27:53.330619 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:27:53.341146 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 17:27:53.341203 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 17 17:27:53.352798 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 17 17:27:53.352837 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. 
Mar 17 17:27:53.365130 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:27:53.412892 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 17:27:53.413053 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:27:53.426617 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 17:27:53.426661 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 17 17:27:53.437413 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 17:27:53.437455 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:27:53.448540 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 17:27:53.448590 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:27:53.474090 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 17:27:53.474150 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 17 17:27:53.490913 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:27:53.490992 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:27:53.534537 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 17 17:27:53.547938 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 17 17:27:53.548006 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:27:53.560512 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 17 17:27:53.560555 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:27:53.573845 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 17 17:27:53.573890 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. 
Mar 17 17:27:53.586708 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:27:53.586759 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:27:53.599346 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 17 17:27:53.599461 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 17 17:27:53.611635 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 17:27:53.611719 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 17 17:27:57.214423 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 17:27:57.214530 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 17 17:27:57.221238 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 17 17:27:57.232119 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 17 17:27:57.232195 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 17 17:27:57.254444 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 17 17:27:57.273084 systemd[1]: Switching root. 
Mar 17 17:27:57.353410 systemd-journald[217]: Journal stopped Mar 17 17:27:42.319572 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Mar 17 17:27:42.319594 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Mon Mar 17 16:05:23 -00 2025 Mar 17 17:27:42.319602 kernel: KASLR enabled Mar 17 17:27:42.319608 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '') Mar 17 17:27:42.319615 kernel: printk: bootconsole [pl11] enabled Mar 17 17:27:42.319621 kernel: efi: EFI v2.7 by EDK II Mar 17 17:27:42.319628 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f214018 RNG=0x3fd5f998 MEMRESERVE=0x3e423d98 Mar 17 17:27:42.319635 kernel: random: crng init done Mar 17 17:27:42.319641 kernel: secureboot: Secure boot disabled Mar 17 17:27:42.319647 kernel: ACPI: Early table checksum verification disabled Mar 17 17:27:42.319653 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL) Mar 17 17:27:42.319659 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:27:42.319665 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:27:42.319673 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628) Mar 17 17:27:42.319680 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:27:42.319687 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:27:42.319693 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:27:42.319701 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:27:42.319707 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 
17:27:42.319714 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:27:42.319720 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000) Mar 17 17:27:42.319727 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 17:27:42.319733 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200 Mar 17 17:27:42.319739 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] Mar 17 17:27:42.319746 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] Mar 17 17:27:42.319752 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] Mar 17 17:27:42.319759 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] Mar 17 17:27:42.319765 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] Mar 17 17:27:42.319773 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] Mar 17 17:27:42.319780 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] Mar 17 17:27:42.319786 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] Mar 17 17:27:42.319792 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] Mar 17 17:27:42.319799 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] Mar 17 17:27:42.319805 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] Mar 17 17:27:42.319812 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] Mar 17 17:27:42.319818 kernel: NUMA: NODE_DATA [mem 0x1bf7ef800-0x1bf7f4fff] Mar 17 17:27:42.319825 kernel: Zone ranges: Mar 17 17:27:42.319831 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff] Mar 17 17:27:42.319837 kernel: DMA32 empty Mar 17 17:27:42.319844 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff] Mar 17 17:27:42.319855 kernel: Movable zone start for each node Mar 17 17:27:42.319863 kernel: Early memory node ranges Mar 17 17:27:42.319870 kernel: node 0: [mem 
0x0000000000000000-0x00000000007fffff] Mar 17 17:27:42.319877 kernel: node 0: [mem 0x0000000000824000-0x000000003e54ffff] Mar 17 17:27:42.319884 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff] Mar 17 17:27:42.319892 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff] Mar 17 17:27:42.319899 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff] Mar 17 17:27:42.319906 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff] Mar 17 17:27:42.319912 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff] Mar 17 17:27:42.319919 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff] Mar 17 17:27:42.319926 kernel: On node 0, zone DMA: 36 pages in unavailable ranges Mar 17 17:27:42.319933 kernel: psci: probing for conduit method from ACPI. Mar 17 17:27:42.319940 kernel: psci: PSCIv1.1 detected in firmware. Mar 17 17:27:42.319947 kernel: psci: Using standard PSCI v0.2 function IDs Mar 17 17:27:42.319953 kernel: psci: MIGRATE_INFO_TYPE not supported. 
Mar 17 17:27:42.319961 kernel: psci: SMC Calling Convention v1.4 Mar 17 17:27:42.319967 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0 Mar 17 17:27:42.319976 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0 Mar 17 17:27:42.319983 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 17 17:27:42.319989 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 17 17:27:42.319996 kernel: pcpu-alloc: [0] 0 [0] 1 Mar 17 17:27:42.320003 kernel: Detected PIPT I-cache on CPU0 Mar 17 17:27:42.320010 kernel: CPU features: detected: GIC system register CPU interface Mar 17 17:27:42.320017 kernel: CPU features: detected: Hardware dirty bit management Mar 17 17:27:42.320024 kernel: CPU features: detected: Spectre-BHB Mar 17 17:27:42.320031 kernel: CPU features: kernel page table isolation forced ON by KASLR Mar 17 17:27:42.320038 kernel: CPU features: detected: Kernel page table isolation (KPTI) Mar 17 17:27:42.320045 kernel: CPU features: detected: ARM erratum 1418040 Mar 17 17:27:42.320054 kernel: CPU features: detected: ARM erratum 1542419 (kernel portion) Mar 17 17:27:42.320061 kernel: CPU features: detected: SSBS not fully self-synchronizing Mar 17 17:27:42.320068 kernel: alternatives: applying boot alternatives Mar 17 17:27:42.320077 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405 Mar 17 17:27:42.320084 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Mar 17 17:27:42.320091 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 17:27:42.320098 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 17:27:42.320105 kernel: Fallback order for Node 0: 0 Mar 17 17:27:42.320113 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1032156 Mar 17 17:27:42.320120 kernel: Policy zone: Normal Mar 17 17:27:42.320126 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 17:27:42.320134 kernel: software IO TLB: area num 2. Mar 17 17:27:42.320141 kernel: software IO TLB: mapped [mem 0x0000000036620000-0x000000003a620000] (64MB) Mar 17 17:27:42.320149 kernel: Memory: 3982376K/4194160K available (10240K kernel code, 2186K rwdata, 8100K rodata, 39744K init, 897K bss, 211784K reserved, 0K cma-reserved) Mar 17 17:27:42.320156 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 17 17:27:42.320162 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 17 17:27:42.320170 kernel: rcu: RCU event tracing is enabled. Mar 17 17:27:42.320177 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 17 17:27:42.320184 kernel: Trampoline variant of Tasks RCU enabled. Mar 17 17:27:42.320191 kernel: Tracing variant of Tasks RCU enabled. Mar 17 17:27:42.320198 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 17 17:27:42.320205 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 17 17:27:42.320213 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 17 17:27:42.320219 kernel: GICv3: 960 SPIs implemented Mar 17 17:27:42.320226 kernel: GICv3: 0 Extended SPIs implemented Mar 17 17:27:42.320242 kernel: Root IRQ handler: gic_handle_irq Mar 17 17:27:42.320251 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Mar 17 17:27:42.320258 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000 Mar 17 17:27:42.320265 kernel: ITS: No ITS available, not enabling LPIs Mar 17 17:27:42.320277 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 17 17:27:42.320284 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:27:42.320291 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Mar 17 17:27:42.320298 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Mar 17 17:27:42.320305 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Mar 17 17:27:42.320313 kernel: Console: colour dummy device 80x25 Mar 17 17:27:42.320321 kernel: printk: console [tty1] enabled Mar 17 17:27:42.320328 kernel: ACPI: Core revision 20230628 Mar 17 17:27:42.320335 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Mar 17 17:27:42.320342 kernel: pid_max: default: 32768 minimum: 301 Mar 17 17:27:42.320349 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 17 17:27:42.320356 kernel: landlock: Up and running. Mar 17 17:27:42.320363 kernel: SELinux: Initializing. 
Mar 17 17:27:42.320370 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:27:42.320379 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:27:42.320386 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 17 17:27:42.320394 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 17 17:27:42.320401 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3a8030, hints 0xe, misc 0x31e1 Mar 17 17:27:42.320408 kernel: Hyper-V: Host Build 10.0.22477.1619-1-0 Mar 17 17:27:42.320415 kernel: Hyper-V: enabling crash_kexec_post_notifiers Mar 17 17:27:42.320422 kernel: rcu: Hierarchical SRCU implementation. Mar 17 17:27:42.320437 kernel: rcu: Max phase no-delay instances is 400. Mar 17 17:27:42.320444 kernel: Remapping and enabling EFI services. Mar 17 17:27:42.320452 kernel: smp: Bringing up secondary CPUs ... Mar 17 17:27:42.320459 kernel: Detected PIPT I-cache on CPU1 Mar 17 17:27:42.320467 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000 Mar 17 17:27:42.320475 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:27:42.320483 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Mar 17 17:27:42.320490 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 17:27:42.320498 kernel: SMP: Total of 2 processors activated. 
Mar 17 17:27:42.320505 kernel: CPU features: detected: 32-bit EL0 Support Mar 17 17:27:42.320514 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence Mar 17 17:27:42.320522 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Mar 17 17:27:42.320529 kernel: CPU features: detected: CRC32 instructions Mar 17 17:27:42.320536 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Mar 17 17:27:42.320544 kernel: CPU features: detected: LSE atomic instructions Mar 17 17:27:42.320551 kernel: CPU features: detected: Privileged Access Never Mar 17 17:27:42.320559 kernel: CPU: All CPU(s) started at EL1 Mar 17 17:27:42.320566 kernel: alternatives: applying system-wide alternatives Mar 17 17:27:42.320573 kernel: devtmpfs: initialized Mar 17 17:27:42.320582 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 17:27:42.320590 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 17 17:27:42.320597 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 17:27:42.320605 kernel: SMBIOS 3.1.0 present. 
Mar 17 17:27:42.320612 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024 Mar 17 17:27:42.320620 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 17:27:42.320627 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 17 17:27:42.320635 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 17 17:27:42.320644 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 17 17:27:42.320651 kernel: audit: initializing netlink subsys (disabled) Mar 17 17:27:42.320659 kernel: audit: type=2000 audit(0.047:1): state=initialized audit_enabled=0 res=1 Mar 17 17:27:42.320666 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 17:27:42.320674 kernel: cpuidle: using governor menu Mar 17 17:27:42.320681 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 17 17:27:42.320688 kernel: ASID allocator initialised with 32768 entries Mar 17 17:27:42.320696 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 17:27:42.320703 kernel: Serial: AMBA PL011 UART driver Mar 17 17:27:42.320712 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Mar 17 17:27:42.320720 kernel: Modules: 0 pages in range for non-PLT usage Mar 17 17:27:42.320727 kernel: Modules: 508944 pages in range for PLT usage Mar 17 17:27:42.320734 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 17:27:42.320742 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 17 17:27:42.320750 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 17 17:27:42.320757 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 17 17:27:42.320764 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 17:27:42.320772 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 17 17:27:42.320780 kernel: HugeTLB: 
registered 64.0 KiB page size, pre-allocated 0 pages Mar 17 17:27:42.320788 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 17 17:27:42.320795 kernel: ACPI: Added _OSI(Module Device) Mar 17 17:27:42.320803 kernel: ACPI: Added _OSI(Processor Device) Mar 17 17:27:42.320810 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 17:27:42.320817 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 17:27:42.320825 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 17:27:42.320832 kernel: ACPI: Interpreter enabled Mar 17 17:27:42.320839 kernel: ACPI: Using GIC for interrupt routing Mar 17 17:27:42.320847 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA Mar 17 17:27:42.320856 kernel: printk: console [ttyAMA0] enabled Mar 17 17:27:42.320864 kernel: printk: bootconsole [pl11] disabled Mar 17 17:27:42.320872 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA Mar 17 17:27:42.320879 kernel: iommu: Default domain type: Translated Mar 17 17:27:42.320887 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 17 17:27:42.320894 kernel: efivars: Registered efivars operations Mar 17 17:27:42.320902 kernel: vgaarb: loaded Mar 17 17:27:42.320909 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 17 17:27:42.320916 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 17:27:42.320926 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 17:27:42.320934 kernel: pnp: PnP ACPI init Mar 17 17:27:42.320941 kernel: pnp: PnP ACPI: found 0 devices Mar 17 17:27:42.320948 kernel: NET: Registered PF_INET protocol family Mar 17 17:27:42.320956 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 17 17:27:42.320963 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 17 17:27:42.320971 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 
17:27:42.320979 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 17 17:27:42.320987 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 17 17:27:42.320995 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 17 17:27:42.321002 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 17:27:42.321010 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 17:27:42.321018 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 17:27:42.321025 kernel: PCI: CLS 0 bytes, default 64 Mar 17 17:27:42.321032 kernel: kvm [1]: HYP mode not available Mar 17 17:27:42.321040 kernel: Initialise system trusted keyrings Mar 17 17:27:42.321047 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 17 17:27:42.321057 kernel: Key type asymmetric registered Mar 17 17:27:42.321064 kernel: Asymmetric key parser 'x509' registered Mar 17 17:27:42.321072 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 17 17:27:42.321079 kernel: io scheduler mq-deadline registered Mar 17 17:27:42.321086 kernel: io scheduler kyber registered Mar 17 17:27:42.321094 kernel: io scheduler bfq registered Mar 17 17:27:42.321102 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 17:27:42.321109 kernel: thunder_xcv, ver 1.0 Mar 17 17:27:42.321116 kernel: thunder_bgx, ver 1.0 Mar 17 17:27:42.321124 kernel: nicpf, ver 1.0 Mar 17 17:27:42.321133 kernel: nicvf, ver 1.0 Mar 17 17:27:42.321264 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 17 17:27:42.321337 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-17T17:27:41 UTC (1742232461) Mar 17 17:27:42.321348 kernel: efifb: probing for efifb Mar 17 17:27:42.321355 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k Mar 17 17:27:42.321363 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 Mar 17 17:27:42.321370 kernel: efifb: scrolling: 
redraw Mar 17 17:27:42.321380 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 17 17:27:42.321388 kernel: Console: switching to colour frame buffer device 128x48 Mar 17 17:27:42.321395 kernel: fb0: EFI VGA frame buffer device Mar 17 17:27:42.321402 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping .... Mar 17 17:27:42.321410 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 17 17:27:42.321418 kernel: No ACPI PMU IRQ for CPU0 Mar 17 17:27:42.321426 kernel: No ACPI PMU IRQ for CPU1 Mar 17 17:27:42.321433 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 1 counters available Mar 17 17:27:42.321440 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 17 17:27:42.321449 kernel: watchdog: Hard watchdog permanently disabled Mar 17 17:27:42.321457 kernel: NET: Registered PF_INET6 protocol family Mar 17 17:27:42.321464 kernel: Segment Routing with IPv6 Mar 17 17:27:42.321472 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 17:27:42.321479 kernel: NET: Registered PF_PACKET protocol family Mar 17 17:27:42.321487 kernel: Key type dns_resolver registered Mar 17 17:27:42.321494 kernel: registered taskstats version 1 Mar 17 17:27:42.321502 kernel: Loading compiled-in X.509 certificates Mar 17 17:27:42.321509 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 74c9b4f5dfad711856d7363c976664fc02c1e24c' Mar 17 17:27:42.321517 kernel: Key type .fscrypt registered Mar 17 17:27:42.321526 kernel: Key type fscrypt-provisioning registered Mar 17 17:27:42.321533 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 17:27:42.321541 kernel: ima: Allocated hash algorithm: sha1 Mar 17 17:27:42.321548 kernel: ima: No architecture policies found Mar 17 17:27:42.321556 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 17 17:27:42.321564 kernel: clk: Disabling unused clocks Mar 17 17:27:42.321571 kernel: Freeing unused kernel memory: 39744K Mar 17 17:27:42.321579 kernel: Run /init as init process Mar 17 17:27:42.321587 kernel: with arguments: Mar 17 17:27:42.321595 kernel: /init Mar 17 17:27:42.321602 kernel: with environment: Mar 17 17:27:42.321610 kernel: HOME=/ Mar 17 17:27:42.321618 kernel: TERM=linux Mar 17 17:27:42.321625 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 17:27:42.321634 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 17 17:27:42.321644 systemd[1]: Detected virtualization microsoft. Mar 17 17:27:42.321653 systemd[1]: Detected architecture arm64. Mar 17 17:27:42.321661 systemd[1]: Running in initrd. Mar 17 17:27:42.321669 systemd[1]: No hostname configured, using default hostname. Mar 17 17:27:42.321677 systemd[1]: Hostname set to . Mar 17 17:27:42.321686 systemd[1]: Initializing machine ID from random generator. Mar 17 17:27:42.321694 systemd[1]: Queued start job for default target initrd.target. Mar 17 17:27:42.321702 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:27:42.321710 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:27:42.321720 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Mar 17 17:27:42.321728 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 17:27:42.321736 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 17 17:27:42.321745 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 17 17:27:42.321754 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 17 17:27:42.321762 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 17 17:27:42.321770 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:27:42.321780 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:27:42.321788 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:27:42.321796 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:27:42.321804 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:27:42.321812 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:27:42.321820 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:27:42.321828 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:27:42.321837 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 17 17:27:42.321846 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 17 17:27:42.321854 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:27:42.321862 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:27:42.321870 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:27:42.321878 systemd[1]: Reached target sockets.target - Socket Units. 
Mar 17 17:27:42.321887 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 17 17:27:42.321895 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:27:42.321903 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 17 17:27:42.321911 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 17:27:42.321920 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:27:42.321928 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:27:42.321949 systemd-journald[217]: Collecting audit messages is disabled. Mar 17 17:27:42.321969 systemd-journald[217]: Journal started Mar 17 17:27:42.321990 systemd-journald[217]: Runtime Journal (/run/log/journal/fbd10789bbf84b96aee41a602ead7456) is 8.0M, max 78.5M, 70.5M free. Mar 17 17:27:42.333666 systemd-modules-load[218]: Inserted module 'overlay' Mar 17 17:27:42.339239 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:27:42.367245 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:27:42.367305 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 17:27:42.376151 systemd-modules-load[218]: Inserted module 'br_netfilter' Mar 17 17:27:42.381574 kernel: Bridge firewalling registered Mar 17 17:27:42.376606 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 17 17:27:42.387802 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:27:42.402249 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 17:27:42.411216 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:27:42.421670 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 17 17:27:42.441504 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:27:42.449553 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:27:42.474395 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:27:42.492386 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:27:42.502259 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:27:42.516256 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:27:42.538208 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:27:42.545635 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:27:42.572449 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 17 17:27:42.581422 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 17 17:27:42.602253 dracut-cmdline[252]: dracut-dracut-053 Mar 17 17:27:42.602253 dracut-cmdline[252]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405 Mar 17 17:27:42.648471 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:27:42.661467 systemd-resolved[255]: Positive Trust Anchors: Mar 17 17:27:42.661478 systemd-resolved[255]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:27:42.661508 systemd-resolved[255]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:27:42.665300 systemd-resolved[255]: Defaulting to hostname 'linux'. Mar 17 17:27:42.669529 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:27:42.688866 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:27:42.734576 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:27:42.795281 kernel: SCSI subsystem initialized Mar 17 17:27:42.803254 kernel: Loading iSCSI transport class v2.0-870. Mar 17 17:27:42.815932 kernel: iscsi: registered transport (tcp) Mar 17 17:27:42.833015 kernel: iscsi: registered transport (qla4xxx) Mar 17 17:27:42.833069 kernel: QLogic iSCSI HBA Driver Mar 17 17:27:42.871034 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 17 17:27:42.887357 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 17 17:27:42.922031 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 17 17:27:42.922099 kernel: device-mapper: uevent: version 1.0.3 Mar 17 17:27:42.928962 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 17 17:27:42.976261 kernel: raid6: neonx8 gen() 15767 MB/s Mar 17 17:27:42.996247 kernel: raid6: neonx4 gen() 15665 MB/s Mar 17 17:27:43.016244 kernel: raid6: neonx2 gen() 13281 MB/s Mar 17 17:27:43.037245 kernel: raid6: neonx1 gen() 10486 MB/s Mar 17 17:27:43.057244 kernel: raid6: int64x8 gen() 6974 MB/s Mar 17 17:27:43.077244 kernel: raid6: int64x4 gen() 7344 MB/s Mar 17 17:27:43.098246 kernel: raid6: int64x2 gen() 6131 MB/s Mar 17 17:27:43.121708 kernel: raid6: int64x1 gen() 5061 MB/s Mar 17 17:27:43.121737 kernel: raid6: using algorithm neonx8 gen() 15767 MB/s Mar 17 17:27:43.146634 kernel: raid6: .... xor() 11941 MB/s, rmw enabled Mar 17 17:27:43.146650 kernel: raid6: using neon recovery algorithm Mar 17 17:27:43.157956 kernel: xor: measuring software checksum speed Mar 17 17:27:43.157971 kernel: 8regs : 19735 MB/sec Mar 17 17:27:43.165081 kernel: 32regs : 18769 MB/sec Mar 17 17:27:43.165093 kernel: arm64_neon : 27096 MB/sec Mar 17 17:27:43.169337 kernel: xor: using function: arm64_neon (27096 MB/sec) Mar 17 17:27:43.219255 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 17 17:27:43.228620 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:27:43.251391 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:27:43.273843 systemd-udevd[440]: Using default interface naming scheme 'v255'. Mar 17 17:27:43.279584 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:27:43.310500 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 17 17:27:43.327569 dracut-pre-trigger[452]: rd.md=0: removing MD RAID activation Mar 17 17:27:43.355289 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Mar 17 17:27:43.369444 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 17:27:43.410198 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:27:43.429441 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 17 17:27:43.456186 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 17 17:27:43.470201 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 17 17:27:43.485863 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:27:43.499699 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 17:27:43.516422 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 17 17:27:43.533322 kernel: hv_vmbus: Vmbus version:5.3 Mar 17 17:27:43.544012 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 17 17:27:43.573944 kernel: hv_vmbus: registering driver hyperv_keyboard Mar 17 17:27:43.573984 kernel: hv_vmbus: registering driver hid_hyperv Mar 17 17:27:43.573996 kernel: hv_vmbus: registering driver hv_netvsc Mar 17 17:27:43.574006 kernel: hv_vmbus: registering driver hv_storvsc Mar 17 17:27:43.574016 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 17 17:27:43.573785 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:27:43.607277 kernel: scsi host1: storvsc_host_t Mar 17 17:27:43.607459 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 17 17:27:43.607471 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0 Mar 17 17:27:43.607481 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1 Mar 17 17:27:43.573927 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Mar 17 17:27:43.660454 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Mar 17 17:27:43.660616 kernel: scsi host0: storvsc_host_t Mar 17 17:27:43.660708 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Mar 17 17:27:43.639512 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:27:43.653519 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:27:43.687419 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 0 Mar 17 17:27:43.653772 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:27:43.708423 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: VF slot 1 added Mar 17 17:27:43.660641 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:27:43.682579 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:27:43.723263 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:27:43.753282 kernel: hv_vmbus: registering driver hv_pci Mar 17 17:27:43.753322 kernel: PTP clock support registered Mar 17 17:27:43.753433 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Mar 17 17:27:43.777458 kernel: hv_utils: Registering HyperV Utility Driver Mar 17 17:27:43.777481 kernel: hv_vmbus: registering driver hv_utils Mar 17 17:27:43.795948 kernel: hv_utils: Heartbeat IC version 3.0 Mar 17 17:27:43.795989 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Mar 17 17:27:43.920523 kernel: hv_utils: Shutdown IC version 3.2 Mar 17 17:27:43.920540 kernel: hv_utils: TimeSync IC version 4.0 Mar 17 17:27:43.920550 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 17 17:27:43.920560 kernel: hv_pci 8cfe3a2d-f5b5-43e2-965d-57ce49de7108: PCI VMBus probing: Using version 0x10004 Mar 17 17:27:44.016304 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Mar 17 17:27:44.016446 kernel: hv_pci 8cfe3a2d-f5b5-43e2-965d-57ce49de7108: PCI host bridge to bus f5b5:00 Mar 17 17:27:44.016539 kernel: pci_bus f5b5:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Mar 17 17:27:44.016636 kernel: pci_bus f5b5:00: No busn resource found for root bus, will use [bus 00-ff] Mar 17 17:27:44.016712 kernel: pci f5b5:00:02.0: [15b3:1018] type 00 class 0x020000 Mar 17 17:27:44.016806 kernel: pci f5b5:00:02.0: reg 0x10: [mem 0xfc0000000-0xfc00fffff 64bit pref] Mar 17 17:27:44.016887 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Mar 17 17:27:44.027936 kernel: pci f5b5:00:02.0: enabling Extended Tags Mar 17 17:27:44.028072 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Mar 17 17:27:44.028171 kernel: pci f5b5:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at f5b5:00:02.0 (capable of 126.016 Gb/s with 8.0 GT/s PCIe x16 link) Mar 17 17:27:44.028779 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 17 17:27:44.028879 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Mar 17 17:27:44.028961 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Mar 17 17:27:44.029044 kernel: pci_bus f5b5:00: busn_res: [bus 00-ff] end is updated to 00 Mar 17 17:27:44.029127 kernel: pci f5b5:00:02.0: BAR 0: 
assigned [mem 0xfc0000000-0xfc00fffff 64bit pref] Mar 17 17:27:44.029216 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 17:27:44.029242 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 17 17:27:43.904330 systemd-resolved[255]: Clock change detected. Flushing caches. Mar 17 17:27:43.950588 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:27:44.082085 kernel: mlx5_core f5b5:00:02.0: enabling device (0000 -> 0002) Mar 17 17:27:44.300954 kernel: mlx5_core f5b5:00:02.0: firmware version: 16.30.1284 Mar 17 17:27:44.301085 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: VF registering: eth1 Mar 17 17:27:44.301177 kernel: mlx5_core f5b5:00:02.0 eth1: joined to eth0 Mar 17 17:27:44.301297 kernel: mlx5_core f5b5:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Mar 17 17:27:44.308246 kernel: mlx5_core f5b5:00:02.0 enP62901s1: renamed from eth1 Mar 17 17:27:44.731522 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Mar 17 17:27:45.083967 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Mar 17 17:27:45.099536 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (497) Mar 17 17:27:45.109572 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Mar 17 17:27:45.133384 kernel: BTRFS: device fsid c0c482e3-6885-4a4e-b31c-6bc8f8c403e7 devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (486) Mar 17 17:27:45.144676 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Mar 17 17:27:45.152165 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Mar 17 17:27:45.181396 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Mar 17 17:27:45.207251 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 17:27:45.215247 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 17:27:46.225405 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 17:27:46.226710 disk-uuid[602]: The operation has completed successfully. Mar 17 17:27:46.284809 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 17:27:46.284905 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 17 17:27:46.308373 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 17 17:27:46.321627 sh[688]: Success Mar 17 17:27:46.341263 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 17 17:27:46.572075 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 17 17:27:46.578627 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 17 17:27:46.601366 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 17 17:27:46.630908 kernel: BTRFS info (device dm-0): first mount of filesystem c0c482e3-6885-4a4e-b31c-6bc8f8c403e7 Mar 17 17:27:46.630950 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:27:46.638033 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 17 17:27:46.643198 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 17 17:27:46.647747 kernel: BTRFS info (device dm-0): using free space tree Mar 17 17:27:47.015882 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 17 17:27:47.021876 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 17 17:27:47.043488 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 17 17:27:47.050834 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Mar 17 17:27:47.094822 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:27:47.094844 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:27:47.094854 kernel: BTRFS info (device sda6): using free space tree Mar 17 17:27:47.111324 kernel: BTRFS info (device sda6): auto enabling async discard Mar 17 17:27:47.125783 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 17 17:27:47.130515 kernel: BTRFS info (device sda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:27:47.137955 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 17 17:27:47.152499 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 17 17:27:47.210706 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:27:47.229358 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 17 17:27:47.255493 systemd-networkd[872]: lo: Link UP Mar 17 17:27:47.255504 systemd-networkd[872]: lo: Gained carrier Mar 17 17:27:47.257439 systemd-networkd[872]: Enumeration completed Mar 17 17:27:47.259084 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 17 17:27:47.265624 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:27:47.265628 systemd-networkd[872]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 17:27:47.266345 systemd[1]: Reached target network.target - Network. 
Mar 17 17:27:47.358259 kernel: mlx5_core f5b5:00:02.0 enP62901s1: Link up Mar 17 17:27:47.397274 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: Data path switched to VF: enP62901s1 Mar 17 17:27:47.396889 systemd-networkd[872]: enP62901s1: Link UP Mar 17 17:27:47.396976 systemd-networkd[872]: eth0: Link UP Mar 17 17:27:47.397135 systemd-networkd[872]: eth0: Gained carrier Mar 17 17:27:47.397144 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:27:47.409716 systemd-networkd[872]: enP62901s1: Gained carrier Mar 17 17:27:47.431274 systemd-networkd[872]: eth0: DHCPv4 address 10.200.20.22/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 17 17:27:48.099605 ignition[794]: Ignition 2.20.0 Mar 17 17:27:48.099617 ignition[794]: Stage: fetch-offline Mar 17 17:27:48.103863 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 17 17:27:48.099649 ignition[794]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:27:48.099658 ignition[794]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 17:27:48.099748 ignition[794]: parsed url from cmdline: "" Mar 17 17:27:48.099751 ignition[794]: no config URL provided Mar 17 17:27:48.099755 ignition[794]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 17:27:48.134490 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Mar 17 17:27:48.099763 ignition[794]: no config at "/usr/lib/ignition/user.ign" Mar 17 17:27:48.099768 ignition[794]: failed to fetch config: resource requires networking Mar 17 17:27:48.099932 ignition[794]: Ignition finished successfully Mar 17 17:27:48.160212 ignition[882]: Ignition 2.20.0 Mar 17 17:27:48.160219 ignition[882]: Stage: fetch Mar 17 17:27:48.160424 ignition[882]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:27:48.160434 ignition[882]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 17:27:48.160528 ignition[882]: parsed url from cmdline: "" Mar 17 17:27:48.160531 ignition[882]: no config URL provided Mar 17 17:27:48.160535 ignition[882]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 17:27:48.160546 ignition[882]: no config at "/usr/lib/ignition/user.ign" Mar 17 17:27:48.160573 ignition[882]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Mar 17 17:27:48.258056 ignition[882]: GET result: OK Mar 17 17:27:48.258125 ignition[882]: config has been read from IMDS userdata Mar 17 17:27:48.258172 ignition[882]: parsing config with SHA512: 8f975b6895a36971b4ecc3dd0851a631055163f6a1cf9e5648d9323ef95dcfd1da82756cc152b28fb89291371b4629b4519d8ccd19d2c44f7fbcb766f9e02644 Mar 17 17:27:48.262744 unknown[882]: fetched base config from "system" Mar 17 17:27:48.263129 ignition[882]: fetch: fetch complete Mar 17 17:27:48.262751 unknown[882]: fetched base config from "system" Mar 17 17:27:48.263133 ignition[882]: fetch: fetch passed Mar 17 17:27:48.262756 unknown[882]: fetched user config from "azure" Mar 17 17:27:48.263174 ignition[882]: Ignition finished successfully Mar 17 17:27:48.268040 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 17 17:27:48.297084 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Mar 17 17:27:48.317412 ignition[888]: Ignition 2.20.0 Mar 17 17:27:48.317424 ignition[888]: Stage: kargs Mar 17 17:27:48.321844 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 17 17:27:48.317585 ignition[888]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:27:48.317594 ignition[888]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 17:27:48.318470 ignition[888]: kargs: kargs passed Mar 17 17:27:48.349444 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 17 17:27:48.318515 ignition[888]: Ignition finished successfully Mar 17 17:27:48.371470 ignition[894]: Ignition 2.20.0 Mar 17 17:27:48.371480 ignition[894]: Stage: disks Mar 17 17:27:48.378585 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 17 17:27:48.371642 ignition[894]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:27:48.384533 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 17 17:27:48.371650 ignition[894]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 17:27:48.395940 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 17 17:27:48.372488 ignition[894]: disks: disks passed Mar 17 17:27:48.407809 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 17 17:27:48.372526 ignition[894]: Ignition finished successfully Mar 17 17:27:48.419588 systemd[1]: Reached target sysinit.target - System Initialization. Mar 17 17:27:48.431289 systemd[1]: Reached target basic.target - Basic System. Mar 17 17:27:48.456470 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 17 17:27:48.526485 systemd-fsck[902]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks Mar 17 17:27:48.532260 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 17 17:27:48.551434 systemd[1]: Mounting sysroot.mount - /sysroot... 
Mar 17 17:27:48.611254 kernel: EXT4-fs (sda9): mounted filesystem 6b579bf2-7716-4d59-98eb-b92ea668693e r/w with ordered data mode. Quota mode: none. Mar 17 17:27:48.611486 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 17 17:27:48.616671 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 17 17:27:48.669302 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:27:48.680199 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 17 17:27:48.689400 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Mar 17 17:27:48.696771 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 17 17:27:48.736366 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (913) Mar 17 17:27:48.696805 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 17 17:27:48.761054 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:27:48.761081 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:27:48.761092 kernel: BTRFS info (device sda6): using free space tree Mar 17 17:27:48.724897 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 17 17:27:48.771019 kernel: BTRFS info (device sda6): auto enabling async discard Mar 17 17:27:48.771472 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 17 17:27:48.785510 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 17 17:27:49.035396 systemd-networkd[872]: eth0: Gained IPv6LL Mar 17 17:27:49.227386 systemd-networkd[872]: enP62901s1: Gained IPv6LL Mar 17 17:27:49.396872 coreos-metadata[915]: Mar 17 17:27:49.396 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 17 17:27:49.406997 coreos-metadata[915]: Mar 17 17:27:49.406 INFO Fetch successful Mar 17 17:27:49.412518 coreos-metadata[915]: Mar 17 17:27:49.412 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Mar 17 17:27:49.434379 coreos-metadata[915]: Mar 17 17:27:49.434 INFO Fetch successful Mar 17 17:27:49.451058 coreos-metadata[915]: Mar 17 17:27:49.451 INFO wrote hostname ci-4152.2.2-a-6c46d54d7c to /sysroot/etc/hostname Mar 17 17:27:49.460627 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 17 17:27:49.829184 initrd-setup-root[943]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 17:27:50.124907 initrd-setup-root[950]: cut: /sysroot/etc/group: No such file or directory Mar 17 17:27:50.150451 initrd-setup-root[957]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 17:27:50.170886 initrd-setup-root[964]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 17:27:50.998153 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 17 17:27:51.015449 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 17 17:27:51.027406 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 17 17:27:51.047423 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 17 17:27:51.053251 kernel: BTRFS info (device sda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:27:51.067762 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Mar 17 17:27:51.085400 ignition[1034]: INFO : Ignition 2.20.0 Mar 17 17:27:51.085400 ignition[1034]: INFO : Stage: mount Mar 17 17:27:51.095125 ignition[1034]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:27:51.095125 ignition[1034]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 17:27:51.095125 ignition[1034]: INFO : mount: mount passed Mar 17 17:27:51.095125 ignition[1034]: INFO : Ignition finished successfully Mar 17 17:27:51.095150 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 17 17:27:51.124437 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 17 17:27:51.143360 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:27:51.165582 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1044) Mar 17 17:27:51.183506 kernel: BTRFS info (device sda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:27:51.183560 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:27:51.183574 kernel: BTRFS info (device sda6): using free space tree Mar 17 17:27:51.190260 kernel: BTRFS info (device sda6): auto enabling async discard Mar 17 17:27:51.191041 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 17 17:27:51.219238 ignition[1061]: INFO : Ignition 2.20.0 Mar 17 17:27:51.219238 ignition[1061]: INFO : Stage: files Mar 17 17:27:51.219238 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:27:51.219238 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 17:27:51.239851 ignition[1061]: DEBUG : files: compiled without relabeling support, skipping Mar 17 17:27:51.239851 ignition[1061]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 17 17:27:51.239851 ignition[1061]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 17 17:27:51.313631 ignition[1061]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 17 17:27:51.321443 ignition[1061]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 17 17:27:51.321443 ignition[1061]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 17 17:27:51.315697 unknown[1061]: wrote ssh authorized keys file for user: core Mar 17 17:27:51.360649 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 17 17:27:51.372055 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Mar 17 17:27:51.406614 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 17 17:27:51.592556 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 17 17:27:51.592556 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 17 17:27:51.614495 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1 Mar 17 17:27:51.963912 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 17 17:27:52.329243 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 17 17:27:52.329243 ignition[1061]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 17 17:27:52.367262 ignition[1061]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 17:27:52.379419 ignition[1061]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 17:27:52.379419 ignition[1061]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 17 17:27:52.379419 ignition[1061]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 17 17:27:52.379419 ignition[1061]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 17 17:27:52.379419 ignition[1061]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 17 17:27:52.379419 ignition[1061]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 17 17:27:52.379419 ignition[1061]: INFO : files: files passed Mar 17 17:27:52.379419 ignition[1061]: INFO : Ignition finished successfully Mar 17 17:27:52.380283 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 17 17:27:52.421538 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 17 17:27:52.440420 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 17 17:27:52.499346 initrd-setup-root-after-ignition[1089]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:27:52.499346 initrd-setup-root-after-ignition[1089]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:27:52.465395 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 17 17:27:52.535459 initrd-setup-root-after-ignition[1094]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:27:52.465760 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 17 17:27:52.493841 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 17 17:27:52.501534 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 17 17:27:52.527458 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 17 17:27:52.568023 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 17 17:27:52.568136 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 17 17:27:52.579159 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 17 17:27:52.591823 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 17 17:27:52.603412 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 17 17:27:52.631495 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 17 17:27:52.642569 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 17 17:27:52.667367 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 17 17:27:52.686562 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 17 17:27:52.686703 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Mar 17 17:27:52.700247 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:27:52.713110 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:27:52.726174 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:27:52.737896 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:27:52.737957 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:27:52.756273 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:27:52.763591 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:27:52.775835 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:27:52.787684 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:27:52.799349 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 17 17:27:52.811947 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 17 17:27:52.823933 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:27:52.836639 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 17 17:27:52.848492 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 17 17:27:52.861125 systemd[1]: Stopped target swap.target - Swaps.
Mar 17 17:27:52.871445 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 17 17:27:52.871519 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:27:52.887000 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:27:52.898612 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:27:52.911688 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 17 17:27:52.911728 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:27:52.924767 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 17 17:27:52.924834 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:27:52.942514 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 17 17:27:52.942563 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:27:52.957551 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 17 17:27:52.957590 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 17 17:27:52.968703 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 17 17:27:52.968753 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 17 17:27:53.000413 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 17 17:27:53.044007 ignition[1115]: INFO : Ignition 2.20.0
Mar 17 17:27:53.044007 ignition[1115]: INFO : Stage: umount
Mar 17 17:27:53.044007 ignition[1115]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:27:53.044007 ignition[1115]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 17:27:53.044007 ignition[1115]: INFO : umount: umount passed
Mar 17 17:27:53.044007 ignition[1115]: INFO : Ignition finished successfully
Mar 17 17:27:53.025306 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 17 17:27:53.031078 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 17 17:27:53.031163 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:27:53.038514 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 17 17:27:53.038561 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:27:53.054611 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 17 17:27:53.054695 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 17 17:27:53.066414 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 17 17:27:53.066516 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 17 17:27:53.077809 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 17 17:27:53.077869 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 17 17:27:53.091245 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 17 17:27:53.091285 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 17 17:27:53.101364 systemd[1]: Stopped target network.target - Network.
Mar 17 17:27:53.111714 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 17 17:27:53.111771 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:27:53.123392 systemd[1]: Stopped target paths.target - Path Units.
Mar 17 17:27:53.133908 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 17 17:27:53.139942 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:27:53.147340 systemd[1]: Stopped target slices.target - Slice Units.
Mar 17 17:27:53.158250 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 17 17:27:53.169918 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 17 17:27:53.169967 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:27:53.181145 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 17 17:27:53.181183 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:27:53.192468 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 17 17:27:53.192523 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 17 17:27:53.203757 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 17 17:27:53.203799 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 17 17:27:53.215015 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 17 17:27:53.225716 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 17 17:27:53.236445 systemd-networkd[872]: eth0: DHCPv6 lease lost
Mar 17 17:27:53.243470 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 17 17:27:53.243994 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 17 17:27:53.244089 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 17 17:27:53.256408 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 17 17:27:53.479438 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: Data path switched from VF: enP62901s1
Mar 17 17:27:53.256523 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 17 17:27:53.270346 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 17 17:27:53.270408 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:27:53.307429 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 17 17:27:53.318425 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 17 17:27:53.318496 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:27:53.330570 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 17 17:27:53.330619 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:27:53.341146 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 17 17:27:53.341203 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:27:53.352798 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 17 17:27:53.352837 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:27:53.365130 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:27:53.412892 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 17 17:27:53.413053 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:27:53.426617 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 17 17:27:53.426661 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:27:53.437413 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 17 17:27:53.437455 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:27:53.448540 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 17 17:27:53.448590 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:27:53.474090 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 17 17:27:53.474150 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:27:53.490913 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:27:53.490992 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:27:53.534537 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 17 17:27:53.547938 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 17 17:27:53.548006 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:27:53.560512 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 17 17:27:53.560555 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 17 17:27:53.573845 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 17 17:27:53.573890 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:27:53.586708 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:27:53.586759 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:27:53.599346 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 17 17:27:53.599461 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 17 17:27:53.611635 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 17 17:27:53.611719 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 17 17:27:57.214423 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 17 17:27:57.214530 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 17 17:27:57.221238 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 17 17:27:57.232119 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 17 17:27:57.232195 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 17 17:27:57.254444 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 17 17:27:57.273084 systemd[1]: Switching root.
Mar 17 17:27:57.353410 systemd-journald[217]: Journal stopped
Mar 17 17:28:02.496534 systemd-journald[217]: Received SIGTERM from PID 1 (systemd).
Mar 17 17:28:02.496560 kernel: SELinux: policy capability network_peer_controls=1
Mar 17 17:28:02.496570 kernel: SELinux: policy capability open_perms=1
Mar 17 17:28:02.496580 kernel: SELinux: policy capability extended_socket_class=1
Mar 17 17:28:02.496588 kernel: SELinux: policy capability always_check_network=0
Mar 17 17:28:02.496595 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 17 17:28:02.496604 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 17 17:28:02.496612 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 17 17:28:02.496620 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 17 17:28:02.496628 kernel: audit: type=1403 audit(1742232478.667:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 17 17:28:02.496638 systemd[1]: Successfully loaded SELinux policy in 162.628ms.
Mar 17 17:28:02.496648 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.312ms.
Mar 17 17:28:02.496659 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 17 17:28:02.496668 systemd[1]: Detected virtualization microsoft.
Mar 17 17:28:02.496677 systemd[1]: Detected architecture arm64.
Mar 17 17:28:02.496687 systemd[1]: Detected first boot.
Mar 17 17:28:02.496698 systemd[1]: Hostname set to .
Mar 17 17:28:02.496707 systemd[1]: Initializing machine ID from random generator.
Mar 17 17:28:02.496715 zram_generator::config[1156]: No configuration found.
Mar 17 17:28:02.496725 systemd[1]: Populated /etc with preset unit settings.
Mar 17 17:28:02.496734 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 17 17:28:02.496744 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 17 17:28:02.496753 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:28:02.496763 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 17 17:28:02.496772 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 17 17:28:02.496781 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 17 17:28:02.496790 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 17 17:28:02.496800 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 17 17:28:02.496810 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 17 17:28:02.496820 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 17 17:28:02.496829 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 17 17:28:02.496838 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:28:02.496847 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:28:02.496856 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 17 17:28:02.496866 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 17 17:28:02.496875 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 17 17:28:02.496885 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 17:28:02.496896 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 17 17:28:02.496905 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:28:02.496914 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 17 17:28:02.496925 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 17 17:28:02.496935 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:28:02.496946 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 17 17:28:02.496955 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:28:02.496966 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:28:02.496975 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 17:28:02.496984 systemd[1]: Reached target swap.target - Swaps.
Mar 17 17:28:02.496994 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 17 17:28:02.497003 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 17 17:28:02.497012 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:28:02.497021 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:28:02.497032 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:28:02.497042 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 17 17:28:02.497051 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 17 17:28:02.497061 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 17 17:28:02.497070 systemd[1]: Mounting media.mount - External Media Directory...
Mar 17 17:28:02.497079 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 17 17:28:02.497090 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 17 17:28:02.497100 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 17 17:28:02.497110 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 17 17:28:02.497120 systemd[1]: Reached target machines.target - Containers.
Mar 17 17:28:02.497129 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 17 17:28:02.497139 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:28:02.497148 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 17 17:28:02.497157 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 17 17:28:02.497168 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:28:02.497177 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:28:02.497187 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:28:02.497196 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 17 17:28:02.497205 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:28:02.497215 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 17 17:28:02.497224 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 17 17:28:02.497244 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 17 17:28:02.497254 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 17 17:28:02.497265 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 17 17:28:02.497274 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 17 17:28:02.497284 kernel: fuse: init (API version 7.39)
Mar 17 17:28:02.497293 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 17 17:28:02.497302 kernel: loop: module loaded
Mar 17 17:28:02.497311 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 17 17:28:02.497321 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 17 17:28:02.497330 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:28:02.497339 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 17 17:28:02.497350 systemd[1]: Stopped verity-setup.service.
Mar 17 17:28:02.497360 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 17 17:28:02.497383 systemd-journald[1252]: Collecting audit messages is disabled.
Mar 17 17:28:02.497403 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 17 17:28:02.497414 systemd-journald[1252]: Journal started
Mar 17 17:28:02.497434 systemd-journald[1252]: Runtime Journal (/run/log/journal/ed2f9639c55e4cb8a801d1bb62dced70) is 8.0M, max 78.5M, 70.5M free.
Mar 17 17:28:01.341609 systemd[1]: Queued start job for default target multi-user.target.
Mar 17 17:28:01.483123 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 17 17:28:01.483491 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 17 17:28:01.483803 systemd[1]: systemd-journald.service: Consumed 3.276s CPU time.
Mar 17 17:28:02.520932 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 17 17:28:02.521749 systemd[1]: Mounted media.mount - External Media Directory.
Mar 17 17:28:02.527383 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 17 17:28:02.533904 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 17 17:28:02.540356 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 17 17:28:02.550020 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 17 17:28:02.551247 kernel: ACPI: bus type drm_connector registered
Mar 17 17:28:02.557785 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:28:02.565470 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 17 17:28:02.565606 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 17 17:28:02.572630 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:28:02.572757 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:28:02.579685 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:28:02.579818 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:28:02.586058 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:28:02.586182 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:28:02.593613 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 17 17:28:02.593738 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 17 17:28:02.600122 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:28:02.600258 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:28:02.607016 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:28:02.613831 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 17 17:28:02.621284 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 17 17:28:02.628779 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:28:02.646196 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 17 17:28:02.658309 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 17 17:28:02.665467 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 17 17:28:02.671965 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 17 17:28:02.672002 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:28:02.678689 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 17 17:28:02.686933 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 17 17:28:02.694495 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 17 17:28:02.700339 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:28:02.741398 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 17 17:28:02.748691 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 17 17:28:02.755335 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:28:02.756312 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 17 17:28:02.763413 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:28:02.765288 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 17 17:28:02.785400 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 17 17:28:02.795728 systemd-journald[1252]: Time spent on flushing to /var/log/journal/ed2f9639c55e4cb8a801d1bb62dced70 is 12.231ms for 894 entries.
Mar 17 17:28:02.795728 systemd-journald[1252]: System Journal (/var/log/journal/ed2f9639c55e4cb8a801d1bb62dced70) is 8.0M, max 2.6G, 2.6G free.
Mar 17 17:28:03.916847 systemd-journald[1252]: Received client request to flush runtime journal.
Mar 17 17:28:03.916914 kernel: loop0: detected capacity change from 0 to 28720
Mar 17 17:28:02.803977 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 17 17:28:02.812414 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 17 17:28:02.823423 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 17 17:28:02.833773 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 17 17:28:02.842249 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 17 17:28:02.852956 udevadm[1293]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 17 17:28:02.877550 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:28:02.911595 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 17 17:28:02.918534 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 17 17:28:02.938509 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 17 17:28:03.328977 systemd-tmpfiles[1292]: ACLs are not supported, ignoring.
Mar 17 17:28:03.328989 systemd-tmpfiles[1292]: ACLs are not supported, ignoring.
Mar 17 17:28:03.332574 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 17 17:28:03.347387 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 17 17:28:03.917957 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 17 17:28:05.205773 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 17 17:28:05.206398 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 17 17:28:05.335062 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 17 17:28:05.352284 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 17 17:28:05.356437 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 17 17:28:05.373115 systemd-tmpfiles[1311]: ACLs are not supported, ignoring.
Mar 17 17:28:05.373137 systemd-tmpfiles[1311]: ACLs are not supported, ignoring.
Mar 17 17:28:05.377041 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:28:05.525248 kernel: loop1: detected capacity change from 0 to 116808
Mar 17 17:28:06.505675 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 17 17:28:06.517431 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:28:06.532289 kernel: loop2: detected capacity change from 0 to 113536
Mar 17 17:28:06.544746 systemd-udevd[1317]: Using default interface naming scheme 'v255'.
Mar 17 17:28:07.221282 kernel: loop3: detected capacity change from 0 to 194096
Mar 17 17:28:07.255257 kernel: loop4: detected capacity change from 0 to 28720
Mar 17 17:28:07.265279 kernel: loop5: detected capacity change from 0 to 116808
Mar 17 17:28:07.276246 kernel: loop6: detected capacity change from 0 to 113536
Mar 17 17:28:07.286248 kernel: loop7: detected capacity change from 0 to 194096
Mar 17 17:28:07.291140 (sd-merge)[1320]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Mar 17 17:28:07.291599 (sd-merge)[1320]: Merged extensions into '/usr'.
Mar 17 17:28:07.294762 systemd[1]: Reloading requested from client PID 1290 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 17 17:28:07.294866 systemd[1]: Reloading...
Mar 17 17:28:07.344258 zram_generator::config[1347]: No configuration found.
Mar 17 17:28:07.477433 kernel: mousedev: PS/2 mouse device common for all mice
Mar 17 17:28:07.539250 kernel: hv_vmbus: registering driver hv_balloon
Mar 17 17:28:07.539334 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 17 17:28:07.549681 kernel: hv_balloon: Memory hot add disabled on ARM64
Mar 17 17:28:07.558105 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:28:07.565909 kernel: hv_vmbus: registering driver hyperv_fb
Mar 17 17:28:07.570289 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 17 17:28:07.577124 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 17 17:28:07.582810 kernel: Console: switching to colour dummy device 80x25
Mar 17 17:28:07.590765 kernel: Console: switching to colour frame buffer device 128x48
Mar 17 17:28:07.624788 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 17 17:28:07.625180 systemd[1]: Reloading finished in 329 ms.
Mar 17 17:28:07.649203 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:28:07.658255 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 17 17:28:07.680410 systemd[1]: Starting ensure-sysext.service...
Mar 17 17:28:07.688399 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:28:07.696443 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 17 17:28:07.711998 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:28:07.728959 systemd[1]: Reloading requested from client PID 1444 ('systemctl') (unit ensure-sysext.service)...
Mar 17 17:28:07.729068 systemd[1]: Reloading...
Mar 17 17:28:07.760285 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1390)
Mar 17 17:28:07.772081 systemd-tmpfiles[1446]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 17 17:28:07.772376 systemd-tmpfiles[1446]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 17 17:28:07.773030 systemd-tmpfiles[1446]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 17 17:28:07.777386 systemd-tmpfiles[1446]: ACLs are not supported, ignoring.
Mar 17 17:28:07.777447 systemd-tmpfiles[1446]: ACLs are not supported, ignoring.
Mar 17 17:28:07.782951 systemd-tmpfiles[1446]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:28:07.782966 systemd-tmpfiles[1446]: Skipping /boot
Mar 17 17:28:07.795738 systemd-tmpfiles[1446]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:28:07.795753 systemd-tmpfiles[1446]: Skipping /boot
Mar 17 17:28:07.852257 zram_generator::config[1531]: No configuration found.
Mar 17 17:28:07.969564 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:28:08.044145 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 17 17:28:08.051590 systemd[1]: Reloading finished in 322 ms.
Mar 17 17:28:08.071853 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:28:08.101893 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 17 17:28:08.123244 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:28:08.129749 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 17 17:28:08.136677 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:28:08.138554 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 17 17:28:08.147401 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:28:08.159002 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:28:08.166746 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:28:08.172611 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:28:08.174548 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 17 17:28:08.183182 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 17 17:28:08.198498 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 17 17:28:08.208432 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 17 17:28:08.218399 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 17 17:28:08.226334 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:28:08.226895 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:28:08.237146 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:28:08.237452 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:28:08.245197 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:28:08.245506 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:28:08.258429 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:28:08.263455 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:28:08.270748 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:28:08.279063 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:28:08.290741 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:28:08.296654 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:28:08.296848 systemd[1]: Reached target time-set.target - System Time Set.
Mar 17 17:28:08.303763 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:28:08.305270 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:28:08.312483 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:28:08.312614 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:28:08.318986 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:28:08.319109 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:28:08.329782 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:28:08.329954 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:28:08.330861 lvm[1593]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 17 17:28:08.337659 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 17 17:28:08.346779 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:28:08.346976 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:28:08.355936 systemd[1]: Finished ensure-sysext.service.
Mar 17 17:28:08.363454 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 17 17:28:08.377943 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:28:08.394599 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 17 17:28:08.405264 lvm[1628]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 17 17:28:08.405547 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:28:08.405711 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:28:08.414532 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:28:08.424726 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 17 17:28:08.434292 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 17 17:28:08.446291 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 17 17:28:08.458914 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 17 17:28:08.538546 systemd-resolved[1605]: Positive Trust Anchors:
Mar 17 17:28:08.538854 systemd-resolved[1605]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 17:28:08.538933 systemd-resolved[1605]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 17:28:08.542153 systemd-resolved[1605]: Using system hostname 'ci-4152.2.2-a-6c46d54d7c'.
Mar 17 17:28:08.543683 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 17:28:08.551002 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:28:08.581840 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:28:08.643713 systemd-networkd[1445]: lo: Link UP
Mar 17 17:28:08.643723 systemd-networkd[1445]: lo: Gained carrier
Mar 17 17:28:08.645679 systemd-networkd[1445]: Enumeration completed
Mar 17 17:28:08.645778 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:28:08.646194 systemd-networkd[1445]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:28:08.646199 systemd-networkd[1445]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:28:08.652704 systemd[1]: Reached target network.target - Network.
Mar 17 17:28:08.662385 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 17 17:28:08.708254 kernel: mlx5_core f5b5:00:02.0 enP62901s1: Link up
Mar 17 17:28:08.734307 kernel: hv_netvsc 00224878-0b63-0022-4878-0b6300224878 eth0: Data path switched to VF: enP62901s1
Mar 17 17:28:08.735465 systemd-networkd[1445]: enP62901s1: Link UP
Mar 17 17:28:08.735564 systemd-networkd[1445]: eth0: Link UP
Mar 17 17:28:08.735568 systemd-networkd[1445]: eth0: Gained carrier
Mar 17 17:28:08.735580 systemd-networkd[1445]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:28:08.742619 systemd-networkd[1445]: enP62901s1: Gained carrier
Mar 17 17:28:08.747272 augenrules[1655]: No rules
Mar 17 17:28:08.748297 systemd-networkd[1445]: eth0: DHCPv4 address 10.200.20.22/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 17 17:28:08.749351 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:28:08.749547 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:28:09.064716 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 17 17:28:09.072557 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 17 17:28:09.835413 systemd-networkd[1445]: eth0: Gained IPv6LL
Mar 17 17:28:09.837671 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 17 17:28:09.845319 systemd[1]: Reached target network-online.target - Network is Online.
Mar 17 17:28:10.219319 systemd-networkd[1445]: enP62901s1: Gained IPv6LL
Mar 17 17:28:12.105979 ldconfig[1285]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 17 17:28:12.130854 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 17 17:28:12.143445 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 17 17:28:12.152392 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 17 17:28:12.159843 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 17:28:12.165970 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 17 17:28:12.172972 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 17 17:28:12.180372 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 17 17:28:12.186805 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 17 17:28:12.193937 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 17 17:28:12.201302 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 17 17:28:12.201335 systemd[1]: Reached target paths.target - Path Units.
Mar 17 17:28:12.206436 systemd[1]: Reached target timers.target - Timer Units.
Mar 17 17:28:12.228074 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 17 17:28:12.235874 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 17 17:28:12.245797 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 17 17:28:12.252114 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 17 17:28:12.258257 systemd[1]: Reached target sockets.target - Socket Units.
Mar 17 17:28:12.263528 systemd[1]: Reached target basic.target - Basic System.
Mar 17 17:28:12.268915 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 17 17:28:12.268945 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 17 17:28:12.271131 systemd[1]: Starting chronyd.service - NTP client/server...
Mar 17 17:28:12.278417 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 17 17:28:12.287402 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 17 17:28:12.306320 (chronyd)[1668]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
Mar 17 17:28:12.306392 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 17 17:28:12.312499 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 17 17:28:12.321446 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 17 17:28:12.327214 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 17 17:28:12.327351 systemd[1]: hv_fcopy_daemon.service - Hyper-V FCOPY daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_fcopy).
Mar 17 17:28:12.330423 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Mar 17 17:28:12.340674 KVP[1677]: KVP starting; pid is:1677
Mar 17 17:28:12.342270 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Mar 17 17:28:12.344362 jq[1675]: false
Mar 17 17:28:12.344643 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:28:12.353189 chronyd[1681]: chronyd version 4.6 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
Mar 17 17:28:12.357438 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 17 17:28:12.367408 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 17 17:28:12.379456 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found loop4
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found loop5
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found loop6
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found loop7
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found sda
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found sda1
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found sda2
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found sda3
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found usr
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found sda4
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found sda6
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found sda7
Mar 17 17:28:12.395145 extend-filesystems[1676]: Found sda9
Mar 17 17:28:12.395145 extend-filesystems[1676]: Checking size of /dev/sda9
Mar 17 17:28:12.529206 kernel: hv_utils: KVP IC version 4.0
Mar 17 17:28:12.388394 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 17 17:28:12.422442 chronyd[1681]: Timezone right/UTC failed leap second check, ignoring
Mar 17 17:28:12.401205 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 17 17:28:12.437116 KVP[1677]: KVP LIC Version: 3.1
Mar 17 17:28:12.427371 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 17 17:28:12.440772 chronyd[1681]: Loaded seccomp filter (level 2)
Mar 17 17:28:12.441045 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 17 17:28:12.441514 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 17 17:28:12.442136 systemd[1]: Starting update-engine.service - Update Engine...
Mar 17 17:28:12.530213 jq[1702]: true
Mar 17 17:28:12.466080 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 17 17:28:12.477402 systemd[1]: Started chronyd.service - NTP client/server.
Mar 17 17:28:12.487606 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 17 17:28:12.488362 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 17 17:28:12.501509 systemd[1]: motdgen.service: Deactivated successfully.
Mar 17 17:28:12.501668 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 17 17:28:12.508645 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 17 17:28:12.515683 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 17 17:28:12.518298 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 17 17:28:12.532646 extend-filesystems[1676]: Old size kept for /dev/sda9
Mar 17 17:28:12.532646 extend-filesystems[1676]: Found sr0
Mar 17 17:28:12.546803 dbus-daemon[1674]: [system] SELinux support is enabled
Mar 17 17:28:12.538553 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 17 17:28:12.543154 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 17 17:28:12.561389 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 17 17:28:12.595114 (ntainerd)[1721]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 17 17:28:12.596170 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 17 17:28:12.596208 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 17 17:28:12.606612 jq[1718]: true
Mar 17 17:28:12.607085 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 17 17:28:12.607110 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 17 17:28:12.615196 systemd-logind[1696]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Mar 17 17:28:12.617034 systemd-logind[1696]: New seat seat0.
Mar 17 17:28:12.621983 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 17 17:28:12.652967 tar[1708]: linux-arm64/helm
Mar 17 17:28:12.657270 update_engine[1699]: I20250317 17:28:12.656371 1699 main.cc:92] Flatcar Update Engine starting
Mar 17 17:28:12.663243 systemd[1]: Started update-engine.service - Update Engine.
Mar 17 17:28:12.663504 update_engine[1699]: I20250317 17:28:12.663290 1699 update_check_scheduler.cc:74] Next update check in 11m31s
Mar 17 17:28:12.680990 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 17 17:28:12.745865 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1738)
Mar 17 17:28:12.746489 coreos-metadata[1670]: Mar 17 17:28:12.746 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 17 17:28:12.749314 coreos-metadata[1670]: Mar 17 17:28:12.749 INFO Fetch successful
Mar 17 17:28:12.749469 coreos-metadata[1670]: Mar 17 17:28:12.749 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Mar 17 17:28:12.753283 coreos-metadata[1670]: Mar 17 17:28:12.753 INFO Fetch successful
Mar 17 17:28:12.753958 coreos-metadata[1670]: Mar 17 17:28:12.753 INFO Fetching http://168.63.129.16/machine/cc04e4fe-8a31-4d76-a414-93dd17d8bdda/0ae79945%2D3976%2D4c95%2D9b87%2D8e4c7952db45.%5Fci%2D4152.2.2%2Da%2D6c46d54d7c?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Mar 17 17:28:12.756347 coreos-metadata[1670]: Mar 17 17:28:12.756 INFO Fetch successful
Mar 17 17:28:12.756855 coreos-metadata[1670]: Mar 17 17:28:12.756 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Mar 17 17:28:12.771040 coreos-metadata[1670]: Mar 17 17:28:12.768 INFO Fetch successful
Mar 17 17:28:12.818342 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 17 17:28:12.829854 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 17 17:28:13.412377 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:28:13.419065 (kubelet)[1820]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:28:13.487251 tar[1708]: linux-arm64/LICENSE
Mar 17 17:28:13.487251 tar[1708]: linux-arm64/README.md
Mar 17 17:28:13.499546 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 17 17:28:13.666281 bash[1764]: Updated "/home/core/.ssh/authorized_keys"
Mar 17 17:28:13.667599 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 17 17:28:13.679269 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 17 17:28:13.696596 locksmithd[1761]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 17 17:28:13.859645 kubelet[1820]: E0317 17:28:13.859591 1820 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:28:13.865750 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:28:13.865879 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:28:13.892288 containerd[1721]: time="2025-03-17T17:28:13.892194560Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Mar 17 17:28:13.921373 containerd[1721]: time="2025-03-17T17:28:13.921275680Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:28:13.922808 containerd[1721]: time="2025-03-17T17:28:13.922772480Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.83-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:28:13.922808 containerd[1721]: time="2025-03-17T17:28:13.922806920Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Mar 17 17:28:13.922879 containerd[1721]: time="2025-03-17T17:28:13.922825320Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Mar 17 17:28:13.923003 containerd[1721]: time="2025-03-17T17:28:13.922980960Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Mar 17 17:28:13.923040 containerd[1721]: time="2025-03-17T17:28:13.923003960Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923082 containerd[1721]: time="2025-03-17T17:28:13.923063640Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923106 containerd[1721]: time="2025-03-17T17:28:13.923081280Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923272 containerd[1721]: time="2025-03-17T17:28:13.923252000Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923298 containerd[1721]: time="2025-03-17T17:28:13.923271200Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923298 containerd[1721]: time="2025-03-17T17:28:13.923284880Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923298 containerd[1721]: time="2025-03-17T17:28:13.923294760Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923935 containerd[1721]: time="2025-03-17T17:28:13.923364040Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923935 containerd[1721]: time="2025-03-17T17:28:13.923557080Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923935 containerd[1721]: time="2025-03-17T17:28:13.923649600Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:28:13.923935 containerd[1721]: time="2025-03-17T17:28:13.923663520Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Mar 17 17:28:13.923935 containerd[1721]: time="2025-03-17T17:28:13.923741000Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Mar 17 17:28:13.923935 containerd[1721]: time="2025-03-17T17:28:13.923780200Z" level=info msg="metadata content store policy set" policy=shared
Mar 17 17:28:14.009969 containerd[1721]: time="2025-03-17T17:28:14.009904520Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Mar 17 17:28:14.010294 containerd[1721]: time="2025-03-17T17:28:14.010117440Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Mar 17 17:28:14.010294 containerd[1721]: time="2025-03-17T17:28:14.010160560Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Mar 17 17:28:14.010294 containerd[1721]: time="2025-03-17T17:28:14.010177600Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Mar 17 17:28:14.010294 containerd[1721]: time="2025-03-17T17:28:14.010192040Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Mar 17 17:28:14.010595 containerd[1721]: time="2025-03-17T17:28:14.010525520Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011044640Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011206960Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011263960Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011287120Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011306360Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011323440Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011339120Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011517040Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011537440Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011554160Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011570720Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011585680Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011610600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012245 containerd[1721]: time="2025-03-17T17:28:14.011630200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011646360Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011663480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011678840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011702000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011715000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011730640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011747560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011767400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011783040Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011797640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011812720Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011831440Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011856560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011874120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012552 containerd[1721]: time="2025-03-17T17:28:14.011888480Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Mar 17 17:28:14.012790 containerd[1721]: time="2025-03-17T17:28:14.011996440Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Mar 17 17:28:14.012790 containerd[1721]: time="2025-03-17T17:28:14.012019960Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Mar 17 17:28:14.012790 containerd[1721]: time="2025-03-17T17:28:14.012033640Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Mar 17 17:28:14.012790 containerd[1721]: time="2025-03-17T17:28:14.012050160Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Mar 17 17:28:14.012790 containerd[1721]: time="2025-03-17T17:28:14.012095640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.012790 containerd[1721]: time="2025-03-17T17:28:14.012112600Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Mar 17 17:28:14.012790 containerd[1721]: time="2025-03-17T17:28:14.012129160Z" level=info msg="NRI interface is disabled by configuration."
Mar 17 17:28:14.012790 containerd[1721]: time="2025-03-17T17:28:14.012142600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Mar 17 17:28:14.014713 containerd[1721]: time="2025-03-17T17:28:14.014572440Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 17:28:14.014903 containerd[1721]: time="2025-03-17T17:28:14.014887120Z" level=info msg="Connect containerd service" Mar 17 17:28:14.015204 containerd[1721]: time="2025-03-17T17:28:14.015183000Z" level=info msg="using legacy CRI server" Mar 17 17:28:14.015285 containerd[1721]: time="2025-03-17T17:28:14.015271080Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 17 17:28:14.015481 containerd[1721]: time="2025-03-17T17:28:14.015463160Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 17:28:14.016769 containerd[1721]: time="2025-03-17T17:28:14.016736120Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 17:28:14.017223 containerd[1721]: time="2025-03-17T17:28:14.017109600Z" level=info msg="Start subscribing containerd event" Mar 17 17:28:14.017460 containerd[1721]: time="2025-03-17T17:28:14.017199760Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 17:28:14.017636 containerd[1721]: time="2025-03-17T17:28:14.017447200Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 17 17:28:14.019399 containerd[1721]: time="2025-03-17T17:28:14.019286400Z" level=info msg="Start recovering state" Mar 17 17:28:14.019399 containerd[1721]: time="2025-03-17T17:28:14.019380760Z" level=info msg="Start event monitor" Mar 17 17:28:14.019561 containerd[1721]: time="2025-03-17T17:28:14.019490480Z" level=info msg="Start snapshots syncer" Mar 17 17:28:14.019561 containerd[1721]: time="2025-03-17T17:28:14.019506640Z" level=info msg="Start cni network conf syncer for default" Mar 17 17:28:14.019561 containerd[1721]: time="2025-03-17T17:28:14.019514680Z" level=info msg="Start streaming server" Mar 17 17:28:14.019797 containerd[1721]: time="2025-03-17T17:28:14.019684680Z" level=info msg="containerd successfully booted in 0.128328s" Mar 17 17:28:14.019761 systemd[1]: Started containerd.service - containerd container runtime. Mar 17 17:28:14.423183 sshd_keygen[1704]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 17:28:14.442280 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 17 17:28:14.458456 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 17 17:28:14.464718 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Mar 17 17:28:14.470889 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 17:28:14.471043 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 17 17:28:14.481551 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 17 17:28:14.489394 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Mar 17 17:28:14.503256 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 17 17:28:14.518531 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 17 17:28:14.525499 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 17 17:28:14.532596 systemd[1]: Reached target getty.target - Login Prompts. 
Mar 17 17:28:14.539118 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 17 17:28:14.547057 systemd[1]: Startup finished in 666ms (kernel) + 16.643s (initrd) + 16.040s (userspace) = 33.350s.
Mar 17 17:28:14.788702 login[1867]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying
Mar 17 17:28:14.789320 login[1866]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:28:14.798284 systemd-logind[1696]: New session 2 of user core.
Mar 17 17:28:14.799637 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 17 17:28:14.809667 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 17 17:28:14.820296 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 17 17:28:14.829485 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 17 17:28:14.831719 (systemd)[1874]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 17 17:28:14.991148 systemd[1874]: Queued start job for default target default.target.
Mar 17 17:28:15.002335 systemd[1874]: Created slice app.slice - User Application Slice.
Mar 17 17:28:15.002552 systemd[1874]: Reached target paths.target - Paths.
Mar 17 17:28:15.002626 systemd[1874]: Reached target timers.target - Timers.
Mar 17 17:28:15.003833 systemd[1874]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 17 17:28:15.013094 systemd[1874]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 17 17:28:15.013142 systemd[1874]: Reached target sockets.target - Sockets.
Mar 17 17:28:15.013154 systemd[1874]: Reached target basic.target - Basic System.
Mar 17 17:28:15.013192 systemd[1874]: Reached target default.target - Main User Target.
Mar 17 17:28:15.013215 systemd[1874]: Startup finished in 175ms.
Mar 17 17:28:15.013494 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 17 17:28:15.014786 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 17 17:28:15.790162 login[1867]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:28:15.794721 systemd-logind[1696]: New session 1 of user core.
Mar 17 17:28:15.806388 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 17 17:28:16.390509 waagent[1863]: 2025-03-17T17:28:16.390429Z INFO Daemon Daemon Azure Linux Agent Version: 2.9.1.1
Mar 17 17:28:16.397082 waagent[1863]: 2025-03-17T17:28:16.397024Z INFO Daemon Daemon OS: flatcar 4152.2.2
Mar 17 17:28:16.402057 waagent[1863]: 2025-03-17T17:28:16.402008Z INFO Daemon Daemon Python: 3.11.10
Mar 17 17:28:16.406970 waagent[1863]: 2025-03-17T17:28:16.406906Z INFO Daemon Daemon Run daemon
Mar 17 17:28:16.411608 waagent[1863]: 2025-03-17T17:28:16.411564Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4152.2.2'
Mar 17 17:28:16.420911 waagent[1863]: 2025-03-17T17:28:16.420865Z INFO Daemon Daemon Using waagent for provisioning
Mar 17 17:28:16.426597 waagent[1863]: 2025-03-17T17:28:16.426558Z INFO Daemon Daemon Activate resource disk
Mar 17 17:28:16.431434 waagent[1863]: 2025-03-17T17:28:16.431390Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Mar 17 17:28:16.444055 waagent[1863]: 2025-03-17T17:28:16.444010Z INFO Daemon Daemon Found device: None
Mar 17 17:28:16.448702 waagent[1863]: 2025-03-17T17:28:16.448658Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Mar 17 17:28:16.457580 waagent[1863]: 2025-03-17T17:28:16.457538Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Mar 17 17:28:16.469801 waagent[1863]: 2025-03-17T17:28:16.469758Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Mar 17 17:28:16.479249 waagent[1863]: 2025-03-17T17:28:16.476295Z INFO Daemon Daemon Running default provisioning handler
Mar 17 17:28:16.488347 waagent[1863]: 2025-03-17T17:28:16.488281Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Mar 17 17:28:16.503122 waagent[1863]: 2025-03-17T17:28:16.503064Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Mar 17 17:28:16.513884 waagent[1863]: 2025-03-17T17:28:16.513829Z INFO Daemon Daemon cloud-init is enabled: False
Mar 17 17:28:16.519384 waagent[1863]: 2025-03-17T17:28:16.519342Z INFO Daemon Daemon Copying ovf-env.xml
Mar 17 17:28:16.807301 waagent[1863]: 2025-03-17T17:28:16.803624Z INFO Daemon Daemon Successfully mounted dvd
Mar 17 17:28:16.837938 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Mar 17 17:28:16.839484 waagent[1863]: 2025-03-17T17:28:16.839404Z INFO Daemon Daemon Detect protocol endpoint
Mar 17 17:28:16.844775 waagent[1863]: 2025-03-17T17:28:16.844726Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Mar 17 17:28:16.851079 waagent[1863]: 2025-03-17T17:28:16.851029Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Mar 17 17:28:16.858343 waagent[1863]: 2025-03-17T17:28:16.858292Z INFO Daemon Daemon Test for route to 168.63.129.16
Mar 17 17:28:16.864074 waagent[1863]: 2025-03-17T17:28:16.864021Z INFO Daemon Daemon Route to 168.63.129.16 exists
Mar 17 17:28:16.870008 waagent[1863]: 2025-03-17T17:28:16.869958Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Mar 17 17:28:16.906403 waagent[1863]: 2025-03-17T17:28:16.906356Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Mar 17 17:28:16.913688 waagent[1863]: 2025-03-17T17:28:16.913650Z INFO Daemon Daemon Wire protocol version:2012-11-30
Mar 17 17:28:16.919712 waagent[1863]: 2025-03-17T17:28:16.919667Z INFO Daemon Daemon Server preferred version:2015-04-05
Mar 17 17:28:17.171318 waagent[1863]: 2025-03-17T17:28:17.170550Z INFO Daemon Daemon Initializing goal state during protocol detection
Mar 17 17:28:17.177778 waagent[1863]: 2025-03-17T17:28:17.177708Z INFO Daemon Daemon Forcing an update of the goal state.
Mar 17 17:28:17.187184 waagent[1863]: 2025-03-17T17:28:17.187132Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Mar 17 17:28:17.232484 waagent[1863]: 2025-03-17T17:28:17.232435Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164
Mar 17 17:28:17.238955 waagent[1863]: 2025-03-17T17:28:17.238899Z INFO Daemon
Mar 17 17:28:17.242007 waagent[1863]: 2025-03-17T17:28:17.241959Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 8b419b24-9e8a-492f-b1a8-c4f36d3ec09a eTag: 15234587566479947677 source: Fabric]
Mar 17 17:28:17.254665 waagent[1863]: 2025-03-17T17:28:17.254615Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Mar 17 17:28:17.262074 waagent[1863]: 2025-03-17T17:28:17.262018Z INFO Daemon
Mar 17 17:28:17.265524 waagent[1863]: 2025-03-17T17:28:17.265473Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Mar 17 17:28:17.277527 waagent[1863]: 2025-03-17T17:28:17.277488Z INFO Daemon Daemon Downloading artifacts profile blob
Mar 17 17:28:17.376379 waagent[1863]: 2025-03-17T17:28:17.376291Z INFO Daemon Downloaded certificate {'thumbprint': '99F181BF8F98C95F2096B79358B60C31829B0D8D', 'hasPrivateKey': True}
Mar 17 17:28:17.387351 waagent[1863]: 2025-03-17T17:28:17.387298Z INFO Daemon Downloaded certificate {'thumbprint': '77684ECED44B9790E8F413F1A232B0AF51736A0F', 'hasPrivateKey': False}
Mar 17 17:28:17.398013 waagent[1863]: 2025-03-17T17:28:17.397954Z INFO Daemon Fetch goal state completed
Mar 17 17:28:17.409508 waagent[1863]: 2025-03-17T17:28:17.409446Z INFO Daemon Daemon Starting provisioning
Mar 17 17:28:17.415103 waagent[1863]: 2025-03-17T17:28:17.415041Z INFO Daemon Daemon Handle ovf-env.xml.
Mar 17 17:28:17.420114 waagent[1863]: 2025-03-17T17:28:17.420060Z INFO Daemon Daemon Set hostname [ci-4152.2.2-a-6c46d54d7c]
Mar 17 17:28:17.445250 waagent[1863]: 2025-03-17T17:28:17.443049Z INFO Daemon Daemon Publish hostname [ci-4152.2.2-a-6c46d54d7c]
Mar 17 17:28:17.450330 waagent[1863]: 2025-03-17T17:28:17.450270Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Mar 17 17:28:17.458241 waagent[1863]: 2025-03-17T17:28:17.458181Z INFO Daemon Daemon Primary interface is [eth0]
Mar 17 17:28:17.505932 systemd-networkd[1445]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:28:17.505941 systemd-networkd[1445]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:28:17.505968 systemd-networkd[1445]: eth0: DHCP lease lost
Mar 17 17:28:17.507334 waagent[1863]: 2025-03-17T17:28:17.507009Z INFO Daemon Daemon Create user account if not exists
Mar 17 17:28:17.513117 waagent[1863]: 2025-03-17T17:28:17.513057Z INFO Daemon Daemon User core already exists, skip useradd
Mar 17 17:28:17.520972 waagent[1863]: 2025-03-17T17:28:17.520892Z INFO Daemon Daemon Configure sudoer
Mar 17 17:28:17.521344 systemd-networkd[1445]: eth0: DHCPv6 lease lost
Mar 17 17:28:17.526311 waagent[1863]: 2025-03-17T17:28:17.526240Z INFO Daemon Daemon Configure sshd
Mar 17 17:28:17.531975 waagent[1863]: 2025-03-17T17:28:17.531631Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Mar 17 17:28:17.545721 waagent[1863]: 2025-03-17T17:28:17.545651Z INFO Daemon Daemon Deploy ssh public key.
Mar 17 17:28:17.567281 systemd-networkd[1445]: eth0: DHCPv4 address 10.200.20.22/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 17 17:28:18.729251 waagent[1863]: 2025-03-17T17:28:18.728345Z INFO Daemon Daemon Provisioning complete
Mar 17 17:28:18.747938 waagent[1863]: 2025-03-17T17:28:18.747892Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Mar 17 17:28:18.754387 waagent[1863]: 2025-03-17T17:28:18.754342Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Mar 17 17:28:18.764602 waagent[1863]: 2025-03-17T17:28:18.764557Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.9.1.1 is the most current agent
Mar 17 17:28:18.890043 waagent[1929]: 2025-03-17T17:28:18.889971Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.9.1.1)
Mar 17 17:28:18.890842 waagent[1929]: 2025-03-17T17:28:18.890476Z INFO ExtHandler ExtHandler OS: flatcar 4152.2.2
Mar 17 17:28:18.890842 waagent[1929]: 2025-03-17T17:28:18.890550Z INFO ExtHandler ExtHandler Python: 3.11.10
Mar 17 17:28:18.931259 waagent[1929]: 2025-03-17T17:28:18.929823Z INFO ExtHandler ExtHandler Distro: flatcar-4152.2.2; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.10; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1;
Mar 17 17:28:18.931259 waagent[1929]: 2025-03-17T17:28:18.930054Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 17 17:28:18.931259 waagent[1929]: 2025-03-17T17:28:18.930112Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 17 17:28:18.940740 waagent[1929]: 2025-03-17T17:28:18.940679Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Mar 17 17:28:18.954535 waagent[1929]: 2025-03-17T17:28:18.954489Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164
Mar 17 17:28:18.955166 waagent[1929]: 2025-03-17T17:28:18.955126Z INFO ExtHandler
Mar 17 17:28:18.955334 waagent[1929]: 2025-03-17T17:28:18.955298Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 2e7d2d9c-ce9c-4499-9c12-2b35d22643ff eTag: 15234587566479947677 source: Fabric]
Mar 17 17:28:18.955728 waagent[1929]: 2025-03-17T17:28:18.955690Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Mar 17 17:28:18.956390 waagent[1929]: 2025-03-17T17:28:18.956347Z INFO ExtHandler
Mar 17 17:28:18.956527 waagent[1929]: 2025-03-17T17:28:18.956497Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Mar 17 17:28:18.960498 waagent[1929]: 2025-03-17T17:28:18.960468Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Mar 17 17:28:19.052897 waagent[1929]: 2025-03-17T17:28:19.052757Z INFO ExtHandler Downloaded certificate {'thumbprint': '99F181BF8F98C95F2096B79358B60C31829B0D8D', 'hasPrivateKey': True}
Mar 17 17:28:19.053296 waagent[1929]: 2025-03-17T17:28:19.053248Z INFO ExtHandler Downloaded certificate {'thumbprint': '77684ECED44B9790E8F413F1A232B0AF51736A0F', 'hasPrivateKey': False}
Mar 17 17:28:19.053720 waagent[1929]: 2025-03-17T17:28:19.053675Z INFO ExtHandler Fetch goal state completed
Mar 17 17:28:19.072856 waagent[1929]: 2025-03-17T17:28:19.072798Z INFO ExtHandler ExtHandler WALinuxAgent-2.9.1.1 running as process 1929
Mar 17 17:28:19.073013 waagent[1929]: 2025-03-17T17:28:19.072978Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Mar 17 17:28:19.074645 waagent[1929]: 2025-03-17T17:28:19.074600Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['flatcar', '4152.2.2', '', 'Flatcar Container Linux by Kinvolk']
Mar 17 17:28:19.075020 waagent[1929]: 2025-03-17T17:28:19.074981Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Mar 17 17:28:19.111831 waagent[1929]: 2025-03-17T17:28:19.111785Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Mar 17 17:28:19.112028 waagent[1929]: 2025-03-17T17:28:19.111988Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Mar 17 17:28:19.117820 waagent[1929]: 2025-03-17T17:28:19.117785Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Mar 17 17:28:19.123937 systemd[1]: Reloading requested from client PID 1944 ('systemctl') (unit waagent.service)...
Mar 17 17:28:19.123954 systemd[1]: Reloading...
Mar 17 17:28:19.199254 zram_generator::config[1976]: No configuration found.
Mar 17 17:28:19.301563 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:28:19.381893 systemd[1]: Reloading finished in 257 ms.
Mar 17 17:28:19.404817 waagent[1929]: 2025-03-17T17:28:19.404471Z INFO ExtHandler ExtHandler Executing systemctl daemon-reload for setting up waagent-network-setup.service
Mar 17 17:28:19.410450 systemd[1]: Reloading requested from client PID 2032 ('systemctl') (unit waagent.service)...
Mar 17 17:28:19.410464 systemd[1]: Reloading...
Mar 17 17:28:19.481312 zram_generator::config[2066]: No configuration found.
Mar 17 17:28:19.591038 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:28:19.670952 systemd[1]: Reloading finished in 260 ms.
Mar 17 17:28:19.693294 waagent[1929]: 2025-03-17T17:28:19.692641Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Mar 17 17:28:19.693294 waagent[1929]: 2025-03-17T17:28:19.692814Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Mar 17 17:28:20.003009 waagent[1929]: 2025-03-17T17:28:20.002887Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Mar 17 17:28:20.003945 waagent[1929]: 2025-03-17T17:28:20.003899Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: configuration enabled [True], cgroups enabled [False], python supported: [True]
Mar 17 17:28:20.004825 waagent[1929]: 2025-03-17T17:28:20.004771Z INFO ExtHandler ExtHandler Starting env monitor service.
Mar 17 17:28:20.004976 waagent[1929]: 2025-03-17T17:28:20.004916Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 17 17:28:20.005169 waagent[1929]: 2025-03-17T17:28:20.005105Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 17 17:28:20.005411 waagent[1929]: 2025-03-17T17:28:20.005359Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
Mar 17 17:28:20.005813 waagent[1929]: 2025-03-17T17:28:20.005746Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Mar 17 17:28:20.005948 waagent[1929]: 2025-03-17T17:28:20.005885Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 17 17:28:20.006299 waagent[1929]: 2025-03-17T17:28:20.006222Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
Mar 17 17:28:20.006476 waagent[1929]: 2025-03-17T17:28:20.006433Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 17 17:28:20.006636 waagent[1929]: 2025-03-17T17:28:20.006593Z INFO EnvHandler ExtHandler Configure routes
Mar 17 17:28:20.006699 waagent[1929]: 2025-03-17T17:28:20.006667Z INFO EnvHandler ExtHandler Gateway:None
Mar 17 17:28:20.006744 waagent[1929]: 2025-03-17T17:28:20.006720Z INFO EnvHandler ExtHandler Routes:None
Mar 17 17:28:20.007576 waagent[1929]: 2025-03-17T17:28:20.007519Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Mar 17 17:28:20.007576 waagent[1929]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
Mar 17 17:28:20.007576 waagent[1929]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0
Mar 17 17:28:20.007576 waagent[1929]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
Mar 17 17:28:20.007576 waagent[1929]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
Mar 17 17:28:20.007576 waagent[1929]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Mar 17 17:28:20.007576 waagent[1929]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Mar 17 17:28:20.009277 waagent[1929]: 2025-03-17T17:28:20.007327Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
Mar 17 17:28:20.009277 waagent[1929]: 2025-03-17T17:28:20.008127Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
Mar 17 17:28:20.009277 waagent[1929]: 2025-03-17T17:28:20.008079Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
Mar 17 17:28:20.009277 waagent[1929]: 2025-03-17T17:28:20.008641Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Mar 17 17:28:20.016537 waagent[1929]: 2025-03-17T17:28:20.016489Z INFO ExtHandler ExtHandler
Mar 17 17:28:20.016730 waagent[1929]: 2025-03-17T17:28:20.016686Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: f0e06bdc-ce32-454c-877f-84133817e6cc correlation 89cb9120-e583-4329-9a21-dffaf33d2c8e created: 2025-03-17T17:26:52.964317Z]
Mar 17 17:28:20.017171 waagent[1929]: 2025-03-17T17:28:20.017123Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
Mar 17 17:28:20.018494 waagent[1929]: 2025-03-17T17:28:20.018449Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms]
Mar 17 17:28:20.071490 waagent[1929]: 2025-03-17T17:28:20.071437Z INFO EnvHandler ExtHandler Successfully added Azure fabric firewall rules. Current Firewall rules:
Mar 17 17:28:20.071490 waagent[1929]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:28:20.071490 waagent[1929]: pkts bytes target prot opt in out source destination
Mar 17 17:28:20.071490 waagent[1929]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:28:20.071490 waagent[1929]: pkts bytes target prot opt in out source destination
Mar 17 17:28:20.071490 waagent[1929]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:28:20.071490 waagent[1929]: pkts bytes target prot opt in out source destination
Mar 17 17:28:20.071490 waagent[1929]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Mar 17 17:28:20.071490 waagent[1929]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Mar 17 17:28:20.071490 waagent[1929]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Mar 17 17:28:20.072746 waagent[1929]: 2025-03-17T17:28:20.072685Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.9.1.1 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 81D3A1CC-3085-42A0-AF7C-0FC8554FACF9;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 0]
Mar 17 17:28:20.074966 waagent[1929]: 2025-03-17T17:28:20.074906Z INFO EnvHandler ExtHandler Current Firewall rules:
Mar 17 17:28:20.074966 waagent[1929]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:28:20.074966 waagent[1929]: pkts bytes target prot opt in out source destination
Mar 17 17:28:20.074966 waagent[1929]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:28:20.074966 waagent[1929]: pkts bytes target prot opt in out source destination
Mar 17 17:28:20.074966 waagent[1929]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 17 17:28:20.074966 waagent[1929]: pkts bytes target prot opt in out source destination
Mar 17 17:28:20.074966 waagent[1929]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Mar 17 17:28:20.074966 waagent[1929]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Mar 17 17:28:20.074966 waagent[1929]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Mar 17 17:28:20.075501 waagent[1929]: 2025-03-17T17:28:20.075468Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
Mar 17 17:28:20.095266 waagent[1929]: 2025-03-17T17:28:20.095171Z INFO MonitorHandler ExtHandler Network interfaces:
Mar 17 17:28:20.095266 waagent[1929]: Executing ['ip', '-a', '-o', 'link']:
Mar 17 17:28:20.095266 waagent[1929]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Mar 17 17:28:20.095266 waagent[1929]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:78:0b:63 brd ff:ff:ff:ff:ff:ff
Mar 17 17:28:20.095266 waagent[1929]: 3: enP62901s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:78:0b:63 brd ff:ff:ff:ff:ff:ff\ altname enP62901p0s2
Mar 17 17:28:20.095266 waagent[1929]: Executing ['ip', '-4', '-a', '-o', 'address']:
Mar 17 17:28:20.095266 waagent[1929]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
Mar 17 17:28:20.095266 waagent[1929]: 2: eth0 inet 10.200.20.22/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever
Mar 17 17:28:20.095266 waagent[1929]: Executing ['ip', '-6', '-a', '-o', 'address']:
Mar 17 17:28:20.095266 waagent[1929]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
Mar 17 17:28:20.095266 waagent[1929]: 2: eth0 inet6 fe80::222:48ff:fe78:b63/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Mar 17 17:28:20.095266 waagent[1929]: 3: enP62901s1 inet6 fe80::222:48ff:fe78:b63/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Mar 17 17:28:24.116566 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:28:24.125403 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:28:25.237789 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:28:25.241013 (kubelet)[2161]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:28:25.288420 kubelet[2161]: E0317 17:28:25.288348 2161 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:28:25.291046 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:28:25.291162 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:28:35.541785 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 17 17:28:35.548422 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:28:35.638390 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:28:35.642008 (kubelet)[2177]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:28:35.697781 kubelet[2177]: E0317 17:28:35.697720 2177 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:28:35.700327 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:28:35.700472 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:28:36.233523 chronyd[1681]: Selected source PHC0
Mar 17 17:28:45.720463 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 17 17:28:45.729407 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:28:45.882134 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:28:45.885776 (kubelet)[2193]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:28:45.923164 kubelet[2193]: E0317 17:28:45.923075 2193 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:28:45.925353 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:28:45.925496 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:28:55.643338 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Mar 17 17:28:55.970429 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 17 17:28:55.975391 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:28:56.235982 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:28:56.239202 (kubelet)[2209]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:28:56.274906 kubelet[2209]: E0317 17:28:56.274820 2209 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:28:56.276776 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:28:56.276896 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:28:58.237929 update_engine[1699]: I20250317 17:28:58.237351 1699 update_attempter.cc:509] Updating boot flags...
Mar 17 17:28:58.287324 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2232)
Mar 17 17:28:58.374265 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2240)
Mar 17 17:28:59.859706 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 17 17:28:59.861215 systemd[1]: Started sshd@0-10.200.20.22:22-10.200.16.10:41694.service - OpenSSH per-connection server daemon (10.200.16.10:41694).
Mar 17 17:29:00.479876 sshd[2332]: Accepted publickey for core from 10.200.16.10 port 41694 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:29:00.481096 sshd-session[2332]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:29:00.485169 systemd-logind[1696]: New session 3 of user core.
Mar 17 17:29:00.495367 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 17 17:29:00.903171 systemd[1]: Started sshd@1-10.200.20.22:22-10.200.16.10:41708.service - OpenSSH per-connection server daemon (10.200.16.10:41708).
Mar 17 17:29:01.355305 sshd[2337]: Accepted publickey for core from 10.200.16.10 port 41708 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:29:01.356572 sshd-session[2337]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:29:01.361633 systemd-logind[1696]: New session 4 of user core.
Mar 17 17:29:01.367411 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 17 17:29:01.678249 sshd[2339]: Connection closed by 10.200.16.10 port 41708
Mar 17 17:29:01.678764 sshd-session[2337]: pam_unix(sshd:session): session closed for user core
Mar 17 17:29:01.681849 systemd[1]: sshd@1-10.200.20.22:22-10.200.16.10:41708.service: Deactivated successfully.
Mar 17 17:29:01.683253 systemd[1]: session-4.scope: Deactivated successfully.
Mar 17 17:29:01.683811 systemd-logind[1696]: Session 4 logged out. Waiting for processes to exit.
Mar 17 17:29:01.684890 systemd-logind[1696]: Removed session 4.
Mar 17 17:29:01.765354 systemd[1]: Started sshd@2-10.200.20.22:22-10.200.16.10:41712.service - OpenSSH per-connection server daemon (10.200.16.10:41712).
Mar 17 17:29:02.259903 sshd[2344]: Accepted publickey for core from 10.200.16.10 port 41712 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:29:02.261137 sshd-session[2344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:29:02.264870 systemd-logind[1696]: New session 5 of user core.
Mar 17 17:29:02.275373 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 17 17:29:02.603181 sshd[2346]: Connection closed by 10.200.16.10 port 41712
Mar 17 17:29:02.603828 sshd-session[2344]: pam_unix(sshd:session): session closed for user core
Mar 17 17:29:02.606210 systemd[1]: sshd@2-10.200.20.22:22-10.200.16.10:41712.service: Deactivated successfully.
Mar 17 17:29:02.608506 systemd-logind[1696]: Session 5 logged out. Waiting for processes to exit.
Mar 17 17:29:02.608816 systemd[1]: session-5.scope: Deactivated successfully.
Mar 17 17:29:02.610805 systemd-logind[1696]: Removed session 5.
Mar 17 17:29:02.693030 systemd[1]: Started sshd@3-10.200.20.22:22-10.200.16.10:41720.service - OpenSSH per-connection server daemon (10.200.16.10:41720).
Mar 17 17:29:03.183017 sshd[2351]: Accepted publickey for core from 10.200.16.10 port 41720 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:29:03.184223 sshd-session[2351]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:29:03.188106 systemd-logind[1696]: New session 6 of user core.
Mar 17 17:29:03.195359 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 17 17:29:03.550683 sshd[2353]: Connection closed by 10.200.16.10 port 41720
Mar 17 17:29:03.549801 sshd-session[2351]: pam_unix(sshd:session): session closed for user core
Mar 17 17:29:03.552418 systemd[1]: sshd@3-10.200.20.22:22-10.200.16.10:41720.service: Deactivated successfully.
Mar 17 17:29:03.553823 systemd[1]: session-6.scope: Deactivated successfully.
Mar 17 17:29:03.555828 systemd-logind[1696]: Session 6 logged out. Waiting for processes to exit.
Mar 17 17:29:03.556872 systemd-logind[1696]: Removed session 6.
Mar 17 17:29:03.633144 systemd[1]: Started sshd@4-10.200.20.22:22-10.200.16.10:41734.service - OpenSSH per-connection server daemon (10.200.16.10:41734).
Mar 17 17:29:04.080298 sshd[2358]: Accepted publickey for core from 10.200.16.10 port 41734 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:29:04.081522 sshd-session[2358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:29:04.085497 systemd-logind[1696]: New session 7 of user core.
Mar 17 17:29:04.091361 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 17 17:29:04.461873 sudo[2361]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 17 17:29:04.462141 sudo[2361]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:29:04.477913 sudo[2361]: pam_unix(sudo:session): session closed for user root
Mar 17 17:29:04.559849 sshd[2360]: Connection closed by 10.200.16.10 port 41734
Mar 17 17:29:04.559705 sshd-session[2358]: pam_unix(sshd:session): session closed for user core
Mar 17 17:29:04.562895 systemd[1]: sshd@4-10.200.20.22:22-10.200.16.10:41734.service: Deactivated successfully.
Mar 17 17:29:04.565566 systemd[1]: session-7.scope: Deactivated successfully.
Mar 17 17:29:04.567066 systemd-logind[1696]: Session 7 logged out. Waiting for processes to exit.
Mar 17 17:29:04.568027 systemd-logind[1696]: Removed session 7.
Mar 17 17:29:04.639833 systemd[1]: Started sshd@5-10.200.20.22:22-10.200.16.10:41742.service - OpenSSH per-connection server daemon (10.200.16.10:41742).
Mar 17 17:29:05.088174 sshd[2366]: Accepted publickey for core from 10.200.16.10 port 41742 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:29:05.089483 sshd-session[2366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:29:05.093052 systemd-logind[1696]: New session 8 of user core.
Mar 17 17:29:05.100363 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 17 17:29:05.340446 sudo[2370]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 17 17:29:05.340704 sudo[2370]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:29:05.343774 sudo[2370]: pam_unix(sudo:session): session closed for user root
Mar 17 17:29:05.348438 sudo[2369]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 17 17:29:05.348685 sudo[2369]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:29:05.364507 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:29:05.385598 augenrules[2392]: No rules
Mar 17 17:29:05.386733 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:29:05.386917 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:29:05.388190 sudo[2369]: pam_unix(sudo:session): session closed for user root
Mar 17 17:29:05.469423 sshd[2368]: Connection closed by 10.200.16.10 port 41742
Mar 17 17:29:05.469923 sshd-session[2366]: pam_unix(sshd:session): session closed for user core
Mar 17 17:29:05.473439 systemd[1]: sshd@5-10.200.20.22:22-10.200.16.10:41742.service: Deactivated successfully.
Mar 17 17:29:05.474958 systemd[1]: session-8.scope: Deactivated successfully.
Mar 17 17:29:05.475636 systemd-logind[1696]: Session 8 logged out. Waiting for processes to exit.
Mar 17 17:29:05.476585 systemd-logind[1696]: Removed session 8.
Mar 17 17:29:05.554600 systemd[1]: Started sshd@6-10.200.20.22:22-10.200.16.10:41756.service - OpenSSH per-connection server daemon (10.200.16.10:41756).
Mar 17 17:29:06.002619 sshd[2400]: Accepted publickey for core from 10.200.16.10 port 41756 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:29:06.003841 sshd-session[2400]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:29:06.008394 systemd-logind[1696]: New session 9 of user core.
Mar 17 17:29:06.017384 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 17 17:29:06.255518 sudo[2403]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 17 17:29:06.255794 sudo[2403]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:29:06.470326 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Mar 17 17:29:06.478412 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:29:06.696933 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:29:06.700405 (kubelet)[2420]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:29:06.737779 kubelet[2420]: E0317 17:29:06.737727 2420 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:29:06.740379 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:29:06.740515 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:29:07.849573 (dockerd)[2436]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 17 17:29:07.850095 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 17 17:29:08.580448 dockerd[2436]: time="2025-03-17T17:29:08.580390231Z" level=info msg="Starting up"
Mar 17 17:29:09.169640 dockerd[2436]: time="2025-03-17T17:29:09.169450364Z" level=info msg="Loading containers: start."
Mar 17 17:29:09.384256 kernel: Initializing XFRM netlink socket
Mar 17 17:29:09.458932 systemd-networkd[1445]: docker0: Link UP
Mar 17 17:29:09.498244 dockerd[2436]: time="2025-03-17T17:29:09.498187743Z" level=info msg="Loading containers: done."
Mar 17 17:29:09.525427 dockerd[2436]: time="2025-03-17T17:29:09.525370881Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 17 17:29:09.525569 dockerd[2436]: time="2025-03-17T17:29:09.525477961Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1
Mar 17 17:29:09.525601 dockerd[2436]: time="2025-03-17T17:29:09.525579761Z" level=info msg="Daemon has completed initialization"
Mar 17 17:29:09.598584 dockerd[2436]: time="2025-03-17T17:29:09.598023664Z" level=info msg="API listen on /run/docker.sock"
Mar 17 17:29:09.598212 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 17 17:29:11.120079 containerd[1721]: time="2025-03-17T17:29:11.120036217Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\""
Mar 17 17:29:12.102171 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3092348418.mount: Deactivated successfully.
Mar 17 17:29:13.734274 containerd[1721]: time="2025-03-17T17:29:13.733259985Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:13.737878 containerd[1721]: time="2025-03-17T17:29:13.737634342Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=29793524"
Mar 17 17:29:13.745967 containerd[1721]: time="2025-03-17T17:29:13.745896655Z" level=info msg="ImageCreate event name:\"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:13.753425 containerd[1721]: time="2025-03-17T17:29:13.752590330Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:13.753425 containerd[1721]: time="2025-03-17T17:29:13.753277130Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"29790324\" in 2.633201313s"
Mar 17 17:29:13.753425 containerd[1721]: time="2025-03-17T17:29:13.753301250Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\""
Mar 17 17:29:13.774154 containerd[1721]: time="2025-03-17T17:29:13.774102193Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\""
Mar 17 17:29:15.368915 containerd[1721]: time="2025-03-17T17:29:15.368873609Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:15.374817 containerd[1721]: time="2025-03-17T17:29:15.374779524Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=26861167"
Mar 17 17:29:15.381279 containerd[1721]: time="2025-03-17T17:29:15.380870799Z" level=info msg="ImageCreate event name:\"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:15.387081 containerd[1721]: time="2025-03-17T17:29:15.387028554Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:15.388270 containerd[1721]: time="2025-03-17T17:29:15.388142753Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"28301963\" in 1.61386572s"
Mar 17 17:29:15.388270 containerd[1721]: time="2025-03-17T17:29:15.388171273Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\""
Mar 17 17:29:15.407632 containerd[1721]: time="2025-03-17T17:29:15.407598178Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\""
Mar 17 17:29:16.587942 containerd[1721]: time="2025-03-17T17:29:16.587891873Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:16.592931 containerd[1721]: time="2025-03-17T17:29:16.592719748Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=16264636"
Mar 17 17:29:16.600085 containerd[1721]: time="2025-03-17T17:29:16.600035821Z" level=info msg="ImageCreate event name:\"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:16.610328 containerd[1721]: time="2025-03-17T17:29:16.610259851Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:16.611792 containerd[1721]: time="2025-03-17T17:29:16.611440290Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"17705450\" in 1.203807032s"
Mar 17 17:29:16.611792 containerd[1721]: time="2025-03-17T17:29:16.611472690Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\""
Mar 17 17:29:16.629987 containerd[1721]: time="2025-03-17T17:29:16.629952753Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\""
Mar 17 17:29:16.970302 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Mar 17 17:29:16.976485 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:29:17.160402 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:29:17.164066 (kubelet)[2708]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:29:17.200506 kubelet[2708]: E0317 17:29:17.200465 2708 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:29:17.202909 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:29:17.203224 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:29:18.289517 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3772587947.mount: Deactivated successfully.
Mar 17 17:29:19.185270 containerd[1721]: time="2025-03-17T17:29:19.184734708Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:19.191998 containerd[1721]: time="2025-03-17T17:29:19.191823421Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=25771848"
Mar 17 17:29:19.197085 containerd[1721]: time="2025-03-17T17:29:19.197036096Z" level=info msg="ImageCreate event name:\"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:19.203268 containerd[1721]: time="2025-03-17T17:29:19.203168010Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:19.203921 containerd[1721]: time="2025-03-17T17:29:19.203742730Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"25770867\" in 2.573752337s"
Mar 17 17:29:19.203921 containerd[1721]: time="2025-03-17T17:29:19.203775770Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\""
Mar 17 17:29:19.221612 containerd[1721]: time="2025-03-17T17:29:19.221573193Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 17 17:29:19.980550 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount540724110.mount: Deactivated successfully.
Mar 17 17:29:21.233274 containerd[1721]: time="2025-03-17T17:29:21.232434269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:21.239552 containerd[1721]: time="2025-03-17T17:29:21.239372462Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485381"
Mar 17 17:29:21.249322 containerd[1721]: time="2025-03-17T17:29:21.249273013Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:21.255781 containerd[1721]: time="2025-03-17T17:29:21.255728926Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:21.256811 containerd[1721]: time="2025-03-17T17:29:21.256784925Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 2.034942092s"
Mar 17 17:29:21.257085 containerd[1721]: time="2025-03-17T17:29:21.256892005Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\""
Mar 17 17:29:21.275910 containerd[1721]: time="2025-03-17T17:29:21.275693867Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Mar 17 17:29:21.924028 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1156357536.mount: Deactivated successfully.
Mar 17 17:29:21.964532 containerd[1721]: time="2025-03-17T17:29:21.964482408Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:21.967782 containerd[1721]: time="2025-03-17T17:29:21.967608725Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268821"
Mar 17 17:29:21.973693 containerd[1721]: time="2025-03-17T17:29:21.973644479Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:21.981809 containerd[1721]: time="2025-03-17T17:29:21.981762792Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:21.982601 containerd[1721]: time="2025-03-17T17:29:21.982479231Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 706.750604ms"
Mar 17 17:29:21.982601 containerd[1721]: time="2025-03-17T17:29:21.982511071Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\""
Mar 17 17:29:22.000753 containerd[1721]: time="2025-03-17T17:29:22.000712694Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Mar 17 17:29:22.717290 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2660238040.mount: Deactivated successfully.
Mar 17 17:29:26.250272 containerd[1721]: time="2025-03-17T17:29:26.249781700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:26.253119 containerd[1721]: time="2025-03-17T17:29:26.253074296Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=66191472"
Mar 17 17:29:26.260340 containerd[1721]: time="2025-03-17T17:29:26.260297530Z" level=info msg="ImageCreate event name:\"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:26.267294 containerd[1721]: time="2025-03-17T17:29:26.267251043Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:29:26.268459 containerd[1721]: time="2025-03-17T17:29:26.268335242Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"66189079\" in 4.267583508s"
Mar 17 17:29:26.268459 containerd[1721]: time="2025-03-17T17:29:26.268367202Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\""
Mar 17 17:29:27.220783 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Mar 17 17:29:27.230497 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:29:27.317925 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:29:27.322664 (kubelet)[2898]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:29:27.362954 kubelet[2898]: E0317 17:29:27.362914 2898 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:29:27.366742 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:29:27.367004 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:29:31.055215 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:29:31.066496 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:29:31.086437 systemd[1]: Reloading requested from client PID 2912 ('systemctl') (unit session-9.scope)...
Mar 17 17:29:31.086454 systemd[1]: Reloading...
Mar 17 17:29:31.184271 zram_generator::config[2952]: No configuration found.
Mar 17 17:29:31.293613 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:29:31.372921 systemd[1]: Reloading finished in 286 ms.
Mar 17 17:29:31.530256 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 17 17:29:31.530506 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 17 17:29:31.530854 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:29:31.540515 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:29:31.637799 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:29:31.641831 (kubelet)[3020]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 17 17:29:31.679684 kubelet[3020]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:29:31.679684 kubelet[3020]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 17 17:29:31.679684 kubelet[3020]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:29:31.680025 kubelet[3020]: I0317 17:29:31.679723 3020 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 17 17:29:32.646153 kubelet[3020]: I0317 17:29:32.646070 3020 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Mar 17 17:29:32.646153 kubelet[3020]: I0317 17:29:32.646101 3020 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 17 17:29:32.646485 kubelet[3020]: I0317 17:29:32.646464 3020 server.go:927] "Client rotation is on, will bootstrap in background"
Mar 17 17:29:32.659634 kubelet[3020]: E0317 17:29:32.659593 3020 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.20.22:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.20.22:6443: connect: connection refused
Mar 17 17:29:32.660030 kubelet[3020]: I0317 17:29:32.659925 3020 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 17 17:29:32.667689 kubelet[3020]: I0317 17:29:32.667657 3020 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 17 17:29:32.668990 kubelet[3020]: I0317 17:29:32.668948 3020 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 17 17:29:32.669180 kubelet[3020]: I0317 17:29:32.668995 3020 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4152.2.2-a-6c46d54d7c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Mar 17 17:29:32.669292 kubelet[3020]: I0317 17:29:32.669192 3020 topology_manager.go:138] "Creating topology manager with none policy"
Mar 17 17:29:32.669292 kubelet[3020]: I0317 17:29:32.669201 3020 container_manager_linux.go:301] "Creating device plugin manager"
Mar 17 17:29:32.669368 kubelet[3020]: I0317 17:29:32.669348 3020 state_mem.go:36] "Initialized new in-memory state store"
Mar 17 17:29:32.670096 kubelet[3020]: I0317 17:29:32.670077 3020 kubelet.go:400] "Attempting to sync node with API server"
Mar 17 17:29:32.670148 kubelet[3020]: I0317 17:29:32.670102 3020 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 17 17:29:32.670328 kubelet[3020]: I0317 17:29:32.670313 3020 kubelet.go:312] "Adding apiserver pod source"
Mar 17 17:29:32.670376 kubelet[3020]: I0317 17:29:32.670339 3020 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 17 17:29:32.672381 kubelet[3020]: W0317 17:29:32.672331 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-6c46d54d7c&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused
Mar 17 17:29:32.672479 kubelet[3020]: E0317 17:29:32.672390 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-6c46d54d7c&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused
Mar 17 17:29:32.673286 kubelet[3020]: I0317 17:29:32.673258 3020 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Mar 17 17:29:32.673453 kubelet[3020]: I0317 17:29:32.673433 3020 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 17 17:29:32.673495 kubelet[3020]: W0317 17:29:32.673481 3020 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 17 17:29:32.674015 kubelet[3020]: I0317 17:29:32.673993 3020 server.go:1264] "Started kubelet" Mar 17 17:29:32.675679 kubelet[3020]: I0317 17:29:32.675645 3020 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:29:32.682259 kubelet[3020]: I0317 17:29:32.681813 3020 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:29:32.684494 kubelet[3020]: I0317 17:29:32.683274 3020 server.go:455] "Adding debug handlers to kubelet server" Mar 17 17:29:32.685708 kubelet[3020]: I0317 17:29:32.685655 3020 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:29:32.686010 kubelet[3020]: I0317 17:29:32.685993 3020 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 17:29:32.691021 kubelet[3020]: I0317 17:29:32.691002 3020 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 17:29:32.691983 kubelet[3020]: I0317 17:29:32.691963 3020 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 17:29:32.692162 kubelet[3020]: I0317 17:29:32.692152 3020 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:29:32.692610 kubelet[3020]: W0317 17:29:32.692573 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.22:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:32.692718 kubelet[3020]: E0317 17:29:32.692706 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.20.22:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:32.692841 kubelet[3020]: E0317 17:29:32.692816 3020 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-6c46d54d7c?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="200ms" Mar 17 17:29:32.693046 kubelet[3020]: E0317 17:29:32.692942 3020 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.22:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.22:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4152.2.2-a-6c46d54d7c.182da74b4a58dce3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4152.2.2-a-6c46d54d7c,UID:ci-4152.2.2-a-6c46d54d7c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4152.2.2-a-6c46d54d7c,},FirstTimestamp:2025-03-17 17:29:32.673973475 +0000 UTC m=+1.029266860,LastTimestamp:2025-03-17 17:29:32.673973475 +0000 UTC m=+1.029266860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4152.2.2-a-6c46d54d7c,}" Mar 17 17:29:32.694992 kubelet[3020]: W0317 17:29:32.694933 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.22:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:32.694992 kubelet[3020]: E0317 17:29:32.694991 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.20.22:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:32.695116 kubelet[3020]: I0317 17:29:32.695101 3020 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:29:32.695387 kubelet[3020]: I0317 17:29:32.695364 3020 factory.go:219] Registration 
of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:29:32.696925 kubelet[3020]: E0317 17:29:32.696890 3020 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:29:32.697055 kubelet[3020]: I0317 17:29:32.697041 3020 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:29:32.715864 kubelet[3020]: I0317 17:29:32.715825 3020 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:29:32.718259 kubelet[3020]: I0317 17:29:32.718171 3020 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 17:29:32.718259 kubelet[3020]: I0317 17:29:32.718210 3020 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:29:32.718664 kubelet[3020]: I0317 17:29:32.718382 3020 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 17:29:32.718664 kubelet[3020]: E0317 17:29:32.718430 3020 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:29:32.719598 kubelet[3020]: W0317 17:29:32.719547 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:32.719685 kubelet[3020]: E0317 17:29:32.719604 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:32.729253 kubelet[3020]: I0317 17:29:32.729183 3020 
cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 17:29:32.729253 kubelet[3020]: I0317 17:29:32.729201 3020 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:29:32.729253 kubelet[3020]: I0317 17:29:32.729242 3020 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:29:32.739140 kubelet[3020]: I0317 17:29:32.739109 3020 policy_none.go:49] "None policy: Start" Mar 17 17:29:32.739845 kubelet[3020]: I0317 17:29:32.739825 3020 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:29:32.739917 kubelet[3020]: I0317 17:29:32.739851 3020 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:29:32.749108 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 17 17:29:32.761971 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 17 17:29:32.772327 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Mar 17 17:29:32.773770 kubelet[3020]: I0317 17:29:32.773598 3020 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:29:32.773879 kubelet[3020]: I0317 17:29:32.773795 3020 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:29:32.773914 kubelet[3020]: I0317 17:29:32.773893 3020 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:29:32.776514 kubelet[3020]: E0317 17:29:32.776483 3020 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:32.793920 kubelet[3020]: I0317 17:29:32.793603 3020 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.794056 kubelet[3020]: E0317 17:29:32.793963 3020 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.819252 kubelet[3020]: I0317 17:29:32.819189 3020 topology_manager.go:215] "Topology Admit Handler" podUID="0de18bcbb30c148aaf2ee9b40f32a370" podNamespace="kube-system" podName="kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.820926 kubelet[3020]: I0317 17:29:32.820888 3020 topology_manager.go:215] "Topology Admit Handler" podUID="22ee4920bd8ad80f1688009569db84f3" podNamespace="kube-system" podName="kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.822829 kubelet[3020]: I0317 17:29:32.822694 3020 topology_manager.go:215] "Topology Admit Handler" podUID="6c5af4d0f53e75a39041573f1ca15106" podNamespace="kube-system" podName="kube-scheduler-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.830039 systemd[1]: Created slice kubepods-burstable-pod0de18bcbb30c148aaf2ee9b40f32a370.slice - libcontainer container 
kubepods-burstable-pod0de18bcbb30c148aaf2ee9b40f32a370.slice. Mar 17 17:29:32.844321 systemd[1]: Created slice kubepods-burstable-pod22ee4920bd8ad80f1688009569db84f3.slice - libcontainer container kubepods-burstable-pod22ee4920bd8ad80f1688009569db84f3.slice. Mar 17 17:29:32.858223 systemd[1]: Created slice kubepods-burstable-pod6c5af4d0f53e75a39041573f1ca15106.slice - libcontainer container kubepods-burstable-pod6c5af4d0f53e75a39041573f1ca15106.slice. Mar 17 17:29:32.893712 kubelet[3020]: I0317 17:29:32.893676 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0de18bcbb30c148aaf2ee9b40f32a370-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4152.2.2-a-6c46d54d7c\" (UID: \"0de18bcbb30c148aaf2ee9b40f32a370\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.894192 kubelet[3020]: E0317 17:29:32.893983 3020 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-6c46d54d7c?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="400ms" Mar 17 17:29:32.894192 kubelet[3020]: I0317 17:29:32.894056 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6c5af4d0f53e75a39041573f1ca15106-kubeconfig\") pod \"kube-scheduler-ci-4152.2.2-a-6c46d54d7c\" (UID: \"6c5af4d0f53e75a39041573f1ca15106\") " pod="kube-system/kube-scheduler-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.894192 kubelet[3020]: I0317 17:29:32.894074 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0de18bcbb30c148aaf2ee9b40f32a370-ca-certs\") pod \"kube-apiserver-ci-4152.2.2-a-6c46d54d7c\" (UID: 
\"0de18bcbb30c148aaf2ee9b40f32a370\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.894192 kubelet[3020]: I0317 17:29:32.894089 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0de18bcbb30c148aaf2ee9b40f32a370-k8s-certs\") pod \"kube-apiserver-ci-4152.2.2-a-6c46d54d7c\" (UID: \"0de18bcbb30c148aaf2ee9b40f32a370\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.894192 kubelet[3020]: I0317 17:29:32.894136 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-ca-certs\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.894548 kubelet[3020]: I0317 17:29:32.894180 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-flexvolume-dir\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.894548 kubelet[3020]: I0317 17:29:32.894430 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-k8s-certs\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.894548 kubelet[3020]: I0317 17:29:32.894480 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-kubeconfig\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.894548 kubelet[3020]: I0317 17:29:32.894502 3020 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.996636 kubelet[3020]: I0317 17:29:32.996270 3020 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:32.996636 kubelet[3020]: E0317 17:29:32.996607 3020 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:33.144124 containerd[1721]: time="2025-03-17T17:29:33.144066358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4152.2.2-a-6c46d54d7c,Uid:0de18bcbb30c148aaf2ee9b40f32a370,Namespace:kube-system,Attempt:0,}" Mar 17 17:29:33.156451 containerd[1721]: time="2025-03-17T17:29:33.156411026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4152.2.2-a-6c46d54d7c,Uid:22ee4920bd8ad80f1688009569db84f3,Namespace:kube-system,Attempt:0,}" Mar 17 17:29:33.161495 containerd[1721]: time="2025-03-17T17:29:33.161455661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4152.2.2-a-6c46d54d7c,Uid:6c5af4d0f53e75a39041573f1ca15106,Namespace:kube-system,Attempt:0,}" Mar 17 17:29:33.295453 kubelet[3020]: E0317 17:29:33.295334 3020 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-6c46d54d7c?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="800ms" Mar 17 17:29:33.398209 kubelet[3020]: I0317 17:29:33.398175 3020 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:33.398577 kubelet[3020]: E0317 17:29:33.398536 3020 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:33.487391 kubelet[3020]: W0317 17:29:33.487311 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-6c46d54d7c&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:33.487391 kubelet[3020]: E0317 17:29:33.487369 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-6c46d54d7c&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:33.576190 kubelet[3020]: W0317 17:29:33.576067 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.22:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:33.576190 kubelet[3020]: E0317 17:29:33.576110 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.20.22:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: 
connection refused Mar 17 17:29:33.791834 kubelet[3020]: W0317 17:29:33.791791 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.22:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:33.791834 kubelet[3020]: E0317 17:29:33.791837 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.20.22:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:33.861866 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3032806777.mount: Deactivated successfully. Mar 17 17:29:33.914202 containerd[1721]: time="2025-03-17T17:29:33.914147138Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:33.934419 containerd[1721]: time="2025-03-17T17:29:33.934353478Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173" Mar 17 17:29:33.951082 containerd[1721]: time="2025-03-17T17:29:33.951039701Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:33.959447 containerd[1721]: time="2025-03-17T17:29:33.958417733Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:33.968627 containerd[1721]: time="2025-03-17T17:29:33.968475403Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:29:33.974247 containerd[1721]: 
time="2025-03-17T17:29:33.973505118Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:33.977447 containerd[1721]: time="2025-03-17T17:29:33.977407474Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:33.978356 containerd[1721]: time="2025-03-17T17:29:33.978325593Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 834.170875ms" Mar 17 17:29:33.981524 containerd[1721]: time="2025-03-17T17:29:33.981474510Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:29:33.986921 containerd[1721]: time="2025-03-17T17:29:33.986750185Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 825.212404ms" Mar 17 17:29:34.011688 containerd[1721]: time="2025-03-17T17:29:34.011642479Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 855.155213ms" Mar 17 17:29:34.096659 
kubelet[3020]: E0317 17:29:34.096611 3020 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-6c46d54d7c?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="1.6s" Mar 17 17:29:34.116325 kubelet[3020]: W0317 17:29:34.116216 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:34.116325 kubelet[3020]: E0317 17:29:34.116279 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:34.200422 kubelet[3020]: I0317 17:29:34.200397 3020 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:34.201025 kubelet[3020]: E0317 17:29:34.201000 3020 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:34.793997 containerd[1721]: time="2025-03-17T17:29:34.793502647Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:29:34.793997 containerd[1721]: time="2025-03-17T17:29:34.793572727Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:29:34.793997 containerd[1721]: time="2025-03-17T17:29:34.793585567Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:34.795096 containerd[1721]: time="2025-03-17T17:29:34.794736206Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:34.800981 containerd[1721]: time="2025-03-17T17:29:34.800861160Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:29:34.800981 containerd[1721]: time="2025-03-17T17:29:34.800922280Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:29:34.800981 containerd[1721]: time="2025-03-17T17:29:34.800942680Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:34.801208 containerd[1721]: time="2025-03-17T17:29:34.801033000Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:34.809037 containerd[1721]: time="2025-03-17T17:29:34.808843792Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:29:34.809037 containerd[1721]: time="2025-03-17T17:29:34.808898232Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:29:34.809037 containerd[1721]: time="2025-03-17T17:29:34.808910712Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:34.809367 containerd[1721]: time="2025-03-17T17:29:34.809045672Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:34.822480 kubelet[3020]: E0317 17:29:34.822446 3020 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.20.22:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:34.855692 systemd[1]: Started cri-containerd-11dc2ed58c9a6498bb77e02c8ad35be151b08132d28f9db5cc7e5f65063b1235.scope - libcontainer container 11dc2ed58c9a6498bb77e02c8ad35be151b08132d28f9db5cc7e5f65063b1235. Mar 17 17:29:34.859131 systemd[1]: Started cri-containerd-3a8aabb57ab8c8111dbc4f17dc4d3e760c6d9e8d1e62320b7bfbbc68ec320365.scope - libcontainer container 3a8aabb57ab8c8111dbc4f17dc4d3e760c6d9e8d1e62320b7bfbbc68ec320365. Mar 17 17:29:34.861397 systemd[1]: Started cri-containerd-9c7265b76f69985791f3f9cc2de021902d4544ec6cc2b31761ecc6d4258eb30d.scope - libcontainer container 9c7265b76f69985791f3f9cc2de021902d4544ec6cc2b31761ecc6d4258eb30d. 
Mar 17 17:29:34.921359 containerd[1721]: time="2025-03-17T17:29:34.921104598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4152.2.2-a-6c46d54d7c,Uid:22ee4920bd8ad80f1688009569db84f3,Namespace:kube-system,Attempt:0,} returns sandbox id \"3a8aabb57ab8c8111dbc4f17dc4d3e760c6d9e8d1e62320b7bfbbc68ec320365\"" Mar 17 17:29:34.927104 containerd[1721]: time="2025-03-17T17:29:34.926952792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4152.2.2-a-6c46d54d7c,Uid:0de18bcbb30c148aaf2ee9b40f32a370,Namespace:kube-system,Attempt:0,} returns sandbox id \"11dc2ed58c9a6498bb77e02c8ad35be151b08132d28f9db5cc7e5f65063b1235\"" Mar 17 17:29:34.930524 containerd[1721]: time="2025-03-17T17:29:34.930362989Z" level=info msg="CreateContainer within sandbox \"3a8aabb57ab8c8111dbc4f17dc4d3e760c6d9e8d1e62320b7bfbbc68ec320365\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 17:29:34.930817 containerd[1721]: time="2025-03-17T17:29:34.930733388Z" level=info msg="CreateContainer within sandbox \"11dc2ed58c9a6498bb77e02c8ad35be151b08132d28f9db5cc7e5f65063b1235\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 17:29:34.935307 containerd[1721]: time="2025-03-17T17:29:34.935113464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4152.2.2-a-6c46d54d7c,Uid:6c5af4d0f53e75a39041573f1ca15106,Namespace:kube-system,Attempt:0,} returns sandbox id \"9c7265b76f69985791f3f9cc2de021902d4544ec6cc2b31761ecc6d4258eb30d\"" Mar 17 17:29:34.937997 containerd[1721]: time="2025-03-17T17:29:34.937935261Z" level=info msg="CreateContainer within sandbox \"9c7265b76f69985791f3f9cc2de021902d4544ec6cc2b31761ecc6d4258eb30d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 17:29:35.004626 kubelet[3020]: E0317 17:29:35.004522 3020 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://10.200.20.22:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.22:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4152.2.2-a-6c46d54d7c.182da74b4a58dce3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4152.2.2-a-6c46d54d7c,UID:ci-4152.2.2-a-6c46d54d7c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4152.2.2-a-6c46d54d7c,},FirstTimestamp:2025-03-17 17:29:32.673973475 +0000 UTC m=+1.029266860,LastTimestamp:2025-03-17 17:29:32.673973475 +0000 UTC m=+1.029266860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4152.2.2-a-6c46d54d7c,}" Mar 17 17:29:35.630437 kubelet[3020]: W0317 17:29:35.630376 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-6c46d54d7c&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:35.630437 kubelet[3020]: E0317 17:29:35.630443 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152.2.2-a-6c46d54d7c&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:35.697371 kubelet[3020]: E0317 17:29:35.697322 3020 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152.2.2-a-6c46d54d7c?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="3.2s" Mar 17 17:29:35.803134 kubelet[3020]: I0317 17:29:35.803095 3020 kubelet_node_status.go:73] "Attempting to register node" 
node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:35.803468 kubelet[3020]: E0317 17:29:35.803426 3020 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:35.822813 containerd[1721]: time="2025-03-17T17:29:35.822755964Z" level=info msg="CreateContainer within sandbox \"3a8aabb57ab8c8111dbc4f17dc4d3e760c6d9e8d1e62320b7bfbbc68ec320365\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8131f87cb11179e0979b7219decba7f08a976a0b23303a67d5737b599daf50bc\"" Mar 17 17:29:35.824053 containerd[1721]: time="2025-03-17T17:29:35.823471364Z" level=info msg="StartContainer for \"8131f87cb11179e0979b7219decba7f08a976a0b23303a67d5737b599daf50bc\"" Mar 17 17:29:35.845398 systemd[1]: Started cri-containerd-8131f87cb11179e0979b7219decba7f08a976a0b23303a67d5737b599daf50bc.scope - libcontainer container 8131f87cb11179e0979b7219decba7f08a976a0b23303a67d5737b599daf50bc. 
Mar 17 17:29:35.853725 containerd[1721]: time="2025-03-17T17:29:35.852876374Z" level=info msg="CreateContainer within sandbox \"11dc2ed58c9a6498bb77e02c8ad35be151b08132d28f9db5cc7e5f65063b1235\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"485396ee58f3c22f8e3042e659321cbd9294e9e5db15188e6111df8fd5b07c25\"" Mar 17 17:29:35.853891 kubelet[3020]: W0317 17:29:35.853589 3020 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:35.853891 kubelet[3020]: E0317 17:29:35.853653 3020 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused Mar 17 17:29:35.854191 containerd[1721]: time="2025-03-17T17:29:35.854129893Z" level=info msg="StartContainer for \"485396ee58f3c22f8e3042e659321cbd9294e9e5db15188e6111df8fd5b07c25\"" Mar 17 17:29:35.865407 containerd[1721]: time="2025-03-17T17:29:35.864905922Z" level=info msg="CreateContainer within sandbox \"9c7265b76f69985791f3f9cc2de021902d4544ec6cc2b31761ecc6d4258eb30d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"770ab1e0b271dc1fb9191e9cbb42acc42402743a2090cd7db092606137754d1f\"" Mar 17 17:29:35.866828 containerd[1721]: time="2025-03-17T17:29:35.866559840Z" level=info msg="StartContainer for \"770ab1e0b271dc1fb9191e9cbb42acc42402743a2090cd7db092606137754d1f\"" Mar 17 17:29:35.896960 systemd[1]: run-containerd-runc-k8s.io-485396ee58f3c22f8e3042e659321cbd9294e9e5db15188e6111df8fd5b07c25-runc.i4YkBL.mount: Deactivated successfully. 
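Several of the kubelet errors above share one symptom: `dial tcp 10.200.20.22:6443: connect: connection refused`, logged while the kube-apiserver container is still being created. A minimal sketch of that failure mode (the address is the apiserver endpoint from the log; `probe` is a hypothetical helper for illustration, not kubelet code):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// probe attempts a plain TCP connection the way the kubelet's HTTP client
// ultimately does; while the apiserver is down this returns an error
// wrapping "connect: connection refused".
func probe(addr string) error {
	conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
	if err != nil {
		return err
	}
	conn.Close()
	return nil
}

func main() {
	// 10.200.20.22:6443 is the apiserver endpoint seen in the log above.
	if err := probe("10.200.20.22:6443"); err != nil {
		fmt.Println("apiserver unreachable:", err)
	} else {
		fmt.Println("apiserver reachable")
	}
}
```

Once the `kube-apiserver` container started below, these retries (lease creation, node registration, reflector lists) succeeded on their own.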
Mar 17 17:29:35.906408 systemd[1]: Started cri-containerd-485396ee58f3c22f8e3042e659321cbd9294e9e5db15188e6111df8fd5b07c25.scope - libcontainer container 485396ee58f3c22f8e3042e659321cbd9294e9e5db15188e6111df8fd5b07c25. Mar 17 17:29:35.927470 containerd[1721]: time="2025-03-17T17:29:35.927425898Z" level=info msg="StartContainer for \"8131f87cb11179e0979b7219decba7f08a976a0b23303a67d5737b599daf50bc\" returns successfully" Mar 17 17:29:35.929504 systemd[1]: Started cri-containerd-770ab1e0b271dc1fb9191e9cbb42acc42402743a2090cd7db092606137754d1f.scope - libcontainer container 770ab1e0b271dc1fb9191e9cbb42acc42402743a2090cd7db092606137754d1f. Mar 17 17:29:35.974259 containerd[1721]: time="2025-03-17T17:29:35.973997011Z" level=info msg="StartContainer for \"485396ee58f3c22f8e3042e659321cbd9294e9e5db15188e6111df8fd5b07c25\" returns successfully" Mar 17 17:29:35.989605 containerd[1721]: time="2025-03-17T17:29:35.989551475Z" level=info msg="StartContainer for \"770ab1e0b271dc1fb9191e9cbb42acc42402743a2090cd7db092606137754d1f\" returns successfully" Mar 17 17:29:38.834273 kubelet[3020]: E0317 17:29:38.834202 3020 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4152.2.2-a-6c46d54d7c" not found Mar 17 17:29:38.900176 kubelet[3020]: E0317 17:29:38.900121 3020 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4152.2.2-a-6c46d54d7c\" not found" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:39.006296 kubelet[3020]: I0317 17:29:39.005989 3020 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:39.016022 kubelet[3020]: I0317 17:29:39.015981 3020 kubelet_node_status.go:76] "Successfully registered node" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:39.024090 kubelet[3020]: E0317 17:29:39.024040 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node 
\"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.125269 kubelet[3020]: E0317 17:29:39.124358 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.225309 kubelet[3020]: E0317 17:29:39.225274 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.325505 kubelet[3020]: E0317 17:29:39.325449 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.426478 kubelet[3020]: E0317 17:29:39.426361 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.527144 kubelet[3020]: E0317 17:29:39.527059 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.627784 kubelet[3020]: E0317 17:29:39.627713 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.728693 kubelet[3020]: E0317 17:29:39.728650 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.828972 kubelet[3020]: E0317 17:29:39.828927 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:39.929249 kubelet[3020]: E0317 17:29:39.929133 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.029697 kubelet[3020]: E0317 17:29:40.029583 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.129998 kubelet[3020]: E0317 17:29:40.129946 3020 
kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.230731 kubelet[3020]: E0317 17:29:40.230685 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.331389 kubelet[3020]: E0317 17:29:40.331286 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.431774 kubelet[3020]: E0317 17:29:40.431732 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.532209 kubelet[3020]: E0317 17:29:40.532166 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.593914 systemd[1]: Reloading requested from client PID 3293 ('systemctl') (unit session-9.scope)... Mar 17 17:29:40.593928 systemd[1]: Reloading... Mar 17 17:29:40.632895 kubelet[3020]: E0317 17:29:40.632811 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.686303 zram_generator::config[3336]: No configuration found. Mar 17 17:29:40.733839 kubelet[3020]: E0317 17:29:40.733789 3020 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4152.2.2-a-6c46d54d7c\" not found" Mar 17 17:29:40.786076 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:29:40.878019 systemd[1]: Reloading finished in 283 ms. 
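The `docker.socket` message during the reload above is systemd flagging a legacy `/var/run` path; systemd rewrites it on the fly, and the warning disappears once the unit itself listens under `/run`. A hypothetical drop-in doing what the warning suggests:

```ini
# /etc/systemd/system/docker.socket.d/10-run-path.conf (hypothetical drop-in)
[Socket]
# Clear the inherited ListenStream, then re-set it under /run,
# per the "/var/run/docker.sock -> /run/docker.sock" hint in the log.
ListenStream=
ListenStream=/run/docker.sock
```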
Mar 17 17:29:40.907359 kubelet[3020]: E0317 17:29:40.907250 3020 event.go:319] "Unable to write event (broadcaster is shut down)" event="&Event{ObjectMeta:{ci-4152.2.2-a-6c46d54d7c.182da74b4a58dce3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4152.2.2-a-6c46d54d7c,UID:ci-4152.2.2-a-6c46d54d7c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4152.2.2-a-6c46d54d7c,},FirstTimestamp:2025-03-17 17:29:32.673973475 +0000 UTC m=+1.029266860,LastTimestamp:2025-03-17 17:29:32.673973475 +0000 UTC m=+1.029266860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4152.2.2-a-6c46d54d7c,}" Mar 17 17:29:40.909365 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:29:40.918337 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 17:29:40.918557 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:29:40.918603 systemd[1]: kubelet.service: Consumed 1.321s CPU time, 110.7M memory peak, 0B memory swap peak. Mar 17 17:29:40.923450 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:29:41.295528 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:29:41.305112 (kubelet)[3397]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:29:41.354971 kubelet[3397]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:29:41.354971 kubelet[3397]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. Mar 17 17:29:41.354971 kubelet[3397]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:29:41.354971 kubelet[3397]: I0317 17:29:41.354418 3397 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 17:29:41.359519 kubelet[3397]: I0317 17:29:41.359486 3397 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 17:29:41.359519 kubelet[3397]: I0317 17:29:41.359510 3397 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 17:29:41.359984 kubelet[3397]: I0317 17:29:41.359686 3397 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 17:29:41.361163 kubelet[3397]: I0317 17:29:41.361135 3397 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 17:29:41.362434 kubelet[3397]: I0317 17:29:41.362405 3397 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:29:41.371679 kubelet[3397]: I0317 17:29:41.371599 3397 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 17:29:41.371803 kubelet[3397]: I0317 17:29:41.371773 3397 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 17:29:41.371954 kubelet[3397]: I0317 17:29:41.371800 3397 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4152.2.2-a-6c46d54d7c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 17:29:41.372028 kubelet[3397]: I0317 17:29:41.371958 3397 topology_manager.go:138] "Creating topology manager with none policy" Mar 
17 17:29:41.372028 kubelet[3397]: I0317 17:29:41.371968 3397 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 17:29:41.372028 kubelet[3397]: I0317 17:29:41.372005 3397 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:29:41.372149 kubelet[3397]: I0317 17:29:41.372132 3397 kubelet.go:400] "Attempting to sync node with API server" Mar 17 17:29:41.372181 kubelet[3397]: I0317 17:29:41.372150 3397 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:29:41.372181 kubelet[3397]: I0317 17:29:41.372172 3397 kubelet.go:312] "Adding apiserver pod source" Mar 17 17:29:41.372181 kubelet[3397]: I0317 17:29:41.372184 3397 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:29:41.374678 kubelet[3397]: I0317 17:29:41.374652 3397 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:29:41.374837 kubelet[3397]: I0317 17:29:41.374817 3397 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 17:29:41.375192 kubelet[3397]: I0317 17:29:41.375169 3397 server.go:1264] "Started kubelet" Mar 17 17:29:41.378489 kubelet[3397]: I0317 17:29:41.378462 3397 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:29:41.379353 kubelet[3397]: I0317 17:29:41.379335 3397 server.go:455] "Adding debug handlers to kubelet server" Mar 17 17:29:41.381043 kubelet[3397]: I0317 17:29:41.381024 3397 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:29:41.392162 kubelet[3397]: I0317 17:29:41.378495 3397 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:29:41.392487 kubelet[3397]: I0317 17:29:41.392469 3397 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 17:29:41.394182 kubelet[3397]: I0317 17:29:41.394163 3397 
volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 17:29:41.396183 kubelet[3397]: I0317 17:29:41.396166 3397 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 17:29:41.396587 kubelet[3397]: I0317 17:29:41.396424 3397 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:29:41.398181 kubelet[3397]: I0317 17:29:41.398153 3397 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:29:41.399526 kubelet[3397]: I0317 17:29:41.399506 3397 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 17:29:41.399632 kubelet[3397]: I0317 17:29:41.399622 3397 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:29:41.399902 kubelet[3397]: I0317 17:29:41.399686 3397 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 17:29:41.399902 kubelet[3397]: E0317 17:29:41.399726 3397 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:29:41.414259 kubelet[3397]: I0317 17:29:41.412421 3397 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:29:41.414259 kubelet[3397]: I0317 17:29:41.412526 3397 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:29:41.415542 kubelet[3397]: I0317 17:29:41.415507 3397 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:29:41.431783 kubelet[3397]: E0317 17:29:41.431730 3397 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:29:41.468756 kubelet[3397]: I0317 17:29:41.468728 3397 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 17:29:41.469179 kubelet[3397]: I0317 17:29:41.468902 3397 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:29:41.469179 kubelet[3397]: I0317 17:29:41.468925 3397 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:29:41.469179 kubelet[3397]: I0317 17:29:41.469076 3397 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 17:29:41.469179 kubelet[3397]: I0317 17:29:41.469086 3397 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 17:29:41.469179 kubelet[3397]: I0317 17:29:41.469103 3397 policy_none.go:49] "None policy: Start" Mar 17 17:29:41.469791 kubelet[3397]: I0317 17:29:41.469766 3397 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:29:41.469831 kubelet[3397]: I0317 17:29:41.469795 3397 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:29:41.470065 kubelet[3397]: I0317 17:29:41.470044 3397 state_mem.go:75] "Updated machine memory state" Mar 17 17:29:41.474164 kubelet[3397]: I0317 17:29:41.473831 3397 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:29:41.474164 kubelet[3397]: I0317 17:29:41.473971 3397 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:29:41.474164 kubelet[3397]: I0317 17:29:41.474052 3397 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:29:41.498267 kubelet[3397]: I0317 17:29:41.497753 3397 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.499905 kubelet[3397]: I0317 17:29:41.499868 3397 topology_manager.go:215] "Topology Admit Handler" podUID="0de18bcbb30c148aaf2ee9b40f32a370" podNamespace="kube-system" 
podName="kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.500003 kubelet[3397]: I0317 17:29:41.499966 3397 topology_manager.go:215] "Topology Admit Handler" podUID="22ee4920bd8ad80f1688009569db84f3" podNamespace="kube-system" podName="kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.500033 kubelet[3397]: I0317 17:29:41.500003 3397 topology_manager.go:215] "Topology Admit Handler" podUID="6c5af4d0f53e75a39041573f1ca15106" podNamespace="kube-system" podName="kube-scheduler-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.513278 kubelet[3397]: W0317 17:29:41.513189 3397 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 17:29:41.518407 kubelet[3397]: W0317 17:29:41.518024 3397 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 17:29:41.518407 kubelet[3397]: W0317 17:29:41.518068 3397 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 17:29:41.518407 kubelet[3397]: I0317 17:29:41.518122 3397 kubelet_node_status.go:112] "Node was previously registered" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.518407 kubelet[3397]: I0317 17:29:41.518193 3397 kubelet_node_status.go:76] "Successfully registered node" node="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.698115 kubelet[3397]: I0317 17:29:41.698012 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-k8s-certs\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.698611 kubelet[3397]: I0317 
17:29:41.698591 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0de18bcbb30c148aaf2ee9b40f32a370-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4152.2.2-a-6c46d54d7c\" (UID: \"0de18bcbb30c148aaf2ee9b40f32a370\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.698708 kubelet[3397]: I0317 17:29:41.698696 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-ca-certs\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.699351 kubelet[3397]: I0317 17:29:41.698781 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-flexvolume-dir\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.699351 kubelet[3397]: I0317 17:29:41.698802 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-kubeconfig\") pod \"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.699351 kubelet[3397]: I0317 17:29:41.698820 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/22ee4920bd8ad80f1688009569db84f3-usr-share-ca-certificates\") pod 
\"kube-controller-manager-ci-4152.2.2-a-6c46d54d7c\" (UID: \"22ee4920bd8ad80f1688009569db84f3\") " pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.699351 kubelet[3397]: I0317 17:29:41.698840 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6c5af4d0f53e75a39041573f1ca15106-kubeconfig\") pod \"kube-scheduler-ci-4152.2.2-a-6c46d54d7c\" (UID: \"6c5af4d0f53e75a39041573f1ca15106\") " pod="kube-system/kube-scheduler-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.699351 kubelet[3397]: I0317 17:29:41.698857 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0de18bcbb30c148aaf2ee9b40f32a370-ca-certs\") pod \"kube-apiserver-ci-4152.2.2-a-6c46d54d7c\" (UID: \"0de18bcbb30c148aaf2ee9b40f32a370\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:41.699869 kubelet[3397]: I0317 17:29:41.698874 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0de18bcbb30c148aaf2ee9b40f32a370-k8s-certs\") pod \"kube-apiserver-ci-4152.2.2-a-6c46d54d7c\" (UID: \"0de18bcbb30c148aaf2ee9b40f32a370\") " pod="kube-system/kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:42.372932 kubelet[3397]: I0317 17:29:42.372894 3397 apiserver.go:52] "Watching apiserver" Mar 17 17:29:42.396780 kubelet[3397]: I0317 17:29:42.396728 3397 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 17:29:42.484863 kubelet[3397]: W0317 17:29:42.484824 3397 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 17:29:42.485036 kubelet[3397]: E0317 17:29:42.484893 3397 kubelet.go:1928] "Failed creating a mirror pod 
for" err="pods \"kube-apiserver-ci-4152.2.2-a-6c46d54d7c\" already exists" pod="kube-system/kube-apiserver-ci-4152.2.2-a-6c46d54d7c" Mar 17 17:29:42.550816 kubelet[3397]: I0317 17:29:42.550739 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4152.2.2-a-6c46d54d7c" podStartSLOduration=1.5507037000000001 podStartE2EDuration="1.5507037s" podCreationTimestamp="2025-03-17 17:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:29:42.55011058 +0000 UTC m=+1.242117591" watchObservedRunningTime="2025-03-17 17:29:42.5507037 +0000 UTC m=+1.242710711" Mar 17 17:29:42.551026 kubelet[3397]: I0317 17:29:42.550902 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4152.2.2-a-6c46d54d7c" podStartSLOduration=1.55089618 podStartE2EDuration="1.55089618s" podCreationTimestamp="2025-03-17 17:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:29:42.499812586 +0000 UTC m=+1.191819557" watchObservedRunningTime="2025-03-17 17:29:42.55089618 +0000 UTC m=+1.242903191" Mar 17 17:29:42.601342 kubelet[3397]: I0317 17:29:42.601188 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4152.2.2-a-6c46d54d7c" podStartSLOduration=1.601170014 podStartE2EDuration="1.601170014s" podCreationTimestamp="2025-03-17 17:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:29:42.600888294 +0000 UTC m=+1.292895305" watchObservedRunningTime="2025-03-17 17:29:42.601170014 +0000 UTC m=+1.293177025" Mar 17 17:29:46.176563 sudo[2403]: pam_unix(sudo:session): session closed for user root Mar 17 17:29:46.247272 sshd[2402]: Connection 
closed by 10.200.16.10 port 41756 Mar 17 17:29:46.247915 sshd-session[2400]: pam_unix(sshd:session): session closed for user core Mar 17 17:29:46.252015 systemd[1]: sshd@6-10.200.20.22:22-10.200.16.10:41756.service: Deactivated successfully. Mar 17 17:29:46.253770 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 17:29:46.254512 systemd[1]: session-9.scope: Consumed 6.180s CPU time, 192.5M memory peak, 0B memory swap peak. Mar 17 17:29:46.255399 systemd-logind[1696]: Session 9 logged out. Waiting for processes to exit. Mar 17 17:29:46.257331 systemd-logind[1696]: Removed session 9. Mar 17 17:29:54.822309 kubelet[3397]: I0317 17:29:54.821907 3397 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 17:29:54.822730 containerd[1721]: time="2025-03-17T17:29:54.822403307Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 17:29:54.822923 kubelet[3397]: I0317 17:29:54.822731 3397 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 17:29:55.488631 kubelet[3397]: I0317 17:29:55.488568 3397 topology_manager.go:215] "Topology Admit Handler" podUID="f364a07d-5952-4760-a148-7015b4cd0fc2" podNamespace="kube-system" podName="kube-proxy-hbhwv" Mar 17 17:29:55.498130 systemd[1]: Created slice kubepods-besteffort-podf364a07d_5952_4760_a148_7015b4cd0fc2.slice - libcontainer container kubepods-besteffort-podf364a07d_5952_4760_a148_7015b4cd0fc2.slice. 
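The deprecation warnings the restarted kubelet printed earlier (`--container-runtime-endpoint`, `--volume-plugin-dir`) ask for those flags to move into the config file passed via `--config`, and the `HardEvictionThresholds` logged by `container_manager_linux.go` map onto `evictionHard` in that same file. A hypothetical sketch (the two paths are assumptions based on the containerd runtime seen in the log; the thresholds are the logged ones):

```yaml
# Hypothetical kubelet config file replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///run/containerd/containerd.sock  # assumed containerd socket path
volumePluginDir: /var/lib/kubelet/volumeplugins                   # assumed flex-volume plugin dir
# Equivalent of the HardEvictionThresholds in the logged nodeConfig:
evictionHard:
  memory.available: "100Mi"
  nodefs.available: "10%"
  nodefs.inodesFree: "5%"
  imagefs.available: "15%"
  imagefs.inodesFree: "5%"
```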
Mar 17 17:29:55.582523 kubelet[3397]: I0317 17:29:55.582430 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f364a07d-5952-4760-a148-7015b4cd0fc2-lib-modules\") pod \"kube-proxy-hbhwv\" (UID: \"f364a07d-5952-4760-a148-7015b4cd0fc2\") " pod="kube-system/kube-proxy-hbhwv"
Mar 17 17:29:55.582727 kubelet[3397]: I0317 17:29:55.582530 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2mtw\" (UniqueName: \"kubernetes.io/projected/f364a07d-5952-4760-a148-7015b4cd0fc2-kube-api-access-f2mtw\") pod \"kube-proxy-hbhwv\" (UID: \"f364a07d-5952-4760-a148-7015b4cd0fc2\") " pod="kube-system/kube-proxy-hbhwv"
Mar 17 17:29:55.582727 kubelet[3397]: I0317 17:29:55.582558 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f364a07d-5952-4760-a148-7015b4cd0fc2-kube-proxy\") pod \"kube-proxy-hbhwv\" (UID: \"f364a07d-5952-4760-a148-7015b4cd0fc2\") " pod="kube-system/kube-proxy-hbhwv"
Mar 17 17:29:55.582727 kubelet[3397]: I0317 17:29:55.582574 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f364a07d-5952-4760-a148-7015b4cd0fc2-xtables-lock\") pod \"kube-proxy-hbhwv\" (UID: \"f364a07d-5952-4760-a148-7015b4cd0fc2\") " pod="kube-system/kube-proxy-hbhwv"
Mar 17 17:29:55.808181 containerd[1721]: time="2025-03-17T17:29:55.808076417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hbhwv,Uid:f364a07d-5952-4760-a148-7015b4cd0fc2,Namespace:kube-system,Attempt:0,}"
Mar 17 17:29:55.875602 containerd[1721]: time="2025-03-17T17:29:55.875454836Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:29:55.875602 containerd[1721]: time="2025-03-17T17:29:55.875507356Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:29:55.875602 containerd[1721]: time="2025-03-17T17:29:55.875518836Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:29:55.876205 containerd[1721]: time="2025-03-17T17:29:55.875626876Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:29:55.896396 systemd[1]: Started cri-containerd-eafb51c1128d61381cfb006d777d70501f5c386a8542dc11ac99852d748c7366.scope - libcontainer container eafb51c1128d61381cfb006d777d70501f5c386a8542dc11ac99852d748c7366.
Mar 17 17:29:55.919266 containerd[1721]: time="2025-03-17T17:29:55.919211877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hbhwv,Uid:f364a07d-5952-4760-a148-7015b4cd0fc2,Namespace:kube-system,Attempt:0,} returns sandbox id \"eafb51c1128d61381cfb006d777d70501f5c386a8542dc11ac99852d748c7366\""
Mar 17 17:29:55.924590 containerd[1721]: time="2025-03-17T17:29:55.924538472Z" level=info msg="CreateContainer within sandbox \"eafb51c1128d61381cfb006d777d70501f5c386a8542dc11ac99852d748c7366\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 17 17:29:55.962036 kubelet[3397]: I0317 17:29:55.960356 3397 topology_manager.go:215] "Topology Admit Handler" podUID="91d6d7a3-fbad-4cd6-b259-89a02161e546" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-qqgxn"
Mar 17 17:29:55.967381 kubelet[3397]: W0317 17:29:55.967213 3397 reflector.go:547] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4152.2.2-a-6c46d54d7c" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4152.2.2-a-6c46d54d7c' and this object
Mar 17 17:29:55.967574 kubelet[3397]: E0317 17:29:55.967557 3397 reflector.go:150] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4152.2.2-a-6c46d54d7c" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4152.2.2-a-6c46d54d7c' and this object
Mar 17 17:29:55.967697 kubelet[3397]: W0317 17:29:55.967345 3397 reflector.go:547] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4152.2.2-a-6c46d54d7c" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4152.2.2-a-6c46d54d7c' and this object
Mar 17 17:29:55.967697 kubelet[3397]: E0317 17:29:55.967626 3397 reflector.go:150] object-"tigera-operator"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4152.2.2-a-6c46d54d7c" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4152.2.2-a-6c46d54d7c' and this object
Mar 17 17:29:55.972991 systemd[1]: Created slice kubepods-besteffort-pod91d6d7a3_fbad_4cd6_b259_89a02161e546.slice - libcontainer container kubepods-besteffort-pod91d6d7a3_fbad_4cd6_b259_89a02161e546.slice.
Mar 17 17:29:55.981384 containerd[1721]: time="2025-03-17T17:29:55.981347341Z" level=info msg="CreateContainer within sandbox \"eafb51c1128d61381cfb006d777d70501f5c386a8542dc11ac99852d748c7366\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"8c114b0d18a98444bf55727ee2237899ba4d76ef9b7c19bfcbfd7a19189dd523\""
Mar 17 17:29:55.982301 containerd[1721]: time="2025-03-17T17:29:55.982261820Z" level=info msg="StartContainer for \"8c114b0d18a98444bf55727ee2237899ba4d76ef9b7c19bfcbfd7a19189dd523\""
Mar 17 17:29:55.985390 kubelet[3397]: I0317 17:29:55.985354 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8lmb\" (UniqueName: \"kubernetes.io/projected/91d6d7a3-fbad-4cd6-b259-89a02161e546-kube-api-access-j8lmb\") pod \"tigera-operator-6479d6dc54-qqgxn\" (UID: \"91d6d7a3-fbad-4cd6-b259-89a02161e546\") " pod="tigera-operator/tigera-operator-6479d6dc54-qqgxn"
Mar 17 17:29:55.985390 kubelet[3397]: I0317 17:29:55.985398 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/91d6d7a3-fbad-4cd6-b259-89a02161e546-var-lib-calico\") pod \"tigera-operator-6479d6dc54-qqgxn\" (UID: \"91d6d7a3-fbad-4cd6-b259-89a02161e546\") " pod="tigera-operator/tigera-operator-6479d6dc54-qqgxn"
Mar 17 17:29:56.011428 systemd[1]: Started cri-containerd-8c114b0d18a98444bf55727ee2237899ba4d76ef9b7c19bfcbfd7a19189dd523.scope - libcontainer container 8c114b0d18a98444bf55727ee2237899ba4d76ef9b7c19bfcbfd7a19189dd523.
Mar 17 17:29:56.044626 containerd[1721]: time="2025-03-17T17:29:56.044569284Z" level=info msg="StartContainer for \"8c114b0d18a98444bf55727ee2237899ba4d76ef9b7c19bfcbfd7a19189dd523\" returns successfully"
Mar 17 17:29:57.179014 containerd[1721]: time="2025-03-17T17:29:57.178896822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-qqgxn,Uid:91d6d7a3-fbad-4cd6-b259-89a02161e546,Namespace:tigera-operator,Attempt:0,}"
Mar 17 17:29:57.247834 containerd[1721]: time="2025-03-17T17:29:57.247695680Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:29:57.247834 containerd[1721]: time="2025-03-17T17:29:57.247756920Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:29:57.247834 containerd[1721]: time="2025-03-17T17:29:57.247772720Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:29:57.247834 containerd[1721]: time="2025-03-17T17:29:57.247844760Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:29:57.268425 systemd[1]: Started cri-containerd-c276322b5deffb9c9d8e3d146b66869ee9e6e2080b4722cc719bd093d7405ac8.scope - libcontainer container c276322b5deffb9c9d8e3d146b66869ee9e6e2080b4722cc719bd093d7405ac8.
Mar 17 17:29:57.296099 containerd[1721]: time="2025-03-17T17:29:57.295983956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-qqgxn,Uid:91d6d7a3-fbad-4cd6-b259-89a02161e546,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c276322b5deffb9c9d8e3d146b66869ee9e6e2080b4722cc719bd093d7405ac8\""
Mar 17 17:29:57.298177 containerd[1721]: time="2025-03-17T17:29:57.298140234Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\""
Mar 17 17:29:59.345031 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount359903253.mount: Deactivated successfully.
Mar 17 17:30:00.293704 containerd[1721]: time="2025-03-17T17:30:00.293648978Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:30:00.298585 containerd[1721]: time="2025-03-17T17:30:00.298444213Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115"
Mar 17 17:30:00.305597 containerd[1721]: time="2025-03-17T17:30:00.305537967Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:30:00.312330 containerd[1721]: time="2025-03-17T17:30:00.312274241Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:30:00.313182 containerd[1721]: time="2025-03-17T17:30:00.313022720Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 3.014843206s"
Mar 17 17:30:00.313182 containerd[1721]: time="2025-03-17T17:30:00.313054000Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\""
Mar 17 17:30:00.316553 containerd[1721]: time="2025-03-17T17:30:00.316340237Z" level=info msg="CreateContainer within sandbox \"c276322b5deffb9c9d8e3d146b66869ee9e6e2080b4722cc719bd093d7405ac8\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 17 17:30:00.399829 containerd[1721]: time="2025-03-17T17:30:00.399764402Z" level=info msg="CreateContainer within sandbox \"c276322b5deffb9c9d8e3d146b66869ee9e6e2080b4722cc719bd093d7405ac8\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5cbda67c84f07a81fafe57b081c02c9b8324cf0c660a2bd033102b33d38696b9\""
Mar 17 17:30:00.400804 containerd[1721]: time="2025-03-17T17:30:00.400429202Z" level=info msg="StartContainer for \"5cbda67c84f07a81fafe57b081c02c9b8324cf0c660a2bd033102b33d38696b9\""
Mar 17 17:30:00.429405 systemd[1]: Started cri-containerd-5cbda67c84f07a81fafe57b081c02c9b8324cf0c660a2bd033102b33d38696b9.scope - libcontainer container 5cbda67c84f07a81fafe57b081c02c9b8324cf0c660a2bd033102b33d38696b9.
Mar 17 17:30:00.457609 containerd[1721]: time="2025-03-17T17:30:00.457484150Z" level=info msg="StartContainer for \"5cbda67c84f07a81fafe57b081c02c9b8324cf0c660a2bd033102b33d38696b9\" returns successfully"
Mar 17 17:30:00.549829 kubelet[3397]: I0317 17:30:00.549471 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hbhwv" podStartSLOduration=5.549452827 podStartE2EDuration="5.549452827s" podCreationTimestamp="2025-03-17 17:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:29:56.490067442 +0000 UTC m=+15.182074453" watchObservedRunningTime="2025-03-17 17:30:00.549452827 +0000 UTC m=+19.241459838"
Mar 17 17:30:05.305628 kubelet[3397]: I0317 17:30:05.305517 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-qqgxn" podStartSLOduration=7.288510541 podStartE2EDuration="10.305495665s" podCreationTimestamp="2025-03-17 17:29:55 +0000 UTC" firstStartedPulling="2025-03-17 17:29:57.297347955 +0000 UTC m=+15.989354966" lastFinishedPulling="2025-03-17 17:30:00.314333079 +0000 UTC m=+19.006340090" observedRunningTime="2025-03-17 17:30:00.550357587 +0000 UTC m=+19.242364598" watchObservedRunningTime="2025-03-17 17:30:05.305495665 +0000 UTC m=+23.997502716"
Mar 17 17:30:05.306723 kubelet[3397]: I0317 17:30:05.306115 3397 topology_manager.go:215] "Topology Admit Handler" podUID="ee7cb6bf-40da-4e3f-96f1-fb1f72db822f" podNamespace="calico-system" podName="calico-typha-f4855d5b4-g4x29"
Mar 17 17:30:05.311665 kubelet[3397]: W0317 17:30:05.311066 3397 reflector.go:547] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-4152.2.2-a-6c46d54d7c" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4152.2.2-a-6c46d54d7c' and this object
Mar 17 17:30:05.311813 kubelet[3397]: E0317 17:30:05.311798 3397 reflector.go:150] object-"calico-system"/"typha-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-4152.2.2-a-6c46d54d7c" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4152.2.2-a-6c46d54d7c' and this object
Mar 17 17:30:05.313844 systemd[1]: Created slice kubepods-besteffort-podee7cb6bf_40da_4e3f_96f1_fb1f72db822f.slice - libcontainer container kubepods-besteffort-podee7cb6bf_40da_4e3f_96f1_fb1f72db822f.slice.
Mar 17 17:30:05.343040 kubelet[3397]: I0317 17:30:05.342813 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee7cb6bf-40da-4e3f-96f1-fb1f72db822f-tigera-ca-bundle\") pod \"calico-typha-f4855d5b4-g4x29\" (UID: \"ee7cb6bf-40da-4e3f-96f1-fb1f72db822f\") " pod="calico-system/calico-typha-f4855d5b4-g4x29"
Mar 17 17:30:05.343040 kubelet[3397]: I0317 17:30:05.342924 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ee7cb6bf-40da-4e3f-96f1-fb1f72db822f-typha-certs\") pod \"calico-typha-f4855d5b4-g4x29\" (UID: \"ee7cb6bf-40da-4e3f-96f1-fb1f72db822f\") " pod="calico-system/calico-typha-f4855d5b4-g4x29"
Mar 17 17:30:05.343040 kubelet[3397]: I0317 17:30:05.342959 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkzwz\" (UniqueName: \"kubernetes.io/projected/ee7cb6bf-40da-4e3f-96f1-fb1f72db822f-kube-api-access-xkzwz\") pod \"calico-typha-f4855d5b4-g4x29\" (UID: \"ee7cb6bf-40da-4e3f-96f1-fb1f72db822f\") " pod="calico-system/calico-typha-f4855d5b4-g4x29"
Mar 17 17:30:05.464952 kubelet[3397]: I0317 17:30:05.464881 3397 topology_manager.go:215] "Topology Admit Handler" podUID="a3d77dae-3481-4354-9058-5b613284c6ca" podNamespace="calico-system" podName="calico-node-v4t8h"
Mar 17 17:30:05.476953 systemd[1]: Created slice kubepods-besteffort-poda3d77dae_3481_4354_9058_5b613284c6ca.slice - libcontainer container kubepods-besteffort-poda3d77dae_3481_4354_9058_5b613284c6ca.slice.
Mar 17 17:30:05.544815 kubelet[3397]: I0317 17:30:05.544457 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-flexvol-driver-host\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.544815 kubelet[3397]: I0317 17:30:05.544513 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-lib-modules\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.544815 kubelet[3397]: I0317 17:30:05.544533 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-policysync\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.544815 kubelet[3397]: I0317 17:30:05.544551 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-var-run-calico\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.544815 kubelet[3397]: I0317 17:30:05.544570 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a3d77dae-3481-4354-9058-5b613284c6ca-node-certs\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.545038 kubelet[3397]: I0317 17:30:05.544587 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-cni-bin-dir\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.545038 kubelet[3397]: I0317 17:30:05.544605 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3d77dae-3481-4354-9058-5b613284c6ca-tigera-ca-bundle\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.545038 kubelet[3397]: I0317 17:30:05.544621 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-xtables-lock\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.545038 kubelet[3397]: I0317 17:30:05.544636 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-var-lib-calico\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.545038 kubelet[3397]: I0317 17:30:05.544654 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sj5c\" (UniqueName: \"kubernetes.io/projected/a3d77dae-3481-4354-9058-5b613284c6ca-kube-api-access-8sj5c\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.545139 kubelet[3397]: I0317 17:30:05.544672 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-cni-net-dir\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.545139 kubelet[3397]: I0317 17:30:05.544687 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a3d77dae-3481-4354-9058-5b613284c6ca-cni-log-dir\") pod \"calico-node-v4t8h\" (UID: \"a3d77dae-3481-4354-9058-5b613284c6ca\") " pod="calico-system/calico-node-v4t8h"
Mar 17 17:30:05.655112 kubelet[3397]: E0317 17:30:05.654527 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.655112 kubelet[3397]: W0317 17:30:05.654549 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.655112 kubelet[3397]: E0317 17:30:05.654569 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.662605 kubelet[3397]: I0317 17:30:05.662570 3397 topology_manager.go:215] "Topology Admit Handler" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" podNamespace="calico-system" podName="csi-node-driver-vjh47"
Mar 17 17:30:05.662874 kubelet[3397]: E0317 17:30:05.662837 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae"
Mar 17 17:30:05.672183 kubelet[3397]: E0317 17:30:05.672091 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.672183 kubelet[3397]: W0317 17:30:05.672124 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.672183 kubelet[3397]: E0317 17:30:05.672143 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.741974 kubelet[3397]: E0317 17:30:05.741934 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.741974 kubelet[3397]: W0317 17:30:05.741959 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.741974 kubelet[3397]: E0317 17:30:05.741980 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.742301 kubelet[3397]: E0317 17:30:05.742275 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.742301 kubelet[3397]: W0317 17:30:05.742287 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.742301 kubelet[3397]: E0317 17:30:05.742297 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.742774 kubelet[3397]: E0317 17:30:05.742755 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.742815 kubelet[3397]: W0317 17:30:05.742777 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.742815 kubelet[3397]: E0317 17:30:05.742789 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.743018 kubelet[3397]: E0317 17:30:05.742997 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.743065 kubelet[3397]: W0317 17:30:05.743011 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.743065 kubelet[3397]: E0317 17:30:05.743038 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.743215 kubelet[3397]: E0317 17:30:05.743201 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.743273 kubelet[3397]: W0317 17:30:05.743216 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.743273 kubelet[3397]: E0317 17:30:05.743225 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.743439 kubelet[3397]: E0317 17:30:05.743424 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.743439 kubelet[3397]: W0317 17:30:05.743437 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.743497 kubelet[3397]: E0317 17:30:05.743446 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.743619 kubelet[3397]: E0317 17:30:05.743605 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.743619 kubelet[3397]: W0317 17:30:05.743617 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.743674 kubelet[3397]: E0317 17:30:05.743627 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.743827 kubelet[3397]: E0317 17:30:05.743798 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.743827 kubelet[3397]: W0317 17:30:05.743826 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.743882 kubelet[3397]: E0317 17:30:05.743838 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.744014 kubelet[3397]: E0317 17:30:05.744001 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.744014 kubelet[3397]: W0317 17:30:05.744013 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.744088 kubelet[3397]: E0317 17:30:05.744021 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.744281 kubelet[3397]: E0317 17:30:05.744263 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.744281 kubelet[3397]: W0317 17:30:05.744277 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.744373 kubelet[3397]: E0317 17:30:05.744300 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.744913 kubelet[3397]: E0317 17:30:05.744456 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.744913 kubelet[3397]: W0317 17:30:05.744468 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.744913 kubelet[3397]: E0317 17:30:05.744477 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.744913 kubelet[3397]: E0317 17:30:05.744649 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.744913 kubelet[3397]: W0317 17:30:05.744657 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.744913 kubelet[3397]: E0317 17:30:05.744666 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.745088 kubelet[3397]: E0317 17:30:05.744858 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.745113 kubelet[3397]: W0317 17:30:05.745092 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.745113 kubelet[3397]: E0317 17:30:05.745104 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.745409 kubelet[3397]: E0317 17:30:05.745389 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.745409 kubelet[3397]: W0317 17:30:05.745403 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.745819 kubelet[3397]: E0317 17:30:05.745421 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.746202 kubelet[3397]: E0317 17:30:05.746178 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.746202 kubelet[3397]: W0317 17:30:05.746195 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.746328 kubelet[3397]: E0317 17:30:05.746208 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.746656 kubelet[3397]: E0317 17:30:05.746636 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.746656 kubelet[3397]: W0317 17:30:05.746651 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.746656 kubelet[3397]: E0317 17:30:05.746663 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.747075 kubelet[3397]: E0317 17:30:05.747024 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.747075 kubelet[3397]: W0317 17:30:05.747042 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.747075 kubelet[3397]: E0317 17:30:05.747053 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.747615 kubelet[3397]: E0317 17:30:05.747576 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.747615 kubelet[3397]: W0317 17:30:05.747595 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.747615 kubelet[3397]: E0317 17:30:05.747607 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.747938 kubelet[3397]: E0317 17:30:05.747913 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.748049 kubelet[3397]: W0317 17:30:05.748028 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.748049 kubelet[3397]: E0317 17:30:05.748048 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.749429 kubelet[3397]: E0317 17:30:05.749405 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.749429 kubelet[3397]: W0317 17:30:05.749423 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.749539 kubelet[3397]: E0317 17:30:05.749436 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.749796 kubelet[3397]: E0317 17:30:05.749773 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.749796 kubelet[3397]: W0317 17:30:05.749790 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.750086 kubelet[3397]: E0317 17:30:05.749801 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.750086 kubelet[3397]: I0317 17:30:05.749830 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j56g\" (UniqueName: \"kubernetes.io/projected/feb1e339-9a1a-480e-9e83-ea79ab0971ae-kube-api-access-7j56g\") pod \"csi-node-driver-vjh47\" (UID: \"feb1e339-9a1a-480e-9e83-ea79ab0971ae\") " pod="calico-system/csi-node-driver-vjh47"
Mar 17 17:30:05.750150 kubelet[3397]: E0317 17:30:05.750107 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.750150 kubelet[3397]: W0317 17:30:05.750118 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.750150 kubelet[3397]: E0317 17:30:05.750129 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.750150 kubelet[3397]: I0317 17:30:05.750144 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/feb1e339-9a1a-480e-9e83-ea79ab0971ae-socket-dir\") pod \"csi-node-driver-vjh47\" (UID: \"feb1e339-9a1a-480e-9e83-ea79ab0971ae\") " pod="calico-system/csi-node-driver-vjh47"
Mar 17 17:30:05.750472 kubelet[3397]: E0317 17:30:05.750337 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.750472 kubelet[3397]: W0317 17:30:05.750355 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.750472 kubelet[3397]: E0317 17:30:05.750365 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:30:05.751143 kubelet[3397]: E0317 17:30:05.750988 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:30:05.751143 kubelet[3397]: W0317 17:30:05.751006 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:30:05.751143 kubelet[3397]: E0317 17:30:05.751032 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 17 17:30:05.751511 kubelet[3397]: E0317 17:30:05.751427 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.751511 kubelet[3397]: W0317 17:30:05.751443 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.751511 kubelet[3397]: E0317 17:30:05.751462 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:05.752669 kubelet[3397]: E0317 17:30:05.751735 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.752669 kubelet[3397]: W0317 17:30:05.751747 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.752669 kubelet[3397]: E0317 17:30:05.751763 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:05.752669 kubelet[3397]: I0317 17:30:05.751803 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/feb1e339-9a1a-480e-9e83-ea79ab0971ae-registration-dir\") pod \"csi-node-driver-vjh47\" (UID: \"feb1e339-9a1a-480e-9e83-ea79ab0971ae\") " pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:05.753013 kubelet[3397]: E0317 17:30:05.752990 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.753013 kubelet[3397]: W0317 17:30:05.753008 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.753117 kubelet[3397]: E0317 17:30:05.753034 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:05.753282 kubelet[3397]: E0317 17:30:05.753262 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.753282 kubelet[3397]: W0317 17:30:05.753276 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.753429 kubelet[3397]: E0317 17:30:05.753330 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:05.753740 kubelet[3397]: E0317 17:30:05.753715 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.753740 kubelet[3397]: W0317 17:30:05.753734 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.753918 kubelet[3397]: E0317 17:30:05.753814 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:05.753918 kubelet[3397]: I0317 17:30:05.753845 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/feb1e339-9a1a-480e-9e83-ea79ab0971ae-varrun\") pod \"csi-node-driver-vjh47\" (UID: \"feb1e339-9a1a-480e-9e83-ea79ab0971ae\") " pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:05.754418 kubelet[3397]: E0317 17:30:05.754388 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.754418 kubelet[3397]: W0317 17:30:05.754412 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.754580 kubelet[3397]: E0317 17:30:05.754443 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:05.755362 kubelet[3397]: E0317 17:30:05.755336 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.755362 kubelet[3397]: W0317 17:30:05.755356 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.755495 kubelet[3397]: E0317 17:30:05.755377 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:05.755571 kubelet[3397]: E0317 17:30:05.755555 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.755571 kubelet[3397]: W0317 17:30:05.755567 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.755628 kubelet[3397]: E0317 17:30:05.755584 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:05.755628 kubelet[3397]: I0317 17:30:05.755603 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/feb1e339-9a1a-480e-9e83-ea79ab0971ae-kubelet-dir\") pod \"csi-node-driver-vjh47\" (UID: \"feb1e339-9a1a-480e-9e83-ea79ab0971ae\") " pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:05.755843 kubelet[3397]: E0317 17:30:05.755820 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.755843 kubelet[3397]: W0317 17:30:05.755837 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.755995 kubelet[3397]: E0317 17:30:05.755904 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:05.757294 kubelet[3397]: E0317 17:30:05.756000 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.757294 kubelet[3397]: W0317 17:30:05.756007 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.757294 kubelet[3397]: E0317 17:30:05.756021 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:05.757294 kubelet[3397]: E0317 17:30:05.756213 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.757294 kubelet[3397]: W0317 17:30:05.756225 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.757294 kubelet[3397]: E0317 17:30:05.756266 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:05.757294 kubelet[3397]: E0317 17:30:05.756683 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:05.757294 kubelet[3397]: W0317 17:30:05.756697 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:05.757294 kubelet[3397]: E0317 17:30:05.756710 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Mar 17 17:30:05.781558 containerd[1721]: time="2025-03-17T17:30:05.781478396Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v4t8h,Uid:a3d77dae-3481-4354-9058-5b613284c6ca,Namespace:calico-system,Attempt:0,}"
Mar 17 17:30:05.874333 containerd[1721]: time="2025-03-17T17:30:05.874025232Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:30:05.874333 containerd[1721]: time="2025-03-17T17:30:05.874087192Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:30:05.874333 containerd[1721]: time="2025-03-17T17:30:05.874102992Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:30:05.874333 containerd[1721]: time="2025-03-17T17:30:05.874182752Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:30:05.891405 systemd[1]: Started cri-containerd-ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578.scope - libcontainer container ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578.
Mar 17 17:30:05.910950 containerd[1721]: time="2025-03-17T17:30:05.910594399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v4t8h,Uid:a3d77dae-3481-4354-9058-5b613284c6ca,Namespace:calico-system,Attempt:0,} returns sandbox id \"ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578\""
Mar 17 17:30:05.915673 containerd[1721]: time="2025-03-17T17:30:05.915381595Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\""
Mar 17 17:30:06.444800 kubelet[3397]: E0317 17:30:06.444695 3397 secret.go:194] Couldn't get secret calico-system/typha-certs: failed to sync secret cache: timed out waiting for the condition
Mar 17 17:30:06.444800 kubelet[3397]: E0317 17:30:06.444786 3397 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee7cb6bf-40da-4e3f-96f1-fb1f72db822f-typha-certs podName:ee7cb6bf-40da-4e3f-96f1-fb1f72db822f nodeName:}" failed. No retries permitted until 2025-03-17 17:30:06.944764238 +0000 UTC m=+25.636771249 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "typha-certs" (UniqueName: "kubernetes.io/secret/ee7cb6bf-40da-4e3f-96f1-fb1f72db822f-typha-certs") pod "calico-typha-f4855d5b4-g4x29" (UID: "ee7cb6bf-40da-4e3f-96f1-fb1f72db822f") : failed to sync secret cache: timed out waiting for the condition Mar 17 17:30:06.469568 kubelet[3397]: E0317 17:30:06.469539 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.469568 kubelet[3397]: W0317 17:30:06.469561 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.469732 kubelet[3397]: E0317 17:30:06.469580 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:06.571592 kubelet[3397]: E0317 17:30:06.571263 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.571592 kubelet[3397]: W0317 17:30:06.571289 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.571592 kubelet[3397]: E0317 17:30:06.571309 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:06.672100 kubelet[3397]: E0317 17:30:06.672011 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.672100 kubelet[3397]: W0317 17:30:06.672033 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.672100 kubelet[3397]: E0317 17:30:06.672052 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:06.772894 kubelet[3397]: E0317 17:30:06.772844 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.772894 kubelet[3397]: W0317 17:30:06.772869 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.772894 kubelet[3397]: E0317 17:30:06.772890 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:06.873687 kubelet[3397]: E0317 17:30:06.873634 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.873687 kubelet[3397]: W0317 17:30:06.873657 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.873687 kubelet[3397]: E0317 17:30:06.873676 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:06.974206 kubelet[3397]: E0317 17:30:06.974154 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.974206 kubelet[3397]: W0317 17:30:06.974179 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.974206 kubelet[3397]: E0317 17:30:06.974197 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:06.974424 kubelet[3397]: E0317 17:30:06.974409 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.974424 kubelet[3397]: W0317 17:30:06.974418 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.974491 kubelet[3397]: E0317 17:30:06.974428 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:06.974640 kubelet[3397]: E0317 17:30:06.974621 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.974640 kubelet[3397]: W0317 17:30:06.974636 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.974715 kubelet[3397]: E0317 17:30:06.974645 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:06.974804 kubelet[3397]: E0317 17:30:06.974789 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.974804 kubelet[3397]: W0317 17:30:06.974802 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.974863 kubelet[3397]: E0317 17:30:06.974810 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:06.974983 kubelet[3397]: E0317 17:30:06.974969 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.974983 kubelet[3397]: W0317 17:30:06.974980 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.975035 kubelet[3397]: E0317 17:30:06.974988 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:06.979932 kubelet[3397]: E0317 17:30:06.979842 3397 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:06.979932 kubelet[3397]: W0317 17:30:06.979861 3397 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:06.979932 kubelet[3397]: E0317 17:30:06.979894 3397 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:07.119551 containerd[1721]: time="2025-03-17T17:30:07.119444150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f4855d5b4-g4x29,Uid:ee7cb6bf-40da-4e3f-96f1-fb1f72db822f,Namespace:calico-system,Attempt:0,}" Mar 17 17:30:07.169362 containerd[1721]: time="2025-03-17T17:30:07.169168185Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:07.169362 containerd[1721]: time="2025-03-17T17:30:07.169257705Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:07.169362 containerd[1721]: time="2025-03-17T17:30:07.169280945Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:07.169362 containerd[1721]: time="2025-03-17T17:30:07.169364785Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:07.189395 systemd[1]: Started cri-containerd-a0d2c46f2a6f3964622423115728ac837e854b30b577fb76d5ff13a670df68de.scope - libcontainer container a0d2c46f2a6f3964622423115728ac837e854b30b577fb76d5ff13a670df68de. Mar 17 17:30:07.216883 containerd[1721]: time="2025-03-17T17:30:07.216836422Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f4855d5b4-g4x29,Uid:ee7cb6bf-40da-4e3f-96f1-fb1f72db822f,Namespace:calico-system,Attempt:0,} returns sandbox id \"a0d2c46f2a6f3964622423115728ac837e854b30b577fb76d5ff13a670df68de\"" Mar 17 17:30:07.400934 kubelet[3397]: E0317 17:30:07.400204 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" Mar 17 17:30:07.656506 containerd[1721]: time="2025-03-17T17:30:07.656362586Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:07.663246 containerd[1721]: time="2025-03-17T17:30:07.663160980Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 17 17:30:07.671282 containerd[1721]: time="2025-03-17T17:30:07.670978773Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:07.678449 containerd[1721]: time="2025-03-17T17:30:07.678392406Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 
17:30:07.679097 containerd[1721]: time="2025-03-17T17:30:07.678967726Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.763548291s" Mar 17 17:30:07.679097 containerd[1721]: time="2025-03-17T17:30:07.679002766Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 17 17:30:07.680669 containerd[1721]: time="2025-03-17T17:30:07.680408844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 17 17:30:07.681661 containerd[1721]: time="2025-03-17T17:30:07.681433843Z" level=info msg="CreateContainer within sandbox \"ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 17:30:07.768727 containerd[1721]: time="2025-03-17T17:30:07.768527765Z" level=info msg="CreateContainer within sandbox \"ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77\"" Mar 17 17:30:07.769966 containerd[1721]: time="2025-03-17T17:30:07.769920644Z" level=info msg="StartContainer for \"396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77\"" Mar 17 17:30:07.802510 systemd[1]: Started cri-containerd-396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77.scope - libcontainer container 396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77. 
Mar 17 17:30:07.851993 containerd[1721]: time="2025-03-17T17:30:07.851938730Z" level=info msg="StartContainer for \"396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77\" returns successfully" Mar 17 17:30:07.859501 systemd[1]: cri-containerd-396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77.scope: Deactivated successfully. Mar 17 17:30:08.393113 containerd[1721]: time="2025-03-17T17:30:08.393020082Z" level=info msg="shim disconnected" id=396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77 namespace=k8s.io Mar 17 17:30:08.393113 containerd[1721]: time="2025-03-17T17:30:08.393092162Z" level=warning msg="cleaning up after shim disconnected" id=396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77 namespace=k8s.io Mar 17 17:30:08.393746 containerd[1721]: time="2025-03-17T17:30:08.393573242Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:30:08.461518 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-396a2b9b4522875b3ceffa72d00d7f5dc14116a2a085b62dc20feb1257b9bc77-rootfs.mount: Deactivated successfully. 
Mar 17 17:30:09.400430 kubelet[3397]: E0317 17:30:09.400079 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" Mar 17 17:30:10.350067 containerd[1721]: time="2025-03-17T17:30:10.349563439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:10.352913 containerd[1721]: time="2025-03-17T17:30:10.352869796Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 17 17:30:10.357722 containerd[1721]: time="2025-03-17T17:30:10.357676391Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:10.363560 containerd[1721]: time="2025-03-17T17:30:10.363487626Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:10.364425 containerd[1721]: time="2025-03-17T17:30:10.364308425Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 2.683859341s" Mar 17 17:30:10.364425 containerd[1721]: time="2025-03-17T17:30:10.364340785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference 
\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 17 17:30:10.366377 containerd[1721]: time="2025-03-17T17:30:10.366177104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 17 17:30:10.379213 containerd[1721]: time="2025-03-17T17:30:10.379174132Z" level=info msg="CreateContainer within sandbox \"a0d2c46f2a6f3964622423115728ac837e854b30b577fb76d5ff13a670df68de\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 17:30:10.419649 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1631907269.mount: Deactivated successfully. Mar 17 17:30:10.433445 containerd[1721]: time="2025-03-17T17:30:10.433394123Z" level=info msg="CreateContainer within sandbox \"a0d2c46f2a6f3964622423115728ac837e854b30b577fb76d5ff13a670df68de\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0ddfeeb67220ce0e33c03255073d308c0dc23fff3e20d4b88de83f8dd49ad5e3\"" Mar 17 17:30:10.434260 containerd[1721]: time="2025-03-17T17:30:10.434118682Z" level=info msg="StartContainer for \"0ddfeeb67220ce0e33c03255073d308c0dc23fff3e20d4b88de83f8dd49ad5e3\"" Mar 17 17:30:10.459437 systemd[1]: Started cri-containerd-0ddfeeb67220ce0e33c03255073d308c0dc23fff3e20d4b88de83f8dd49ad5e3.scope - libcontainer container 0ddfeeb67220ce0e33c03255073d308c0dc23fff3e20d4b88de83f8dd49ad5e3. 
Mar 17 17:30:10.493260 containerd[1721]: time="2025-03-17T17:30:10.493005069Z" level=info msg="StartContainer for \"0ddfeeb67220ce0e33c03255073d308c0dc23fff3e20d4b88de83f8dd49ad5e3\" returns successfully" Mar 17 17:30:10.537680 kubelet[3397]: I0317 17:30:10.537166 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-f4855d5b4-g4x29" podStartSLOduration=2.390183866 podStartE2EDuration="5.53714815s" podCreationTimestamp="2025-03-17 17:30:05 +0000 UTC" firstStartedPulling="2025-03-17 17:30:07.218207701 +0000 UTC m=+25.910214712" lastFinishedPulling="2025-03-17 17:30:10.365171985 +0000 UTC m=+29.057178996" observedRunningTime="2025-03-17 17:30:10.53682963 +0000 UTC m=+29.228836641" watchObservedRunningTime="2025-03-17 17:30:10.53714815 +0000 UTC m=+29.229155161" Mar 17 17:30:11.401442 kubelet[3397]: E0317 17:30:11.401082 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" Mar 17 17:30:11.521196 kubelet[3397]: I0317 17:30:11.521023 3397 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:13.400274 kubelet[3397]: E0317 17:30:13.400137 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" Mar 17 17:30:13.613696 containerd[1721]: time="2025-03-17T17:30:13.613426218Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:13.620051 containerd[1721]: 
time="2025-03-17T17:30:13.619976252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 17 17:30:13.625690 containerd[1721]: time="2025-03-17T17:30:13.625635287Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:13.654220 containerd[1721]: time="2025-03-17T17:30:13.654073861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:13.655171 containerd[1721]: time="2025-03-17T17:30:13.655038700Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 3.288828276s" Mar 17 17:30:13.655171 containerd[1721]: time="2025-03-17T17:30:13.655075700Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 17 17:30:13.657898 containerd[1721]: time="2025-03-17T17:30:13.657714258Z" level=info msg="CreateContainer within sandbox \"ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 17:30:13.717752 containerd[1721]: time="2025-03-17T17:30:13.717664724Z" level=info msg="CreateContainer within sandbox \"ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa\"" Mar 17 17:30:13.718890 
containerd[1721]: time="2025-03-17T17:30:13.718456363Z" level=info msg="StartContainer for \"c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa\"" Mar 17 17:30:13.752443 systemd[1]: Started cri-containerd-c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa.scope - libcontainer container c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa. Mar 17 17:30:13.789267 containerd[1721]: time="2025-03-17T17:30:13.789132539Z" level=info msg="StartContainer for \"c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa\" returns successfully" Mar 17 17:30:14.792141 systemd[1]: cri-containerd-c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa.scope: Deactivated successfully. Mar 17 17:30:14.810549 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa-rootfs.mount: Deactivated successfully. Mar 17 17:30:14.861041 kubelet[3397]: I0317 17:30:14.860999 3397 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 17 17:30:14.909188 systemd[1]: Created slice kubepods-burstable-podbce6fc98_b3ee_43d8_88b1_252f92d0da22.slice - libcontainer container kubepods-burstable-podbce6fc98_b3ee_43d8_88b1_252f92d0da22.slice. 
Mar 17 17:30:15.160648 waagent[1929]: 2025-03-17T17:30:15.000353Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 2] Mar 17 17:30:15.160948 kubelet[3397]: I0317 17:30:14.899184 3397 topology_manager.go:215] "Topology Admit Handler" podUID="bce6fc98-b3ee-43d8-88b1-252f92d0da22" podNamespace="kube-system" podName="coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:15.160948 kubelet[3397]: I0317 17:30:14.910533 3397 topology_manager.go:215] "Topology Admit Handler" podUID="f1d22644-c9fd-4c49-8bcb-aa7ce67eb937" podNamespace="kube-system" podName="coredns-7db6d8ff4d-26p8t" Mar 17 17:30:15.160948 kubelet[3397]: I0317 17:30:14.910717 3397 topology_manager.go:215] "Topology Admit Handler" podUID="4354ea3a-d606-47b2-8073-479d0b804cd3" podNamespace="calico-system" podName="calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:15.160948 kubelet[3397]: I0317 17:30:14.919685 3397 topology_manager.go:215] "Topology Admit Handler" podUID="f70e11b2-e153-4a1f-b987-6a5f11a3781f" podNamespace="calico-apiserver" podName="calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:15.160948 kubelet[3397]: I0317 17:30:14.920364 3397 topology_manager.go:215] "Topology Admit Handler" podUID="e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a" podNamespace="calico-apiserver" podName="calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:15.160948 kubelet[3397]: I0317 17:30:14.931745 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvcd6\" (UniqueName: \"kubernetes.io/projected/bce6fc98-b3ee-43d8-88b1-252f92d0da22-kube-api-access-bvcd6\") pod \"coredns-7db6d8ff4d-7f6jb\" (UID: \"bce6fc98-b3ee-43d8-88b1-252f92d0da22\") " pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:15.160948 kubelet[3397]: I0317 17:30:14.931771 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bce6fc98-b3ee-43d8-88b1-252f92d0da22-config-volume\") 
pod \"coredns-7db6d8ff4d-7f6jb\" (UID: \"bce6fc98-b3ee-43d8-88b1-252f92d0da22\") " pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:14.930817 systemd[1]: Created slice kubepods-besteffort-pod4354ea3a_d606_47b2_8073_479d0b804cd3.slice - libcontainer container kubepods-besteffort-pod4354ea3a_d606_47b2_8073_479d0b804cd3.slice. Mar 17 17:30:15.161162 kubelet[3397]: I0317 17:30:14.931790 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1d22644-c9fd-4c49-8bcb-aa7ce67eb937-config-volume\") pod \"coredns-7db6d8ff4d-26p8t\" (UID: \"f1d22644-c9fd-4c49-8bcb-aa7ce67eb937\") " pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:15.161162 kubelet[3397]: I0317 17:30:14.931805 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vv9q\" (UniqueName: \"kubernetes.io/projected/f1d22644-c9fd-4c49-8bcb-aa7ce67eb937-kube-api-access-5vv9q\") pod \"coredns-7db6d8ff4d-26p8t\" (UID: \"f1d22644-c9fd-4c49-8bcb-aa7ce67eb937\") " pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:15.161162 kubelet[3397]: I0317 17:30:14.931826 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4354ea3a-d606-47b2-8073-479d0b804cd3-tigera-ca-bundle\") pod \"calico-kube-controllers-7b9c99b997-26lfs\" (UID: \"4354ea3a-d606-47b2-8073-479d0b804cd3\") " pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:15.161162 kubelet[3397]: I0317 17:30:14.931843 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkbts\" (UniqueName: \"kubernetes.io/projected/4354ea3a-d606-47b2-8073-479d0b804cd3-kube-api-access-vkbts\") pod \"calico-kube-controllers-7b9c99b997-26lfs\" (UID: \"4354ea3a-d606-47b2-8073-479d0b804cd3\") " 
pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:15.161162 kubelet[3397]: I0317 17:30:15.032718 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s754x\" (UniqueName: \"kubernetes.io/projected/e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a-kube-api-access-s754x\") pod \"calico-apiserver-678db5fc46-n7khj\" (UID: \"e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a\") " pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:14.943970 systemd[1]: Created slice kubepods-burstable-podf1d22644_c9fd_4c49_8bcb_aa7ce67eb937.slice - libcontainer container kubepods-burstable-podf1d22644_c9fd_4c49_8bcb_aa7ce67eb937.slice. Mar 17 17:30:15.161395 kubelet[3397]: I0317 17:30:15.032802 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a-calico-apiserver-certs\") pod \"calico-apiserver-678db5fc46-n7khj\" (UID: \"e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a\") " pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:15.161395 kubelet[3397]: I0317 17:30:15.032844 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f70e11b2-e153-4a1f-b987-6a5f11a3781f-calico-apiserver-certs\") pod \"calico-apiserver-678db5fc46-9s6qs\" (UID: \"f70e11b2-e153-4a1f-b987-6a5f11a3781f\") " pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:15.161395 kubelet[3397]: I0317 17:30:15.032862 3397 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2rrv\" (UniqueName: \"kubernetes.io/projected/f70e11b2-e153-4a1f-b987-6a5f11a3781f-kube-api-access-k2rrv\") pod \"calico-apiserver-678db5fc46-9s6qs\" (UID: \"f70e11b2-e153-4a1f-b987-6a5f11a3781f\") " 
pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:14.949311 systemd[1]: Created slice kubepods-besteffort-pode6a80aad_f4cd_48fc_a1af_2b0ba4e8f34a.slice - libcontainer container kubepods-besteffort-pode6a80aad_f4cd_48fc_a1af_2b0ba4e8f34a.slice. Mar 17 17:30:14.958102 systemd[1]: Created slice kubepods-besteffort-podf70e11b2_e153_4a1f_b987_6a5f11a3781f.slice - libcontainer container kubepods-besteffort-podf70e11b2_e153_4a1f_b987_6a5f11a3781f.slice. Mar 17 17:30:15.406395 systemd[1]: Created slice kubepods-besteffort-podfeb1e339_9a1a_480e_9e83_ea79ab0971ae.slice - libcontainer container kubepods-besteffort-podfeb1e339_9a1a_480e_9e83_ea79ab0971ae.slice. Mar 17 17:30:15.408782 containerd[1721]: time="2025-03-17T17:30:15.408695480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:0,}" Mar 17 17:30:15.469723 containerd[1721]: time="2025-03-17T17:30:15.469608706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:0,}" Mar 17 17:30:15.472258 containerd[1721]: time="2025-03-17T17:30:15.472163143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:0,}" Mar 17 17:30:15.475820 containerd[1721]: time="2025-03-17T17:30:15.475758180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:30:15.481692 containerd[1721]: time="2025-03-17T17:30:15.481515815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:30:15.481692 containerd[1721]: time="2025-03-17T17:30:15.481562775Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:0,}" Mar 17 17:30:15.860888 waagent[1929]: 2025-03-17T17:30:15.860744Z INFO ExtHandler Mar 17 17:30:15.861019 waagent[1929]: 2025-03-17T17:30:15.860985Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 2] Mar 17 17:30:15.931379 waagent[1929]: 2025-03-17T17:30:15.931329Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 17 17:30:16.019300 waagent[1929]: 2025-03-17T17:30:16.019002Z INFO ExtHandler Downloaded certificate {'thumbprint': '99F181BF8F98C95F2096B79358B60C31829B0D8D', 'hasPrivateKey': True} Mar 17 17:30:16.019568 waagent[1929]: 2025-03-17T17:30:16.019505Z INFO ExtHandler Downloaded certificate {'thumbprint': '77684ECED44B9790E8F413F1A232B0AF51736A0F', 'hasPrivateKey': False} Mar 17 17:30:16.020056 waagent[1929]: 2025-03-17T17:30:16.020007Z INFO ExtHandler Fetch goal state completed Mar 17 17:30:16.020477 waagent[1929]: 2025-03-17T17:30:16.020433Z INFO ExtHandler ExtHandler Mar 17 17:30:16.020551 waagent[1929]: 2025-03-17T17:30:16.020518Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_2 channel: WireServer source: Fabric activity: 832ab936-936c-4735-8441-ec675aacb3ab correlation 89cb9120-e583-4329-9a21-dffaf33d2c8e created: 2025-03-17T17:30:06.230858Z] Mar 17 17:30:16.020908 waagent[1929]: 2025-03-17T17:30:16.020862Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 17 17:30:16.021522 waagent[1929]: 2025-03-17T17:30:16.021481Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_2 1 ms] Mar 17 17:30:16.471945 containerd[1721]: time="2025-03-17T17:30:16.471878163Z" level=info msg="shim disconnected" id=c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa namespace=k8s.io Mar 17 17:30:16.471945 containerd[1721]: time="2025-03-17T17:30:16.471937763Z" level=warning msg="cleaning up after shim disconnected" id=c62196cd8e4f262b3e8fcfbdc9e2d9c78e7d20bf3dcb9e8945584052839041aa namespace=k8s.io Mar 17 17:30:16.471945 containerd[1721]: time="2025-03-17T17:30:16.471945403Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:30:16.482288 containerd[1721]: time="2025-03-17T17:30:16.482100274Z" level=warning msg="cleanup warnings time=\"2025-03-17T17:30:16Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Mar 17 17:30:16.535086 containerd[1721]: time="2025-03-17T17:30:16.534916906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 17 17:30:16.865847 containerd[1721]: time="2025-03-17T17:30:16.864701089Z" level=error msg="Failed to destroy network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.867538 containerd[1721]: time="2025-03-17T17:30:16.867488126Z" level=error msg="encountered an error cleaning up failed sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
Mar 17 17:30:16.867635 containerd[1721]: time="2025-03-17T17:30:16.867577006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.868044 kubelet[3397]: E0317 17:30:16.867995 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.868372 kubelet[3397]: E0317 17:30:16.868067 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:16.868372 kubelet[3397]: E0317 17:30:16.868085 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 
17:30:16.868372 kubelet[3397]: E0317 17:30:16.868127 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-7f6jb" podUID="bce6fc98-b3ee-43d8-88b1-252f92d0da22" Mar 17 17:30:16.869149 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47-shm.mount: Deactivated successfully. Mar 17 17:30:16.923571 containerd[1721]: time="2025-03-17T17:30:16.923515436Z" level=error msg="Failed to destroy network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.925026 containerd[1721]: time="2025-03-17T17:30:16.924996075Z" level=error msg="Failed to destroy network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.925823 containerd[1721]: time="2025-03-17T17:30:16.925314514Z" level=error msg="encountered an error cleaning up failed sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.925823 containerd[1721]: time="2025-03-17T17:30:16.925707674Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.926157 containerd[1721]: time="2025-03-17T17:30:16.926131194Z" level=error msg="encountered an error cleaning up failed sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.926982 containerd[1721]: time="2025-03-17T17:30:16.926955313Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.927160 kubelet[3397]: E0317 17:30:16.927049 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.927160 kubelet[3397]: E0317 17:30:16.927107 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:16.927160 kubelet[3397]: E0317 17:30:16.927126 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:16.927302 kubelet[3397]: E0317 17:30:16.927175 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" podUID="f70e11b2-e153-4a1f-b987-6a5f11a3781f" Mar 17 17:30:16.928706 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13-shm.mount: Deactivated successfully. Mar 17 17:30:16.930681 kubelet[3397]: E0317 17:30:16.929291 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.930681 kubelet[3397]: E0317 17:30:16.929352 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:16.930681 kubelet[3397]: E0317 17:30:16.929371 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:16.930814 kubelet[3397]: E0317 17:30:16.929565 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-26p8t_kube-system(f1d22644-c9fd-4c49-8bcb-aa7ce67eb937)\" with CreatePodSandboxError: \"Failed 
to create sandbox for pod \\\"coredns-7db6d8ff4d-26p8t_kube-system(f1d22644-c9fd-4c49-8bcb-aa7ce67eb937)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-26p8t" podUID="f1d22644-c9fd-4c49-8bcb-aa7ce67eb937" Mar 17 17:30:16.935574 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8-shm.mount: Deactivated successfully. Mar 17 17:30:16.944598 containerd[1721]: time="2025-03-17T17:30:16.944455577Z" level=error msg="Failed to destroy network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.945183 containerd[1721]: time="2025-03-17T17:30:16.945005817Z" level=error msg="encountered an error cleaning up failed sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.945183 containerd[1721]: time="2025-03-17T17:30:16.945134736Z" level=error msg="Failed to destroy network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.945563 containerd[1721]: 
time="2025-03-17T17:30:16.945386056Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.945915 containerd[1721]: time="2025-03-17T17:30:16.945877216Z" level=error msg="encountered an error cleaning up failed sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.946047 containerd[1721]: time="2025-03-17T17:30:16.946009096Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.946601 kubelet[3397]: E0317 17:30:16.946570 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.946905 
kubelet[3397]: E0317 17:30:16.946714 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:16.946905 kubelet[3397]: E0317 17:30:16.946740 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:16.946905 kubelet[3397]: E0317 17:30:16.946781 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b9c99b997-26lfs_calico-system(4354ea3a-d606-47b2-8073-479d0b804cd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b9c99b997-26lfs_calico-system(4354ea3a-d606-47b2-8073-479d0b804cd3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" podUID="4354ea3a-d606-47b2-8073-479d0b804cd3" Mar 17 17:30:16.947056 kubelet[3397]: E0317 17:30:16.946822 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.947056 kubelet[3397]: E0317 17:30:16.946838 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:16.947056 kubelet[3397]: E0317 17:30:16.946851 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:16.947127 kubelet[3397]: E0317 17:30:16.946872 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-n7khj_calico-apiserver(e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-n7khj_calico-apiserver(e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" podUID="e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a" Mar 17 17:30:16.947900 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da-shm.mount: Deactivated successfully. Mar 17 17:30:16.948000 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0-shm.mount: Deactivated successfully. Mar 17 17:30:16.971437 containerd[1721]: time="2025-03-17T17:30:16.971383273Z" level=error msg="Failed to destroy network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.971736 containerd[1721]: time="2025-03-17T17:30:16.971709712Z" level=error msg="encountered an error cleaning up failed sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.971795 containerd[1721]: time="2025-03-17T17:30:16.971772912Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.972347 kubelet[3397]: E0317 17:30:16.971986 3397 remote_runtime.go:193] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:16.972347 kubelet[3397]: E0317 17:30:16.972038 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:16.972347 kubelet[3397]: E0317 17:30:16.972057 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:16.972517 kubelet[3397]: E0317 17:30:16.972107 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vjh47_calico-system(feb1e339-9a1a-480e-9e83-ea79ab0971ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vjh47_calico-system(feb1e339-9a1a-480e-9e83-ea79ab0971ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" Mar 17 17:30:17.537285 kubelet[3397]: I0317 17:30:17.537206 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8" Mar 17 17:30:17.538121 containerd[1721]: time="2025-03-17T17:30:17.538084082Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\"" Mar 17 17:30:17.538418 containerd[1721]: time="2025-03-17T17:30:17.538293282Z" level=info msg="Ensure that sandbox d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8 in task-service has been cleanup successfully" Mar 17 17:30:17.538798 containerd[1721]: time="2025-03-17T17:30:17.538745922Z" level=info msg="TearDown network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" successfully" Mar 17 17:30:17.538798 containerd[1721]: time="2025-03-17T17:30:17.538775802Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" returns successfully" Mar 17 17:30:17.540194 containerd[1721]: time="2025-03-17T17:30:17.540168840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:1,}" Mar 17 17:30:17.540990 kubelet[3397]: I0317 17:30:17.540728 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927" Mar 17 17:30:17.542120 containerd[1721]: time="2025-03-17T17:30:17.541858199Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\"" Mar 17 17:30:17.542120 containerd[1721]: time="2025-03-17T17:30:17.542017039Z" level=info msg="Ensure that sandbox 
b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927 in task-service has been cleanup successfully" Mar 17 17:30:17.542532 kubelet[3397]: I0317 17:30:17.542469 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da" Mar 17 17:30:17.542732 containerd[1721]: time="2025-03-17T17:30:17.542667678Z" level=info msg="TearDown network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" successfully" Mar 17 17:30:17.543074 containerd[1721]: time="2025-03-17T17:30:17.543057118Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" returns successfully" Mar 17 17:30:17.543250 containerd[1721]: time="2025-03-17T17:30:17.542930078Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\"" Mar 17 17:30:17.543619 containerd[1721]: time="2025-03-17T17:30:17.543497757Z" level=info msg="Ensure that sandbox 3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da in task-service has been cleanup successfully" Mar 17 17:30:17.543794 containerd[1721]: time="2025-03-17T17:30:17.543738557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:1,}" Mar 17 17:30:17.543935 containerd[1721]: time="2025-03-17T17:30:17.543854517Z" level=info msg="TearDown network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" successfully" Mar 17 17:30:17.543935 containerd[1721]: time="2025-03-17T17:30:17.543874677Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" returns successfully" Mar 17 17:30:17.544918 kubelet[3397]: I0317 17:30:17.544529 3397 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13" Mar 17 17:30:17.545003 containerd[1721]: time="2025-03-17T17:30:17.544684676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:1,}" Mar 17 17:30:17.545003 containerd[1721]: time="2025-03-17T17:30:17.544915796Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" Mar 17 17:30:17.545073 containerd[1721]: time="2025-03-17T17:30:17.545036276Z" level=info msg="Ensure that sandbox 25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13 in task-service has been cleanup successfully" Mar 17 17:30:17.545659 containerd[1721]: time="2025-03-17T17:30:17.545627036Z" level=info msg="TearDown network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" successfully" Mar 17 17:30:17.545659 containerd[1721]: time="2025-03-17T17:30:17.545651316Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" returns successfully" Mar 17 17:30:17.546972 containerd[1721]: time="2025-03-17T17:30:17.546616355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:30:17.548290 kubelet[3397]: I0317 17:30:17.547999 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47" Mar 17 17:30:17.549049 containerd[1721]: time="2025-03-17T17:30:17.548821713Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" Mar 17 17:30:17.549049 containerd[1721]: time="2025-03-17T17:30:17.548992632Z" level=info msg="Ensure that sandbox aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47 in 
task-service has been cleanup successfully" Mar 17 17:30:17.549790 kubelet[3397]: I0317 17:30:17.549445 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0" Mar 17 17:30:17.549862 containerd[1721]: time="2025-03-17T17:30:17.549481592Z" level=info msg="TearDown network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" successfully" Mar 17 17:30:17.549862 containerd[1721]: time="2025-03-17T17:30:17.549496232Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" returns successfully" Mar 17 17:30:17.551183 containerd[1721]: time="2025-03-17T17:30:17.551136551Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\"" Mar 17 17:30:17.551956 containerd[1721]: time="2025-03-17T17:30:17.551386510Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:1,}" Mar 17 17:30:17.551956 containerd[1721]: time="2025-03-17T17:30:17.551409550Z" level=info msg="Ensure that sandbox 32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0 in task-service has been cleanup successfully" Mar 17 17:30:17.551956 containerd[1721]: time="2025-03-17T17:30:17.551591990Z" level=info msg="TearDown network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" successfully" Mar 17 17:30:17.551956 containerd[1721]: time="2025-03-17T17:30:17.551628110Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" returns successfully" Mar 17 17:30:17.552842 containerd[1721]: time="2025-03-17T17:30:17.552549469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:1,}" Mar 17 
17:30:17.745920 kubelet[3397]: I0317 17:30:17.745888 3397 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:17.816830 systemd[1]: run-netns-cni\x2dbba94ebb\x2d29ee\x2d329e\x2d944b\x2d78398883482a.mount: Deactivated successfully. Mar 17 17:30:17.817039 systemd[1]: run-netns-cni\x2db3792284\x2d89b1\x2d62cb\x2d8f10\x2d0d828d85b168.mount: Deactivated successfully. Mar 17 17:30:17.817089 systemd[1]: run-netns-cni\x2d2a5ccc36\x2d7f26\x2d6b06\x2d0ad5\x2de57e80f7fd34.mount: Deactivated successfully. Mar 17 17:30:17.817138 systemd[1]: run-netns-cni\x2d4fa913ce\x2da293\x2dd660\x2d3001\x2dc2d5eef4f30b.mount: Deactivated successfully. Mar 17 17:30:17.817190 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927-shm.mount: Deactivated successfully. Mar 17 17:30:17.817262 systemd[1]: run-netns-cni\x2deac8f999\x2daeb6\x2d3bd8\x2de481\x2d056416f63da4.mount: Deactivated successfully. Mar 17 17:30:17.817311 systemd[1]: run-netns-cni\x2dd90bcbc3\x2d4200\x2d3a53\x2db39c\x2dafe0b594ba26.mount: Deactivated successfully. 
Mar 17 17:30:17.903312 containerd[1721]: time="2025-03-17T17:30:17.902893314Z" level=error msg="Failed to destroy network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.907498 containerd[1721]: time="2025-03-17T17:30:17.907419870Z" level=error msg="encountered an error cleaning up failed sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.907642 containerd[1721]: time="2025-03-17T17:30:17.907552510Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.911539 kubelet[3397]: E0317 17:30:17.911460 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.912639 kubelet[3397]: E0317 17:30:17.911580 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:17.912639 kubelet[3397]: E0317 17:30:17.911607 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:17.912639 kubelet[3397]: E0317 17:30:17.911668 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b9c99b997-26lfs_calico-system(4354ea3a-d606-47b2-8073-479d0b804cd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b9c99b997-26lfs_calico-system(4354ea3a-d606-47b2-8073-479d0b804cd3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" podUID="4354ea3a-d606-47b2-8073-479d0b804cd3" Mar 17 17:30:17.984271 containerd[1721]: time="2025-03-17T17:30:17.983709881Z" level=error msg="Failed to destroy network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.984271 containerd[1721]: time="2025-03-17T17:30:17.984035161Z" level=error msg="encountered an error cleaning up failed sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.984271 containerd[1721]: time="2025-03-17T17:30:17.984088321Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.984271 containerd[1721]: time="2025-03-17T17:30:17.984197400Z" level=error msg="Failed to destroy network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.984509 containerd[1721]: time="2025-03-17T17:30:17.984454440Z" level=error msg="encountered an error cleaning up failed sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.984509 
containerd[1721]: time="2025-03-17T17:30:17.984499040Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.984739 kubelet[3397]: E0317 17:30:17.984705 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.984856 kubelet[3397]: E0317 17:30:17.984840 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:17.985400 kubelet[3397]: E0317 17:30:17.985011 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:17.985400 kubelet[3397]: 
E0317 17:30:17.985030 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:17.985400 kubelet[3397]: E0317 17:30:17.985082 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:17.985400 kubelet[3397]: E0317 17:30:17.985100 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:17.986921 kubelet[3397]: E0317 17:30:17.985136 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" podUID="f70e11b2-e153-4a1f-b987-6a5f11a3781f" Mar 17 17:30:17.986921 kubelet[3397]: E0317 17:30:17.985352 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-26p8t_kube-system(f1d22644-c9fd-4c49-8bcb-aa7ce67eb937)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-26p8t_kube-system(f1d22644-c9fd-4c49-8bcb-aa7ce67eb937)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-26p8t" podUID="f1d22644-c9fd-4c49-8bcb-aa7ce67eb937" Mar 17 17:30:18.004746 containerd[1721]: time="2025-03-17T17:30:18.004503902Z" level=error msg="Failed to destroy network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.006026 containerd[1721]: time="2025-03-17T17:30:18.005984621Z" level=error msg="encountered an error cleaning up failed sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.006308 containerd[1721]: time="2025-03-17T17:30:18.006276061Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.006927 kubelet[3397]: E0317 17:30:18.006888 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.007437 kubelet[3397]: E0317 17:30:18.007048 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:18.007437 kubelet[3397]: E0317 17:30:18.007075 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:18.007437 kubelet[3397]: E0317 17:30:18.007134 3397 pod_workers.go:1298] "Error syncing 
pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vjh47_calico-system(feb1e339-9a1a-480e-9e83-ea79ab0971ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vjh47_calico-system(feb1e339-9a1a-480e-9e83-ea79ab0971ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" Mar 17 17:30:18.022322 containerd[1721]: time="2025-03-17T17:30:18.022254966Z" level=error msg="Failed to destroy network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.023454 containerd[1721]: time="2025-03-17T17:30:18.022697926Z" level=error msg="encountered an error cleaning up failed sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.023454 containerd[1721]: time="2025-03-17T17:30:18.023353565Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.023625 kubelet[3397]: E0317 17:30:18.023583 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.023675 kubelet[3397]: E0317 17:30:18.023643 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:18.023675 kubelet[3397]: E0317 17:30:18.023665 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:18.023742 kubelet[3397]: E0317 17:30:18.023703 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-n7khj_calico-apiserver(e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-n7khj_calico-apiserver(e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a)\\\": rpc error: code = Unknown desc = 
failed to setup network for sandbox \\\"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" podUID="e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a" Mar 17 17:30:18.031189 containerd[1721]: time="2025-03-17T17:30:18.031129558Z" level=error msg="Failed to destroy network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.031493 containerd[1721]: time="2025-03-17T17:30:18.031461838Z" level=error msg="encountered an error cleaning up failed sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.031548 containerd[1721]: time="2025-03-17T17:30:18.031532478Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.032088 kubelet[3397]: E0317 17:30:18.031761 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:18.032088 kubelet[3397]: E0317 17:30:18.031820 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:18.032088 kubelet[3397]: E0317 17:30:18.031850 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:18.032810 kubelet[3397]: E0317 17:30:18.031900 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-7f6jb" 
podUID="bce6fc98-b3ee-43d8-88b1-252f92d0da22" Mar 17 17:30:18.552919 kubelet[3397]: I0317 17:30:18.552885 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc" Mar 17 17:30:18.555284 containerd[1721]: time="2025-03-17T17:30:18.553963247Z" level=info msg="StopPodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\"" Mar 17 17:30:18.555284 containerd[1721]: time="2025-03-17T17:30:18.554618527Z" level=info msg="Ensure that sandbox 6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc in task-service has been cleanup successfully" Mar 17 17:30:18.555284 containerd[1721]: time="2025-03-17T17:30:18.555200446Z" level=info msg="TearDown network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" successfully" Mar 17 17:30:18.555284 containerd[1721]: time="2025-03-17T17:30:18.555220206Z" level=info msg="StopPodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" returns successfully" Mar 17 17:30:18.555683 containerd[1721]: time="2025-03-17T17:30:18.555656966Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\"" Mar 17 17:30:18.555706 kubelet[3397]: I0317 17:30:18.554429 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b" Mar 17 17:30:18.555914 containerd[1721]: time="2025-03-17T17:30:18.555733126Z" level=info msg="TearDown network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" successfully" Mar 17 17:30:18.555914 containerd[1721]: time="2025-03-17T17:30:18.555751926Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" returns successfully" Mar 17 17:30:18.555914 containerd[1721]: time="2025-03-17T17:30:18.555798486Z" level=info 
msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\"" Mar 17 17:30:18.556001 containerd[1721]: time="2025-03-17T17:30:18.555920685Z" level=info msg="Ensure that sandbox bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b in task-service has been cleanup successfully" Mar 17 17:30:18.556501 containerd[1721]: time="2025-03-17T17:30:18.556458765Z" level=info msg="TearDown network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" successfully" Mar 17 17:30:18.556501 containerd[1721]: time="2025-03-17T17:30:18.556485805Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" returns successfully" Mar 17 17:30:18.557773 containerd[1721]: time="2025-03-17T17:30:18.557741724Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" Mar 17 17:30:18.557939 containerd[1721]: time="2025-03-17T17:30:18.557918324Z" level=info msg="TearDown network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" successfully" Mar 17 17:30:18.557977 containerd[1721]: time="2025-03-17T17:30:18.557936604Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" returns successfully" Mar 17 17:30:18.558710 containerd[1721]: time="2025-03-17T17:30:18.558682763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:30:18.558948 containerd[1721]: time="2025-03-17T17:30:18.558795203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:2,}" Mar 17 17:30:18.559059 kubelet[3397]: I0317 17:30:18.559014 3397 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5" Mar 17 17:30:18.560383 containerd[1721]: time="2025-03-17T17:30:18.559820802Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\"" Mar 17 17:30:18.560383 containerd[1721]: time="2025-03-17T17:30:18.560155082Z" level=info msg="Ensure that sandbox 6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5 in task-service has been cleanup successfully" Mar 17 17:30:18.561686 containerd[1721]: time="2025-03-17T17:30:18.560899841Z" level=info msg="TearDown network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" successfully" Mar 17 17:30:18.561686 containerd[1721]: time="2025-03-17T17:30:18.560926281Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" returns successfully" Mar 17 17:30:18.561686 containerd[1721]: time="2025-03-17T17:30:18.561276161Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" Mar 17 17:30:18.561686 containerd[1721]: time="2025-03-17T17:30:18.561356441Z" level=info msg="TearDown network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" successfully" Mar 17 17:30:18.561686 containerd[1721]: time="2025-03-17T17:30:18.561365801Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" returns successfully" Mar 17 17:30:18.562570 containerd[1721]: time="2025-03-17T17:30:18.562026200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:2,}" Mar 17 17:30:18.563792 kubelet[3397]: I0317 17:30:18.563764 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62" Mar 17 17:30:18.564399 containerd[1721]: 
time="2025-03-17T17:30:18.564291518Z" level=info msg="StopPodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\"" Mar 17 17:30:18.564475 containerd[1721]: time="2025-03-17T17:30:18.564453038Z" level=info msg="Ensure that sandbox 1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62 in task-service has been cleanup successfully" Mar 17 17:30:18.564949 containerd[1721]: time="2025-03-17T17:30:18.564802117Z" level=info msg="TearDown network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" successfully" Mar 17 17:30:18.564949 containerd[1721]: time="2025-03-17T17:30:18.564825397Z" level=info msg="StopPodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" returns successfully" Mar 17 17:30:18.567176 kubelet[3397]: I0317 17:30:18.566781 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507" Mar 17 17:30:18.567307 containerd[1721]: time="2025-03-17T17:30:18.567161275Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\"" Mar 17 17:30:18.567307 containerd[1721]: time="2025-03-17T17:30:18.567292035Z" level=info msg="TearDown network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" successfully" Mar 17 17:30:18.567307 containerd[1721]: time="2025-03-17T17:30:18.567303355Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" returns successfully" Mar 17 17:30:18.567833 containerd[1721]: time="2025-03-17T17:30:18.567466195Z" level=info msg="StopPodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\"" Mar 17 17:30:18.567833 containerd[1721]: time="2025-03-17T17:30:18.567606235Z" level=info msg="Ensure that sandbox 6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507 in task-service has been cleanup 
successfully" Mar 17 17:30:18.568112 containerd[1721]: time="2025-03-17T17:30:18.567976115Z" level=info msg="TearDown network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" successfully" Mar 17 17:30:18.568112 containerd[1721]: time="2025-03-17T17:30:18.568000915Z" level=info msg="StopPodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" returns successfully" Mar 17 17:30:18.569566 containerd[1721]: time="2025-03-17T17:30:18.568992194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:30:18.569770 kubelet[3397]: I0317 17:30:18.569719 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc" Mar 17 17:30:18.570868 containerd[1721]: time="2025-03-17T17:30:18.569665633Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\"" Mar 17 17:30:18.570868 containerd[1721]: time="2025-03-17T17:30:18.570817352Z" level=info msg="TearDown network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" successfully" Mar 17 17:30:18.570868 containerd[1721]: time="2025-03-17T17:30:18.570827392Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" returns successfully" Mar 17 17:30:18.571150 containerd[1721]: time="2025-03-17T17:30:18.571123912Z" level=info msg="StopPodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\"" Mar 17 17:30:18.571802 containerd[1721]: time="2025-03-17T17:30:18.571698231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:2,}" Mar 17 17:30:18.572088 containerd[1721]: time="2025-03-17T17:30:18.571754751Z" 
level=info msg="Ensure that sandbox 025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc in task-service has been cleanup successfully" Mar 17 17:30:18.572191 containerd[1721]: time="2025-03-17T17:30:18.572173311Z" level=info msg="TearDown network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" successfully" Mar 17 17:30:18.572423 containerd[1721]: time="2025-03-17T17:30:18.572277031Z" level=info msg="StopPodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" returns successfully" Mar 17 17:30:18.572680 containerd[1721]: time="2025-03-17T17:30:18.572651030Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\"" Mar 17 17:30:18.572761 containerd[1721]: time="2025-03-17T17:30:18.572739790Z" level=info msg="TearDown network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" successfully" Mar 17 17:30:18.572761 containerd[1721]: time="2025-03-17T17:30:18.572756190Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" returns successfully" Mar 17 17:30:18.573407 containerd[1721]: time="2025-03-17T17:30:18.573221150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:2,}" Mar 17 17:30:18.812151 systemd[1]: run-netns-cni\x2d2db73141\x2d8c61\x2db984\x2d2cff\x2d0bc4604da396.mount: Deactivated successfully. Mar 17 17:30:18.812276 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5-shm.mount: Deactivated successfully. Mar 17 17:30:18.812338 systemd[1]: run-netns-cni\x2da14307e2\x2d4206\x2d0acb\x2d11e5\x2d91036e740b37.mount: Deactivated successfully. 
Mar 17 17:30:18.812381 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc-shm.mount: Deactivated successfully. Mar 17 17:30:18.812430 systemd[1]: run-netns-cni\x2d1f277d95\x2d5bac\x2d9515\x2da656\x2db75c544d5221.mount: Deactivated successfully. Mar 17 17:30:18.812475 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b-shm.mount: Deactivated successfully. Mar 17 17:30:18.812520 systemd[1]: run-netns-cni\x2d958e6542\x2d1b94\x2df774\x2d5baa\x2d819698110b0c.mount: Deactivated successfully. Mar 17 17:30:18.812565 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507-shm.mount: Deactivated successfully. Mar 17 17:30:18.812612 systemd[1]: run-netns-cni\x2db3fb10ec\x2d98bd\x2dd1b0\x2d738e\x2d95e4de469343.mount: Deactivated successfully. Mar 17 17:30:18.812653 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc-shm.mount: Deactivated successfully. 
Mar 17 17:30:19.528835 containerd[1721]: time="2025-03-17T17:30:19.528693689Z" level=error msg="Failed to destroy network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.532377 containerd[1721]: time="2025-03-17T17:30:19.531698686Z" level=error msg="encountered an error cleaning up failed sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.533236 containerd[1721]: time="2025-03-17T17:30:19.532357086Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.534221 kubelet[3397]: E0317 17:30:19.533970 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.534221 kubelet[3397]: E0317 17:30:19.534023 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:19.534221 kubelet[3397]: E0317 17:30:19.534044 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:19.534644 kubelet[3397]: E0317 17:30:19.534080 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" podUID="f70e11b2-e153-4a1f-b987-6a5f11a3781f" Mar 17 17:30:19.586569 kubelet[3397]: I0317 17:30:19.585808 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995" Mar 17 17:30:19.590137 containerd[1721]: time="2025-03-17T17:30:19.588912035Z" 
level=info msg="StopPodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\"" Mar 17 17:30:19.590137 containerd[1721]: time="2025-03-17T17:30:19.589096315Z" level=info msg="Ensure that sandbox 3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995 in task-service has been cleanup successfully" Mar 17 17:30:19.597922 containerd[1721]: time="2025-03-17T17:30:19.597797227Z" level=info msg="TearDown network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" successfully" Mar 17 17:30:19.597922 containerd[1721]: time="2025-03-17T17:30:19.597830507Z" level=info msg="StopPodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" returns successfully" Mar 17 17:30:19.598866 containerd[1721]: time="2025-03-17T17:30:19.598692626Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\"" Mar 17 17:30:19.598866 containerd[1721]: time="2025-03-17T17:30:19.598787066Z" level=info msg="TearDown network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" successfully" Mar 17 17:30:19.598866 containerd[1721]: time="2025-03-17T17:30:19.598797226Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" returns successfully" Mar 17 17:30:19.599467 containerd[1721]: time="2025-03-17T17:30:19.599341146Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" Mar 17 17:30:19.599467 containerd[1721]: time="2025-03-17T17:30:19.599424185Z" level=info msg="TearDown network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" successfully" Mar 17 17:30:19.599467 containerd[1721]: time="2025-03-17T17:30:19.599433105Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" returns successfully" Mar 17 17:30:19.600658 containerd[1721]: 
time="2025-03-17T17:30:19.600388385Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:30:19.613977 containerd[1721]: time="2025-03-17T17:30:19.613931772Z" level=error msg="Failed to destroy network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.614603 containerd[1721]: time="2025-03-17T17:30:19.614569972Z" level=error msg="encountered an error cleaning up failed sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.615665 containerd[1721]: time="2025-03-17T17:30:19.615220291Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.616263 kubelet[3397]: E0317 17:30:19.616196 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.616687 kubelet[3397]: E0317 17:30:19.616383 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:19.616687 kubelet[3397]: E0317 17:30:19.616412 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:19.616687 kubelet[3397]: E0317 17:30:19.616464 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b9c99b997-26lfs_calico-system(4354ea3a-d606-47b2-8073-479d0b804cd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b9c99b997-26lfs_calico-system(4354ea3a-d606-47b2-8073-479d0b804cd3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" podUID="4354ea3a-d606-47b2-8073-479d0b804cd3" Mar 17 17:30:19.643667 containerd[1721]: 
time="2025-03-17T17:30:19.643620026Z" level=error msg="Failed to destroy network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.646598 containerd[1721]: time="2025-03-17T17:30:19.646465063Z" level=error msg="encountered an error cleaning up failed sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.646801 containerd[1721]: time="2025-03-17T17:30:19.646779823Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.648187 kubelet[3397]: E0317 17:30:19.648060 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.650380 kubelet[3397]: E0317 17:30:19.648120 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:19.650380 kubelet[3397]: E0317 17:30:19.648403 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:19.650380 kubelet[3397]: E0317 17:30:19.649345 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-7f6jb" podUID="bce6fc98-b3ee-43d8-88b1-252f92d0da22" Mar 17 17:30:19.656206 containerd[1721]: time="2025-03-17T17:30:19.656163254Z" level=error msg="Failed to destroy network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.657688 
containerd[1721]: time="2025-03-17T17:30:19.657627373Z" level=error msg="encountered an error cleaning up failed sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.657905 containerd[1721]: time="2025-03-17T17:30:19.657882613Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.658645 kubelet[3397]: E0317 17:30:19.658580 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.658845 kubelet[3397]: E0317 17:30:19.658826 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:19.659555 kubelet[3397]: E0317 17:30:19.659345 
3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:19.659811 kubelet[3397]: E0317 17:30:19.659683 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-n7khj_calico-apiserver(e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-n7khj_calico-apiserver(e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" podUID="e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a" Mar 17 17:30:19.665477 containerd[1721]: time="2025-03-17T17:30:19.665436006Z" level=error msg="Failed to destroy network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.667665 containerd[1721]: time="2025-03-17T17:30:19.667624404Z" level=error msg="encountered an error cleaning up failed sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.668180 containerd[1721]: time="2025-03-17T17:30:19.668064604Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.668633 kubelet[3397]: E0317 17:30:19.668601 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.668747 kubelet[3397]: E0317 17:30:19.668730 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:19.669173 kubelet[3397]: E0317 17:30:19.668833 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:19.669173 kubelet[3397]: E0317 17:30:19.668906 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-26p8t_kube-system(f1d22644-c9fd-4c49-8bcb-aa7ce67eb937)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-26p8t_kube-system(f1d22644-c9fd-4c49-8bcb-aa7ce67eb937)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-26p8t" podUID="f1d22644-c9fd-4c49-8bcb-aa7ce67eb937" Mar 17 17:30:19.671337 containerd[1721]: time="2025-03-17T17:30:19.671292041Z" level=error msg="Failed to destroy network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.672680 containerd[1721]: time="2025-03-17T17:30:19.672224000Z" level=error msg="encountered an error cleaning up failed sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.672680 containerd[1721]: time="2025-03-17T17:30:19.672591360Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:2,} failed, 
error" error="failed to setup network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.672988 kubelet[3397]: E0317 17:30:19.672883 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.672988 kubelet[3397]: E0317 17:30:19.672933 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:19.672988 kubelet[3397]: E0317 17:30:19.672957 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:19.673266 kubelet[3397]: E0317 17:30:19.673169 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vjh47_calico-system(feb1e339-9a1a-480e-9e83-ea79ab0971ae)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"csi-node-driver-vjh47_calico-system(feb1e339-9a1a-480e-9e83-ea79ab0971ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" Mar 17 17:30:19.741503 containerd[1721]: time="2025-03-17T17:30:19.741364258Z" level=error msg="Failed to destroy network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.742631 containerd[1721]: time="2025-03-17T17:30:19.742269697Z" level=error msg="encountered an error cleaning up failed sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.742631 containerd[1721]: time="2025-03-17T17:30:19.742361977Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.742845 kubelet[3397]: E0317 17:30:19.742607 3397 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:19.742845 kubelet[3397]: E0317 17:30:19.742660 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:19.742845 kubelet[3397]: E0317 17:30:19.742688 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:19.742932 kubelet[3397]: E0317 17:30:19.742728 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" podUID="f70e11b2-e153-4a1f-b987-6a5f11a3781f" Mar 17 17:30:19.815141 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b-shm.mount: Deactivated successfully. Mar 17 17:30:19.816100 systemd[1]: run-netns-cni\x2de9357c70\x2da019\x2d24f0\x2dc74d\x2d27d05a3dcb94.mount: Deactivated successfully. Mar 17 17:30:19.816166 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995-shm.mount: Deactivated successfully. Mar 17 17:30:20.591705 kubelet[3397]: I0317 17:30:20.591668 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745" Mar 17 17:30:20.592585 containerd[1721]: time="2025-03-17T17:30:20.592546931Z" level=info msg="StopPodSandbox for \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\"" Mar 17 17:30:20.594090 containerd[1721]: time="2025-03-17T17:30:20.592718050Z" level=info msg="Ensure that sandbox 7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745 in task-service has been cleanup successfully" Mar 17 17:30:20.595066 containerd[1721]: time="2025-03-17T17:30:20.594202889Z" level=info msg="TearDown network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" successfully" Mar 17 17:30:20.595066 containerd[1721]: time="2025-03-17T17:30:20.594879568Z" level=info msg="StopPodSandbox for \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" returns successfully" Mar 17 17:30:20.596657 containerd[1721]: time="2025-03-17T17:30:20.596457487Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\"" Mar 17 17:30:20.597494 
containerd[1721]: time="2025-03-17T17:30:20.596866127Z" level=info msg="TearDown network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" successfully" Mar 17 17:30:20.597494 containerd[1721]: time="2025-03-17T17:30:20.596980287Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" returns successfully" Mar 17 17:30:20.597494 containerd[1721]: time="2025-03-17T17:30:20.597415846Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" Mar 17 17:30:20.597494 containerd[1721]: time="2025-03-17T17:30:20.597492126Z" level=info msg="TearDown network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" successfully" Mar 17 17:30:20.597623 containerd[1721]: time="2025-03-17T17:30:20.597501566Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" returns successfully" Mar 17 17:30:20.599540 containerd[1721]: time="2025-03-17T17:30:20.598270565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:3,}" Mar 17 17:30:20.598674 systemd[1]: run-netns-cni\x2d0b64c998\x2de3b7\x2de097\x2d13b3\x2d9189a07bdee1.mount: Deactivated successfully. 
Mar 17 17:30:20.601291 kubelet[3397]: I0317 17:30:20.600607 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d" Mar 17 17:30:20.601382 containerd[1721]: time="2025-03-17T17:30:20.601100003Z" level=info msg="StopPodSandbox for \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\"" Mar 17 17:30:20.601782 containerd[1721]: time="2025-03-17T17:30:20.601647442Z" level=info msg="Ensure that sandbox 2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d in task-service has been cleanup successfully" Mar 17 17:30:20.604441 containerd[1721]: time="2025-03-17T17:30:20.604404520Z" level=info msg="TearDown network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\" successfully" Mar 17 17:30:20.604441 containerd[1721]: time="2025-03-17T17:30:20.604431760Z" level=info msg="StopPodSandbox for \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\" returns successfully" Mar 17 17:30:20.608027 containerd[1721]: time="2025-03-17T17:30:20.605185879Z" level=info msg="StopPodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\"" Mar 17 17:30:20.608027 containerd[1721]: time="2025-03-17T17:30:20.605617839Z" level=info msg="TearDown network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" successfully" Mar 17 17:30:20.608027 containerd[1721]: time="2025-03-17T17:30:20.605632919Z" level=info msg="StopPodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" returns successfully" Mar 17 17:30:20.608027 containerd[1721]: time="2025-03-17T17:30:20.605855159Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\"" Mar 17 17:30:20.608027 containerd[1721]: time="2025-03-17T17:30:20.606542558Z" level=info msg="TearDown network for sandbox 
\"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" successfully" Mar 17 17:30:20.608027 containerd[1721]: time="2025-03-17T17:30:20.606561158Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" returns successfully" Mar 17 17:30:20.606951 systemd[1]: run-netns-cni\x2dac2a7be4\x2d70da\x2d6db4\x2d5b2f\x2d1fd546d7ea28.mount: Deactivated successfully. Mar 17 17:30:20.608457 kubelet[3397]: I0317 17:30:20.606877 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef" Mar 17 17:30:20.608495 containerd[1721]: time="2025-03-17T17:30:20.608432556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:30:20.609506 containerd[1721]: time="2025-03-17T17:30:20.609016236Z" level=info msg="StopPodSandbox for \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\"" Mar 17 17:30:20.609506 containerd[1721]: time="2025-03-17T17:30:20.609172836Z" level=info msg="Ensure that sandbox de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef in task-service has been cleanup successfully" Mar 17 17:30:20.614343 containerd[1721]: time="2025-03-17T17:30:20.614263671Z" level=info msg="TearDown network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\" successfully" Mar 17 17:30:20.614343 containerd[1721]: time="2025-03-17T17:30:20.614292311Z" level=info msg="StopPodSandbox for \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\" returns successfully" Mar 17 17:30:20.615686 containerd[1721]: time="2025-03-17T17:30:20.614577791Z" level=info msg="StopPodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\"" Mar 17 17:30:20.615686 containerd[1721]: time="2025-03-17T17:30:20.614650631Z" level=info 
msg="TearDown network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" successfully" Mar 17 17:30:20.615686 containerd[1721]: time="2025-03-17T17:30:20.614659431Z" level=info msg="StopPodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" returns successfully" Mar 17 17:30:20.615759 kubelet[3397]: I0317 17:30:20.615196 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217" Mar 17 17:30:20.619498 containerd[1721]: time="2025-03-17T17:30:20.618483667Z" level=info msg="StopPodSandbox for \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\"" Mar 17 17:30:20.619498 containerd[1721]: time="2025-03-17T17:30:20.618648667Z" level=info msg="Ensure that sandbox 85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217 in task-service has been cleanup successfully" Mar 17 17:30:20.619498 containerd[1721]: time="2025-03-17T17:30:20.619292786Z" level=info msg="TearDown network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\" successfully" Mar 17 17:30:20.619498 containerd[1721]: time="2025-03-17T17:30:20.619314466Z" level=info msg="StopPodSandbox for \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\" returns successfully" Mar 17 17:30:20.619922 systemd[1]: run-netns-cni\x2d914ad63e\x2d3b1d\x2d93ae\x2df348\x2dda43dc058f18.mount: Deactivated successfully. 
Mar 17 17:30:20.620589 containerd[1721]: time="2025-03-17T17:30:20.620528825Z" level=info msg="StopPodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\"" Mar 17 17:30:20.620949 containerd[1721]: time="2025-03-17T17:30:20.620923425Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\"" Mar 17 17:30:20.621527 containerd[1721]: time="2025-03-17T17:30:20.621006065Z" level=info msg="TearDown network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" successfully" Mar 17 17:30:20.621527 containerd[1721]: time="2025-03-17T17:30:20.621020465Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" returns successfully" Mar 17 17:30:20.621527 containerd[1721]: time="2025-03-17T17:30:20.621088305Z" level=info msg="TearDown network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" successfully" Mar 17 17:30:20.621527 containerd[1721]: time="2025-03-17T17:30:20.621103665Z" level=info msg="StopPodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" returns successfully" Mar 17 17:30:20.624579 containerd[1721]: time="2025-03-17T17:30:20.622719543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:3,}" Mar 17 17:30:20.624579 containerd[1721]: time="2025-03-17T17:30:20.622777423Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\"" Mar 17 17:30:20.624579 containerd[1721]: time="2025-03-17T17:30:20.623705982Z" level=info msg="TearDown network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" successfully" Mar 17 17:30:20.624579 containerd[1721]: time="2025-03-17T17:30:20.623823502Z" level=info msg="StopPodSandbox for 
\"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" returns successfully" Mar 17 17:30:20.625997 containerd[1721]: time="2025-03-17T17:30:20.625168421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:3,}" Mar 17 17:30:20.626547 kubelet[3397]: I0317 17:30:20.626125 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b" Mar 17 17:30:20.628389 systemd[1]: run-netns-cni\x2d55d4fe82\x2deff2\x2d604c\x2da395\x2de14b7d938ae4.mount: Deactivated successfully. Mar 17 17:30:20.630988 containerd[1721]: time="2025-03-17T17:30:20.630765496Z" level=info msg="StopPodSandbox for \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\"" Mar 17 17:30:20.631736 containerd[1721]: time="2025-03-17T17:30:20.631699815Z" level=info msg="Ensure that sandbox 6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b in task-service has been cleanup successfully" Mar 17 17:30:20.632469 containerd[1721]: time="2025-03-17T17:30:20.632382575Z" level=info msg="TearDown network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\" successfully" Mar 17 17:30:20.632469 containerd[1721]: time="2025-03-17T17:30:20.632407335Z" level=info msg="StopPodSandbox for \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\" returns successfully" Mar 17 17:30:20.633952 containerd[1721]: time="2025-03-17T17:30:20.633915213Z" level=info msg="StopPodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\"" Mar 17 17:30:20.634040 containerd[1721]: time="2025-03-17T17:30:20.634007733Z" level=info msg="TearDown network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" successfully" Mar 17 17:30:20.634040 containerd[1721]: time="2025-03-17T17:30:20.634021133Z" level=info 
msg="StopPodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" returns successfully" Mar 17 17:30:20.635056 containerd[1721]: time="2025-03-17T17:30:20.634962732Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\"" Mar 17 17:30:20.635128 containerd[1721]: time="2025-03-17T17:30:20.635083092Z" level=info msg="TearDown network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" successfully" Mar 17 17:30:20.635128 containerd[1721]: time="2025-03-17T17:30:20.635094372Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" returns successfully" Mar 17 17:30:20.636943 containerd[1721]: time="2025-03-17T17:30:20.636826611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:3,}" Mar 17 17:30:20.638665 kubelet[3397]: I0317 17:30:20.638632 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b" Mar 17 17:30:20.640770 containerd[1721]: time="2025-03-17T17:30:20.640609767Z" level=info msg="StopPodSandbox for \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\"" Mar 17 17:30:20.641133 containerd[1721]: time="2025-03-17T17:30:20.641096047Z" level=info msg="Ensure that sandbox 3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b in task-service has been cleanup successfully" Mar 17 17:30:20.643621 containerd[1721]: time="2025-03-17T17:30:20.643489245Z" level=info msg="TearDown network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\" successfully" Mar 17 17:30:20.643621 containerd[1721]: time="2025-03-17T17:30:20.643519605Z" level=info msg="StopPodSandbox for \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\" returns 
successfully" Mar 17 17:30:20.644304 containerd[1721]: time="2025-03-17T17:30:20.644068244Z" level=info msg="StopPodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\"" Mar 17 17:30:20.644304 containerd[1721]: time="2025-03-17T17:30:20.644160844Z" level=info msg="TearDown network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" successfully" Mar 17 17:30:20.644304 containerd[1721]: time="2025-03-17T17:30:20.644171324Z" level=info msg="StopPodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" returns successfully" Mar 17 17:30:20.644873 containerd[1721]: time="2025-03-17T17:30:20.644625004Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\"" Mar 17 17:30:20.644873 containerd[1721]: time="2025-03-17T17:30:20.644802523Z" level=info msg="TearDown network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" successfully" Mar 17 17:30:20.644873 containerd[1721]: time="2025-03-17T17:30:20.644815243Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" returns successfully" Mar 17 17:30:20.645866 containerd[1721]: time="2025-03-17T17:30:20.645443603Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" Mar 17 17:30:20.645866 containerd[1721]: time="2025-03-17T17:30:20.645552243Z" level=info msg="TearDown network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" successfully" Mar 17 17:30:20.646189 containerd[1721]: time="2025-03-17T17:30:20.646006802Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" returns successfully" Mar 17 17:30:20.647175 containerd[1721]: time="2025-03-17T17:30:20.646867162Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:30:20.815495 systemd[1]: run-netns-cni\x2dfc8c01cf\x2d3f00\x2d12d9\x2d1c31\x2d051ff91bd664.mount: Deactivated successfully. Mar 17 17:30:20.815586 systemd[1]: run-netns-cni\x2d61c34464\x2d96b9\x2d197f\x2dfe06\x2daae23af351c7.mount: Deactivated successfully. Mar 17 17:30:20.903178 containerd[1721]: time="2025-03-17T17:30:20.903033610Z" level=error msg="Failed to destroy network for sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:20.907442 containerd[1721]: time="2025-03-17T17:30:20.905221249Z" level=error msg="encountered an error cleaning up failed sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:20.907442 containerd[1721]: time="2025-03-17T17:30:20.905332288Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:20.907622 kubelet[3397]: E0317 17:30:20.905657 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:20.907622 kubelet[3397]: E0317 17:30:20.905703 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:20.907622 kubelet[3397]: E0317 17:30:20.905722 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:20.907713 kubelet[3397]: E0317 17:30:20.905767 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-7f6jb" 
podUID="bce6fc98-b3ee-43d8-88b1-252f92d0da22" Mar 17 17:30:20.909714 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80-shm.mount: Deactivated successfully. Mar 17 17:30:21.648090 kubelet[3397]: I0317 17:30:21.647685 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80" Mar 17 17:30:21.648908 containerd[1721]: time="2025-03-17T17:30:21.648560898Z" level=info msg="StopPodSandbox for \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\"" Mar 17 17:30:21.648908 containerd[1721]: time="2025-03-17T17:30:21.648776578Z" level=info msg="Ensure that sandbox 2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80 in task-service has been cleanup successfully" Mar 17 17:30:21.652122 containerd[1721]: time="2025-03-17T17:30:21.651913815Z" level=info msg="TearDown network for sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\" successfully" Mar 17 17:30:21.652122 containerd[1721]: time="2025-03-17T17:30:21.651944775Z" level=info msg="StopPodSandbox for \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\" returns successfully" Mar 17 17:30:21.655026 containerd[1721]: time="2025-03-17T17:30:21.654986292Z" level=info msg="StopPodSandbox for \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\"" Mar 17 17:30:21.655124 containerd[1721]: time="2025-03-17T17:30:21.655080852Z" level=info msg="TearDown network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" successfully" Mar 17 17:30:21.655124 containerd[1721]: time="2025-03-17T17:30:21.655091132Z" level=info msg="StopPodSandbox for \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" returns successfully" Mar 17 17:30:21.656065 containerd[1721]: time="2025-03-17T17:30:21.655908411Z" level=info msg="StopPodSandbox for 
\"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\"" Mar 17 17:30:21.656065 containerd[1721]: time="2025-03-17T17:30:21.655996211Z" level=info msg="TearDown network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" successfully" Mar 17 17:30:21.656065 containerd[1721]: time="2025-03-17T17:30:21.656006251Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" returns successfully" Mar 17 17:30:21.658740 containerd[1721]: time="2025-03-17T17:30:21.658711969Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" Mar 17 17:30:21.659991 containerd[1721]: time="2025-03-17T17:30:21.659886528Z" level=info msg="TearDown network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" successfully" Mar 17 17:30:21.659991 containerd[1721]: time="2025-03-17T17:30:21.659911528Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" returns successfully" Mar 17 17:30:21.660859 containerd[1721]: time="2025-03-17T17:30:21.660796567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:4,}" Mar 17 17:30:21.708820 containerd[1721]: time="2025-03-17T17:30:21.708629404Z" level=error msg="Failed to destroy network for sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.711349 containerd[1721]: time="2025-03-17T17:30:21.711099841Z" level=error msg="encountered an error cleaning up failed sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.712186 containerd[1721]: time="2025-03-17T17:30:21.711973801Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.713318 kubelet[3397]: E0317 17:30:21.712848 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.713318 kubelet[3397]: E0317 17:30:21.712903 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:21.713318 kubelet[3397]: E0317 17:30:21.712922 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-26p8t" Mar 17 17:30:21.713505 kubelet[3397]: E0317 17:30:21.712967 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-26p8t_kube-system(f1d22644-c9fd-4c49-8bcb-aa7ce67eb937)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-26p8t_kube-system(f1d22644-c9fd-4c49-8bcb-aa7ce67eb937)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-26p8t" podUID="f1d22644-c9fd-4c49-8bcb-aa7ce67eb937" Mar 17 17:30:21.737711 containerd[1721]: time="2025-03-17T17:30:21.737648058Z" level=error msg="Failed to destroy network for sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.741879 containerd[1721]: time="2025-03-17T17:30:21.741765134Z" level=error msg="encountered an error cleaning up failed sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.741879 containerd[1721]: time="2025-03-17T17:30:21.741855814Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.750902 kubelet[3397]: E0317 17:30:21.750547 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.750902 kubelet[3397]: E0317 17:30:21.750607 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:21.750902 kubelet[3397]: E0317 17:30:21.750627 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vjh47" Mar 17 17:30:21.751081 kubelet[3397]: E0317 17:30:21.750662 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-vjh47_calico-system(feb1e339-9a1a-480e-9e83-ea79ab0971ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vjh47_calico-system(feb1e339-9a1a-480e-9e83-ea79ab0971ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vjh47" podUID="feb1e339-9a1a-480e-9e83-ea79ab0971ae" Mar 17 17:30:21.760587 containerd[1721]: time="2025-03-17T17:30:21.760449077Z" level=error msg="Failed to destroy network for sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.761661 containerd[1721]: time="2025-03-17T17:30:21.761612556Z" level=error msg="encountered an error cleaning up failed sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.762269 containerd[1721]: time="2025-03-17T17:30:21.762216355Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.763004 kubelet[3397]: E0317 17:30:21.762960 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.763099 kubelet[3397]: E0317 17:30:21.763066 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:21.763099 kubelet[3397]: E0317 17:30:21.763092 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" Mar 17 17:30:21.763359 containerd[1721]: time="2025-03-17T17:30:21.762877155Z" level=error msg="Failed to destroy network for sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.764219 kubelet[3397]: E0317 17:30:21.764166 3397 pod_workers.go:1298] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-n7khj_calico-apiserver(e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-n7khj_calico-apiserver(e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" podUID="e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a" Mar 17 17:30:21.765169 containerd[1721]: time="2025-03-17T17:30:21.764475593Z" level=error msg="encountered an error cleaning up failed sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.767683 containerd[1721]: time="2025-03-17T17:30:21.765639192Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.767944 kubelet[3397]: E0317 17:30:21.767900 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.768074 kubelet[3397]: E0317 17:30:21.768057 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:21.768163 kubelet[3397]: E0317 17:30:21.768148 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" Mar 17 17:30:21.768457 kubelet[3397]: E0317 17:30:21.768297 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b9c99b997-26lfs_calico-system(4354ea3a-d606-47b2-8073-479d0b804cd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b9c99b997-26lfs_calico-system(4354ea3a-d606-47b2-8073-479d0b804cd3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" podUID="4354ea3a-d606-47b2-8073-479d0b804cd3" Mar 17 17:30:21.770625 containerd[1721]: time="2025-03-17T17:30:21.770587508Z" level=error msg="Failed to destroy network for sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.778061 containerd[1721]: time="2025-03-17T17:30:21.778003061Z" level=error msg="encountered an error cleaning up failed sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.779025 containerd[1721]: time="2025-03-17T17:30:21.778462701Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.780152 kubelet[3397]: E0317 17:30:21.779274 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.780152 
kubelet[3397]: E0317 17:30:21.779330 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:21.780152 kubelet[3397]: E0317 17:30:21.779350 3397 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" Mar 17 17:30:21.780321 kubelet[3397]: E0317 17:30:21.779396 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-678db5fc46-9s6qs_calico-apiserver(f70e11b2-e153-4a1f-b987-6a5f11a3781f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" podUID="f70e11b2-e153-4a1f-b987-6a5f11a3781f" Mar 17 17:30:21.814687 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e-shm.mount: 
Deactivated successfully. Mar 17 17:30:21.814805 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f-shm.mount: Deactivated successfully. Mar 17 17:30:21.814882 systemd[1]: run-netns-cni\x2d8cb88a26\x2daa46\x2d574b\x2d37ab\x2d7b3df13d538c.mount: Deactivated successfully. Mar 17 17:30:21.814948 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1442027427.mount: Deactivated successfully. Mar 17 17:30:21.838017 containerd[1721]: time="2025-03-17T17:30:21.837970087Z" level=error msg="Failed to destroy network for sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.840177 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744-shm.mount: Deactivated successfully. 
Mar 17 17:30:21.841018 containerd[1721]: time="2025-03-17T17:30:21.840511205Z" level=error msg="encountered an error cleaning up failed sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.841018 containerd[1721]: time="2025-03-17T17:30:21.840587365Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.841085 kubelet[3397]: E0317 17:30:21.840805 3397 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:21.841085 kubelet[3397]: E0317 17:30:21.840855 3397 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:21.841085 kubelet[3397]: E0317 17:30:21.840876 3397 
kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7f6jb" Mar 17 17:30:21.841177 kubelet[3397]: E0317 17:30:21.840912 3397 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-7f6jb_kube-system(bce6fc98-b3ee-43d8-88b1-252f92d0da22)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-7f6jb" podUID="bce6fc98-b3ee-43d8-88b1-252f92d0da22" Mar 17 17:30:21.908343 containerd[1721]: time="2025-03-17T17:30:21.907621264Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:21.911110 containerd[1721]: time="2025-03-17T17:30:21.911057381Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 17 17:30:21.918214 containerd[1721]: time="2025-03-17T17:30:21.918168655Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:21.923462 containerd[1721]: time="2025-03-17T17:30:21.923392690Z" level=info msg="ImageCreate 
event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:21.924270 containerd[1721]: time="2025-03-17T17:30:21.923985329Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 5.389030103s" Mar 17 17:30:21.924270 containerd[1721]: time="2025-03-17T17:30:21.924023089Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 17 17:30:21.934388 containerd[1721]: time="2025-03-17T17:30:21.933382681Z" level=info msg="CreateContainer within sandbox \"ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 17:30:22.001688 containerd[1721]: time="2025-03-17T17:30:22.001596659Z" level=info msg="CreateContainer within sandbox \"ce869e18967f2888ac53864b8c7b19dd32d646c40c9071e6c9acd750187f0578\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"dd9595dd9dae201b8ee835ca4f1020543733b79a926f2c649d931154b61c82bd\"" Mar 17 17:30:22.003151 containerd[1721]: time="2025-03-17T17:30:22.002367619Z" level=info msg="StartContainer for \"dd9595dd9dae201b8ee835ca4f1020543733b79a926f2c649d931154b61c82bd\"" Mar 17 17:30:22.027620 systemd[1]: Started cri-containerd-dd9595dd9dae201b8ee835ca4f1020543733b79a926f2c649d931154b61c82bd.scope - libcontainer container dd9595dd9dae201b8ee835ca4f1020543733b79a926f2c649d931154b61c82bd. 
Mar 17 17:30:22.055947 waagent[1929]: 2025-03-17T17:30:22.055851Z INFO ExtHandler Mar 17 17:30:22.056283 waagent[1929]: 2025-03-17T17:30:22.056034Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: f4f2b5e2-833c-4ccd-8145-49009b3e8064 eTag: 2184146270338667053 source: Fabric] Mar 17 17:30:22.057150 waagent[1929]: 2025-03-17T17:30:22.056649Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Mar 17 17:30:22.064747 containerd[1721]: time="2025-03-17T17:30:22.064692802Z" level=info msg="StartContainer for \"dd9595dd9dae201b8ee835ca4f1020543733b79a926f2c649d931154b61c82bd\" returns successfully" Mar 17 17:30:22.309554 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 17:30:22.310223 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 17 17:30:22.652747 kubelet[3397]: I0317 17:30:22.652636 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde" Mar 17 17:30:22.653522 containerd[1721]: time="2025-03-17T17:30:22.653424831Z" level=info msg="StopPodSandbox for \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\"" Mar 17 17:30:22.654139 containerd[1721]: time="2025-03-17T17:30:22.653955631Z" level=info msg="Ensure that sandbox aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde in task-service has been cleanup successfully" Mar 17 17:30:22.654139 containerd[1721]: time="2025-03-17T17:30:22.654243031Z" level=info msg="TearDown network for sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\" successfully" Mar 17 17:30:22.654139 containerd[1721]: time="2025-03-17T17:30:22.654269591Z" level=info msg="StopPodSandbox for \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\" returns successfully" Mar 17 17:30:22.655481 containerd[1721]: time="2025-03-17T17:30:22.654659990Z" level=info msg="StopPodSandbox 
for \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\"" Mar 17 17:30:22.655481 containerd[1721]: time="2025-03-17T17:30:22.654740230Z" level=info msg="TearDown network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\" successfully" Mar 17 17:30:22.655481 containerd[1721]: time="2025-03-17T17:30:22.654750030Z" level=info msg="StopPodSandbox for \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\" returns successfully" Mar 17 17:30:22.655481 containerd[1721]: time="2025-03-17T17:30:22.655017830Z" level=info msg="StopPodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\"" Mar 17 17:30:22.655481 containerd[1721]: time="2025-03-17T17:30:22.655102350Z" level=info msg="TearDown network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" successfully" Mar 17 17:30:22.655481 containerd[1721]: time="2025-03-17T17:30:22.655111750Z" level=info msg="StopPodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" returns successfully" Mar 17 17:30:22.655481 containerd[1721]: time="2025-03-17T17:30:22.655419390Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\"" Mar 17 17:30:22.655628 containerd[1721]: time="2025-03-17T17:30:22.655489389Z" level=info msg="TearDown network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" successfully" Mar 17 17:30:22.655628 containerd[1721]: time="2025-03-17T17:30:22.655498789Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" returns successfully" Mar 17 17:30:22.655816 containerd[1721]: time="2025-03-17T17:30:22.655722949Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" Mar 17 17:30:22.656901 containerd[1721]: time="2025-03-17T17:30:22.655798749Z" level=info msg="TearDown network for sandbox 
\"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" successfully" Mar 17 17:30:22.656901 containerd[1721]: time="2025-03-17T17:30:22.655948869Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" returns successfully" Mar 17 17:30:22.658349 containerd[1721]: time="2025-03-17T17:30:22.657017108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:5,}" Mar 17 17:30:22.661765 kubelet[3397]: I0317 17:30:22.661733 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e" Mar 17 17:30:22.662690 containerd[1721]: time="2025-03-17T17:30:22.662660983Z" level=info msg="StopPodSandbox for \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\"" Mar 17 17:30:22.663958 containerd[1721]: time="2025-03-17T17:30:22.663856462Z" level=info msg="Ensure that sandbox 5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e in task-service has been cleanup successfully" Mar 17 17:30:22.664411 containerd[1721]: time="2025-03-17T17:30:22.664216102Z" level=info msg="TearDown network for sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\" successfully" Mar 17 17:30:22.664411 containerd[1721]: time="2025-03-17T17:30:22.664259062Z" level=info msg="StopPodSandbox for \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\" returns successfully" Mar 17 17:30:22.665671 containerd[1721]: time="2025-03-17T17:30:22.665507860Z" level=info msg="StopPodSandbox for \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\"" Mar 17 17:30:22.665671 containerd[1721]: time="2025-03-17T17:30:22.665583260Z" level=info msg="TearDown network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\" successfully" Mar 17 17:30:22.665671 
containerd[1721]: time="2025-03-17T17:30:22.665593020Z" level=info msg="StopPodSandbox for \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\" returns successfully" Mar 17 17:30:22.666128 containerd[1721]: time="2025-03-17T17:30:22.666006540Z" level=info msg="StopPodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\"" Mar 17 17:30:22.666128 containerd[1721]: time="2025-03-17T17:30:22.666086660Z" level=info msg="TearDown network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" successfully" Mar 17 17:30:22.666128 containerd[1721]: time="2025-03-17T17:30:22.666097060Z" level=info msg="StopPodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" returns successfully" Mar 17 17:30:22.666821 containerd[1721]: time="2025-03-17T17:30:22.666546859Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\"" Mar 17 17:30:22.666821 containerd[1721]: time="2025-03-17T17:30:22.666620819Z" level=info msg="TearDown network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" successfully" Mar 17 17:30:22.666821 containerd[1721]: time="2025-03-17T17:30:22.666629659Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" returns successfully" Mar 17 17:30:22.668096 kubelet[3397]: I0317 17:30:22.667446 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f" Mar 17 17:30:22.668180 containerd[1721]: time="2025-03-17T17:30:22.667706498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:30:22.668564 containerd[1721]: time="2025-03-17T17:30:22.668465458Z" level=info msg="StopPodSandbox for 
\"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\"" Mar 17 17:30:22.669255 containerd[1721]: time="2025-03-17T17:30:22.669195777Z" level=info msg="Ensure that sandbox 1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f in task-service has been cleanup successfully" Mar 17 17:30:22.670558 containerd[1721]: time="2025-03-17T17:30:22.670481336Z" level=info msg="TearDown network for sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\" successfully" Mar 17 17:30:22.670558 containerd[1721]: time="2025-03-17T17:30:22.670508936Z" level=info msg="StopPodSandbox for \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\" returns successfully" Mar 17 17:30:22.671421 containerd[1721]: time="2025-03-17T17:30:22.671166775Z" level=info msg="StopPodSandbox for \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\"" Mar 17 17:30:22.671421 containerd[1721]: time="2025-03-17T17:30:22.671312295Z" level=info msg="TearDown network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\" successfully" Mar 17 17:30:22.671421 containerd[1721]: time="2025-03-17T17:30:22.671386695Z" level=info msg="StopPodSandbox for \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\" returns successfully" Mar 17 17:30:22.673887 containerd[1721]: time="2025-03-17T17:30:22.673832133Z" level=info msg="StopPodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\"" Mar 17 17:30:22.673985 containerd[1721]: time="2025-03-17T17:30:22.673975733Z" level=info msg="TearDown network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" successfully" Mar 17 17:30:22.674014 containerd[1721]: time="2025-03-17T17:30:22.673988413Z" level=info msg="StopPodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" returns successfully" Mar 17 17:30:22.675440 containerd[1721]: time="2025-03-17T17:30:22.675411171Z" level=info 
msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\"" Mar 17 17:30:22.675501 containerd[1721]: time="2025-03-17T17:30:22.675493731Z" level=info msg="TearDown network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" successfully" Mar 17 17:30:22.675543 containerd[1721]: time="2025-03-17T17:30:22.675503171Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" returns successfully" Mar 17 17:30:22.676680 containerd[1721]: time="2025-03-17T17:30:22.676633770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:4,}" Mar 17 17:30:22.678825 containerd[1721]: time="2025-03-17T17:30:22.678791048Z" level=info msg="StopPodSandbox for \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\"" Mar 17 17:30:22.678962 containerd[1721]: time="2025-03-17T17:30:22.678937968Z" level=info msg="Ensure that sandbox 579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5 in task-service has been cleanup successfully" Mar 17 17:30:22.679103 kubelet[3397]: I0317 17:30:22.677878 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5" Mar 17 17:30:22.679963 containerd[1721]: time="2025-03-17T17:30:22.679933447Z" level=info msg="TearDown network for sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\" successfully" Mar 17 17:30:22.680672 containerd[1721]: time="2025-03-17T17:30:22.679959927Z" level=info msg="StopPodSandbox for \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\" returns successfully" Mar 17 17:30:22.681124 containerd[1721]: time="2025-03-17T17:30:22.681099286Z" level=info msg="StopPodSandbox for \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\"" Mar 17 17:30:22.681230 
containerd[1721]: time="2025-03-17T17:30:22.681183006Z" level=info msg="TearDown network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\" successfully" Mar 17 17:30:22.681264 containerd[1721]: time="2025-03-17T17:30:22.681224086Z" level=info msg="StopPodSandbox for \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\" returns successfully" Mar 17 17:30:22.681787 containerd[1721]: time="2025-03-17T17:30:22.681762086Z" level=info msg="StopPodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\"" Mar 17 17:30:22.681855 containerd[1721]: time="2025-03-17T17:30:22.681834206Z" level=info msg="TearDown network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" successfully" Mar 17 17:30:22.681855 containerd[1721]: time="2025-03-17T17:30:22.681849206Z" level=info msg="StopPodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" returns successfully" Mar 17 17:30:22.684394 containerd[1721]: time="2025-03-17T17:30:22.684002044Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\"" Mar 17 17:30:22.684394 containerd[1721]: time="2025-03-17T17:30:22.684125284Z" level=info msg="TearDown network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" successfully" Mar 17 17:30:22.684394 containerd[1721]: time="2025-03-17T17:30:22.684136204Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" returns successfully" Mar 17 17:30:22.684523 kubelet[3397]: I0317 17:30:22.684192 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744" Mar 17 17:30:22.685676 containerd[1721]: time="2025-03-17T17:30:22.685645642Z" level=info msg="StopPodSandbox for \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\"" Mar 17 
17:30:22.685896 containerd[1721]: time="2025-03-17T17:30:22.685817722Z" level=info msg="Ensure that sandbox 1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744 in task-service has been cleanup successfully" Mar 17 17:30:22.687446 containerd[1721]: time="2025-03-17T17:30:22.687383961Z" level=info msg="TearDown network for sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\" successfully" Mar 17 17:30:22.687446 containerd[1721]: time="2025-03-17T17:30:22.687426041Z" level=info msg="StopPodSandbox for \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\" returns successfully" Mar 17 17:30:22.687774 containerd[1721]: time="2025-03-17T17:30:22.687571361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:4,}" Mar 17 17:30:22.689564 containerd[1721]: time="2025-03-17T17:30:22.689015159Z" level=info msg="StopPodSandbox for \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\"" Mar 17 17:30:22.689564 containerd[1721]: time="2025-03-17T17:30:22.689090799Z" level=info msg="TearDown network for sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\" successfully" Mar 17 17:30:22.689564 containerd[1721]: time="2025-03-17T17:30:22.689099719Z" level=info msg="StopPodSandbox for \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\" returns successfully" Mar 17 17:30:22.690291 containerd[1721]: time="2025-03-17T17:30:22.690072598Z" level=info msg="StopPodSandbox for \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\"" Mar 17 17:30:22.690291 containerd[1721]: time="2025-03-17T17:30:22.690156518Z" level=info msg="TearDown network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" successfully" Mar 17 17:30:22.690291 containerd[1721]: time="2025-03-17T17:30:22.690166758Z" level=info msg="StopPodSandbox for 
\"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" returns successfully" Mar 17 17:30:22.693116 containerd[1721]: time="2025-03-17T17:30:22.690697798Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\"" Mar 17 17:30:22.693116 containerd[1721]: time="2025-03-17T17:30:22.690792038Z" level=info msg="TearDown network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" successfully" Mar 17 17:30:22.693116 containerd[1721]: time="2025-03-17T17:30:22.690801998Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" returns successfully" Mar 17 17:30:22.693116 containerd[1721]: time="2025-03-17T17:30:22.691899957Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" Mar 17 17:30:22.693116 containerd[1721]: time="2025-03-17T17:30:22.691973597Z" level=info msg="TearDown network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" successfully" Mar 17 17:30:22.693116 containerd[1721]: time="2025-03-17T17:30:22.691982157Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" returns successfully" Mar 17 17:30:22.693710 containerd[1721]: time="2025-03-17T17:30:22.693536715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:5,}" Mar 17 17:30:22.694019 kubelet[3397]: I0317 17:30:22.693849 3397 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18" Mar 17 17:30:22.694019 kubelet[3397]: I0317 17:30:22.693646 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-v4t8h" podStartSLOduration=1.683040823 podStartE2EDuration="17.693627795s" 
podCreationTimestamp="2025-03-17 17:30:05 +0000 UTC" firstStartedPulling="2025-03-17 17:30:05.914553916 +0000 UTC m=+24.606560887" lastFinishedPulling="2025-03-17 17:30:21.925140848 +0000 UTC m=+40.617147859" observedRunningTime="2025-03-17 17:30:22.690060998 +0000 UTC m=+41.382068009" watchObservedRunningTime="2025-03-17 17:30:22.693627795 +0000 UTC m=+41.385634846" Mar 17 17:30:22.695538 containerd[1721]: time="2025-03-17T17:30:22.695488633Z" level=info msg="StopPodSandbox for \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\"" Mar 17 17:30:22.695675 containerd[1721]: time="2025-03-17T17:30:22.695652193Z" level=info msg="Ensure that sandbox 9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18 in task-service has been cleanup successfully" Mar 17 17:30:22.696492 containerd[1721]: time="2025-03-17T17:30:22.696460952Z" level=info msg="TearDown network for sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\" successfully" Mar 17 17:30:22.696492 containerd[1721]: time="2025-03-17T17:30:22.696483632Z" level=info msg="StopPodSandbox for \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\" returns successfully" Mar 17 17:30:22.696853 containerd[1721]: time="2025-03-17T17:30:22.696821312Z" level=info msg="StopPodSandbox for \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\"" Mar 17 17:30:22.697068 containerd[1721]: time="2025-03-17T17:30:22.696935432Z" level=info msg="TearDown network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\" successfully" Mar 17 17:30:22.697068 containerd[1721]: time="2025-03-17T17:30:22.696955032Z" level=info msg="StopPodSandbox for \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\" returns successfully" Mar 17 17:30:22.697687 containerd[1721]: time="2025-03-17T17:30:22.697411032Z" level=info msg="StopPodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\"" Mar 17 17:30:22.697687 
containerd[1721]: time="2025-03-17T17:30:22.697626191Z" level=info msg="TearDown network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" successfully" Mar 17 17:30:22.697687 containerd[1721]: time="2025-03-17T17:30:22.697638431Z" level=info msg="StopPodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" returns successfully" Mar 17 17:30:22.699516 containerd[1721]: time="2025-03-17T17:30:22.697959911Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\"" Mar 17 17:30:22.699516 containerd[1721]: time="2025-03-17T17:30:22.698030071Z" level=info msg="TearDown network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" successfully" Mar 17 17:30:22.699516 containerd[1721]: time="2025-03-17T17:30:22.698040391Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" returns successfully" Mar 17 17:30:22.699516 containerd[1721]: time="2025-03-17T17:30:22.698942030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:4,}" Mar 17 17:30:22.814779 systemd[1]: run-netns-cni\x2d4e9ecb0c\x2d83b1\x2d9558\x2dacbe\x2d72f77f28c465.mount: Deactivated successfully. Mar 17 17:30:22.814867 systemd[1]: run-netns-cni\x2dfca154f0\x2da287\x2d870f\x2dc496\x2d0769209110af.mount: Deactivated successfully. Mar 17 17:30:22.814913 systemd[1]: run-netns-cni\x2d7a634744\x2ddf79\x2d9d9f\x2d6f7e\x2d99241fe0801f.mount: Deactivated successfully. Mar 17 17:30:22.814956 systemd[1]: run-netns-cni\x2de915dcd7\x2df480\x2dd7de\x2dca6e\x2d86aaf9914773.mount: Deactivated successfully. Mar 17 17:30:22.814998 systemd[1]: run-netns-cni\x2d791331ae\x2dff88\x2d4033\x2d97d1\x2dc5d1dcb3ad63.mount: Deactivated successfully. 
Mar 17 17:30:22.815043 systemd[1]: run-netns-cni\x2d73e48be4\x2dd2ae\x2dec8b\x2dab5b\x2d078636130fab.mount: Deactivated successfully. Mar 17 17:30:23.221624 systemd-networkd[1445]: cali1066f6b9340: Link UP Mar 17 17:30:23.221772 systemd-networkd[1445]: cali1066f6b9340: Gained carrier Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:22.904 [INFO][5017] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:22.923 [INFO][5017] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0 calico-apiserver-678db5fc46- calico-apiserver f70e11b2-e153-4a1f-b987-6a5f11a3781f 687 0 2025-03-17 17:30:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:678db5fc46 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4152.2.2-a-6c46d54d7c calico-apiserver-678db5fc46-9s6qs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1066f6b9340 [] []}} ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-9s6qs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:22.924 [INFO][5017] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-9s6qs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.037 [INFO][5062] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 
IPv6=0 ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" HandleID="k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.088 [INFO][5062] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" HandleID="k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000103a60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4152.2.2-a-6c46d54d7c", "pod":"calico-apiserver-678db5fc46-9s6qs", "timestamp":"2025-03-17 17:30:23.037535965 +0000 UTC"}, Hostname:"ci-4152.2.2-a-6c46d54d7c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.088 [INFO][5062] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.088 [INFO][5062] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.088 [INFO][5062] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-6c46d54d7c' Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.096 [INFO][5062] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.109 [INFO][5062] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.118 [INFO][5062] ipam/ipam.go 489: Trying affinity for 192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.122 [INFO][5062] ipam/ipam.go 155: Attempting to load block cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.134 [INFO][5062] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.136 [INFO][5062] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.104.0/26 handle="k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.143 [INFO][5062] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.164 [INFO][5062] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.104.0/26 handle="k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.180 [INFO][5062] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.104.1/26] block=192.168.104.0/26 handle="k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.180 [INFO][5062] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.104.1/26] handle="k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.181 [INFO][5062] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:30:23.255443 containerd[1721]: 2025-03-17 17:30:23.181 [INFO][5062] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.1/26] IPv6=[] ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" HandleID="k8s-pod-network.2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" Mar 17 17:30:23.256122 containerd[1721]: 2025-03-17 17:30:23.189 [INFO][5017] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-9s6qs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0", GenerateName:"calico-apiserver-678db5fc46-", Namespace:"calico-apiserver", SelfLink:"", UID:"f70e11b2-e153-4a1f-b987-6a5f11a3781f", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"678db5fc46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"", Pod:"calico-apiserver-678db5fc46-9s6qs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.104.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1066f6b9340", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.256122 containerd[1721]: 2025-03-17 17:30:23.189 [INFO][5017] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.104.1/32] ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-9s6qs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" Mar 17 17:30:23.256122 containerd[1721]: 2025-03-17 17:30:23.190 [INFO][5017] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1066f6b9340 ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-9s6qs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" Mar 17 17:30:23.256122 containerd[1721]: 2025-03-17 17:30:23.220 [INFO][5017] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-9s6qs" 
WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" Mar 17 17:30:23.256122 containerd[1721]: 2025-03-17 17:30:23.222 [INFO][5017] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-9s6qs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0", GenerateName:"calico-apiserver-678db5fc46-", Namespace:"calico-apiserver", SelfLink:"", UID:"f70e11b2-e153-4a1f-b987-6a5f11a3781f", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"678db5fc46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b", Pod:"calico-apiserver-678db5fc46-9s6qs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.104.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1066f6b9340", MAC:"d6:53:f5:be:0f:c2", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.256122 containerd[1721]: 2025-03-17 17:30:23.249 [INFO][5017] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-9s6qs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--9s6qs-eth0" Mar 17 17:30:23.279151 systemd-networkd[1445]: cali3a5f4b423c4: Link UP Mar 17 17:30:23.284915 systemd-networkd[1445]: cali3a5f4b423c4: Gained carrier Mar 17 17:30:23.302277 containerd[1721]: time="2025-03-17T17:30:23.301713326Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:23.302277 containerd[1721]: time="2025-03-17T17:30:23.301864486Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:23.302277 containerd[1721]: time="2025-03-17T17:30:23.301913606Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.302866 containerd[1721]: time="2025-03-17T17:30:23.302508086Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:22.975 [INFO][5039] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:22.996 [INFO][5039] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0 coredns-7db6d8ff4d- kube-system f1d22644-c9fd-4c49-8bcb-aa7ce67eb937 686 0 2025-03-17 17:29:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4152.2.2-a-6c46d54d7c coredns-7db6d8ff4d-26p8t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3a5f4b423c4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Namespace="kube-system" Pod="coredns-7db6d8ff4d-26p8t" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:22.996 [INFO][5039] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Namespace="kube-system" Pod="coredns-7db6d8ff4d-26p8t" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.148 [INFO][5089] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" HandleID="k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.181 [INFO][5089] ipam/ipam_plugin.go 265: Auto 
assigning IP ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" HandleID="k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d1e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4152.2.2-a-6c46d54d7c", "pod":"coredns-7db6d8ff4d-26p8t", "timestamp":"2025-03-17 17:30:23.148638905 +0000 UTC"}, Hostname:"ci-4152.2.2-a-6c46d54d7c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.181 [INFO][5089] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.181 [INFO][5089] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.182 [INFO][5089] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-6c46d54d7c' Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.186 [INFO][5089] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.197 [INFO][5089] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.219 [INFO][5089] ipam/ipam.go 489: Trying affinity for 192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.227 [INFO][5089] ipam/ipam.go 155: Attempting to load block cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.234 [INFO][5089] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.234 [INFO][5089] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.104.0/26 handle="k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.238 [INFO][5089] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378 Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.252 [INFO][5089] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.104.0/26 handle="k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.267 [INFO][5089] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.104.2/26] block=192.168.104.0/26 handle="k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.267 [INFO][5089] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.104.2/26] handle="k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.267 [INFO][5089] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:30:23.311493 containerd[1721]: 2025-03-17 17:30:23.268 [INFO][5089] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.2/26] IPv6=[] ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" HandleID="k8s-pod-network.ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" Mar 17 17:30:23.312026 containerd[1721]: 2025-03-17 17:30:23.273 [INFO][5039] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Namespace="kube-system" Pod="coredns-7db6d8ff4d-26p8t" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f1d22644-c9fd-4c49-8bcb-aa7ce67eb937", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 29, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"", Pod:"coredns-7db6d8ff4d-26p8t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.104.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3a5f4b423c4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.312026 containerd[1721]: 2025-03-17 17:30:23.273 [INFO][5039] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.104.2/32] ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Namespace="kube-system" Pod="coredns-7db6d8ff4d-26p8t" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" Mar 17 17:30:23.312026 containerd[1721]: 2025-03-17 17:30:23.273 [INFO][5039] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3a5f4b423c4 ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Namespace="kube-system" Pod="coredns-7db6d8ff4d-26p8t" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" Mar 17 17:30:23.312026 containerd[1721]: 2025-03-17 17:30:23.284 [INFO][5039] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Namespace="kube-system" Pod="coredns-7db6d8ff4d-26p8t" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" Mar 17 17:30:23.312026 containerd[1721]: 2025-03-17 17:30:23.285 [INFO][5039] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Namespace="kube-system" Pod="coredns-7db6d8ff4d-26p8t" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f1d22644-c9fd-4c49-8bcb-aa7ce67eb937", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 29, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378", Pod:"coredns-7db6d8ff4d-26p8t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.104.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3a5f4b423c4", MAC:"9a:53:69:9e:18:a2", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.312026 containerd[1721]: 2025-03-17 17:30:23.306 [INFO][5039] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378" Namespace="kube-system" Pod="coredns-7db6d8ff4d-26p8t" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--26p8t-eth0" Mar 17 17:30:23.332797 systemd[1]: Started cri-containerd-2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b.scope - libcontainer container 2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b. Mar 17 17:30:23.345450 systemd-networkd[1445]: cali01ddba2567c: Link UP Mar 17 17:30:23.345650 systemd-networkd[1445]: cali01ddba2567c: Gained carrier Mar 17 17:30:23.359672 containerd[1721]: time="2025-03-17T17:30:23.359421794Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:23.359672 containerd[1721]: time="2025-03-17T17:30:23.359563234Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:23.359672 containerd[1721]: time="2025-03-17T17:30:23.359579674Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.360682 containerd[1721]: time="2025-03-17T17:30:23.360507433Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:22.953 [INFO][5027] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:22.997 [INFO][5027] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0 calico-apiserver-678db5fc46- calico-apiserver e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a 685 0 2025-03-17 17:30:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:678db5fc46 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4152.2.2-a-6c46d54d7c calico-apiserver-678db5fc46-n7khj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali01ddba2567c [] []}} ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-n7khj" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:22.999 [INFO][5027] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-n7khj" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.128 [INFO][5098] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" HandleID="k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" 
Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.163 [INFO][5098] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" HandleID="k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000301aa0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4152.2.2-a-6c46d54d7c", "pod":"calico-apiserver-678db5fc46-n7khj", "timestamp":"2025-03-17 17:30:23.128035923 +0000 UTC"}, Hostname:"ci-4152.2.2-a-6c46d54d7c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.163 [INFO][5098] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.267 [INFO][5098] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.267 [INFO][5098] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-6c46d54d7c' Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.275 [INFO][5098] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.282 [INFO][5098] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.306 [INFO][5098] ipam/ipam.go 489: Trying affinity for 192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.309 [INFO][5098] ipam/ipam.go 155: Attempting to load block cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.313 [INFO][5098] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.313 [INFO][5098] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.104.0/26 handle="k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.314 [INFO][5098] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2 Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.322 [INFO][5098] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.104.0/26 handle="k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.333 [INFO][5098] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.104.3/26] block=192.168.104.0/26 handle="k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.333 [INFO][5098] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.104.3/26] handle="k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.333 [INFO][5098] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:30:23.376719 containerd[1721]: 2025-03-17 17:30:23.333 [INFO][5098] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.3/26] IPv6=[] ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" HandleID="k8s-pod-network.80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" Mar 17 17:30:23.377897 containerd[1721]: 2025-03-17 17:30:23.337 [INFO][5027] cni-plugin/k8s.go 386: Populated endpoint ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-n7khj" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0", GenerateName:"calico-apiserver-678db5fc46-", Namespace:"calico-apiserver", SelfLink:"", UID:"e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"678db5fc46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"", Pod:"calico-apiserver-678db5fc46-n7khj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.104.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali01ddba2567c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.377897 containerd[1721]: 2025-03-17 17:30:23.337 [INFO][5027] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.104.3/32] ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-n7khj" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" Mar 17 17:30:23.377897 containerd[1721]: 2025-03-17 17:30:23.338 [INFO][5027] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali01ddba2567c ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-n7khj" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" Mar 17 17:30:23.377897 containerd[1721]: 2025-03-17 17:30:23.344 [INFO][5027] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-n7khj" 
WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" Mar 17 17:30:23.377897 containerd[1721]: 2025-03-17 17:30:23.347 [INFO][5027] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-n7khj" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0", GenerateName:"calico-apiserver-678db5fc46-", Namespace:"calico-apiserver", SelfLink:"", UID:"e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"678db5fc46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2", Pod:"calico-apiserver-678db5fc46-n7khj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.104.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali01ddba2567c", MAC:"b6:a9:01:56:69:91", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.377897 containerd[1721]: 2025-03-17 17:30:23.372 [INFO][5027] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2" Namespace="calico-apiserver" Pod="calico-apiserver-678db5fc46-n7khj" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--apiserver--678db5fc46--n7khj-eth0" Mar 17 17:30:23.386563 systemd[1]: Started cri-containerd-ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378.scope - libcontainer container ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378. Mar 17 17:30:23.417718 containerd[1721]: time="2025-03-17T17:30:23.417634982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-9s6qs,Uid:f70e11b2-e153-4a1f-b987-6a5f11a3781f,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b\"" Mar 17 17:30:23.428738 containerd[1721]: time="2025-03-17T17:30:23.427973213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:30:23.446694 systemd-networkd[1445]: cali22682a95c44: Link UP Mar 17 17:30:23.446833 systemd-networkd[1445]: cali22682a95c44: Gained carrier Mar 17 17:30:23.457899 containerd[1721]: time="2025-03-17T17:30:23.457842906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-26p8t,Uid:f1d22644-c9fd-4c49-8bcb-aa7ce67eb937,Namespace:kube-system,Attempt:4,} returns sandbox id \"ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378\"" Mar 17 17:30:23.464736 containerd[1721]: time="2025-03-17T17:30:23.462661061Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:23.464736 containerd[1721]: time="2025-03-17T17:30:23.462981581Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:23.464736 containerd[1721]: time="2025-03-17T17:30:23.463001741Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.465456 containerd[1721]: time="2025-03-17T17:30:23.464991019Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.466774 containerd[1721]: time="2025-03-17T17:30:23.466368858Z" level=info msg="CreateContainer within sandbox \"ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.033 [INFO][5068] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.055 [INFO][5068] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0 coredns-7db6d8ff4d- kube-system bce6fc98-b3ee-43d8-88b1-252f92d0da22 679 0 2025-03-17 17:29:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4152.2.2-a-6c46d54d7c coredns-7db6d8ff4d-7f6jb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali22682a95c44 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7f6jb" 
WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.057 [INFO][5068] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7f6jb" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.193 [INFO][5111] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" HandleID="k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.212 [INFO][5111] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" HandleID="k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003a1150), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4152.2.2-a-6c46d54d7c", "pod":"coredns-7db6d8ff4d-7f6jb", "timestamp":"2025-03-17 17:30:23.193576504 +0000 UTC"}, Hostname:"ci-4152.2.2-a-6c46d54d7c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.212 [INFO][5111] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.334 [INFO][5111] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.335 [INFO][5111] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-6c46d54d7c' Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.341 [INFO][5111] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.357 [INFO][5111] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.374 [INFO][5111] ipam/ipam.go 489: Trying affinity for 192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.381 [INFO][5111] ipam/ipam.go 155: Attempting to load block cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.389 [INFO][5111] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.390 [INFO][5111] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.104.0/26 handle="k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.400 [INFO][5111] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2 Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.408 [INFO][5111] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.104.0/26 handle="k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.425 [INFO][5111] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.104.4/26] block=192.168.104.0/26 handle="k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.425 [INFO][5111] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.104.4/26] handle="k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.428 [INFO][5111] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:30:23.481664 containerd[1721]: 2025-03-17 17:30:23.429 [INFO][5111] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.4/26] IPv6=[] ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" HandleID="k8s-pod-network.7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" Mar 17 17:30:23.482172 containerd[1721]: 2025-03-17 17:30:23.435 [INFO][5068] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7f6jb" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"bce6fc98-b3ee-43d8-88b1-252f92d0da22", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 29, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"", Pod:"coredns-7db6d8ff4d-7f6jb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.104.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali22682a95c44", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.482172 containerd[1721]: 2025-03-17 17:30:23.436 [INFO][5068] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.104.4/32] ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7f6jb" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" Mar 17 17:30:23.482172 containerd[1721]: 2025-03-17 17:30:23.436 [INFO][5068] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali22682a95c44 ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7f6jb" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" Mar 17 17:30:23.482172 containerd[1721]: 2025-03-17 17:30:23.449 [INFO][5068] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7f6jb" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" Mar 17 17:30:23.482172 containerd[1721]: 2025-03-17 17:30:23.450 [INFO][5068] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7f6jb" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"bce6fc98-b3ee-43d8-88b1-252f92d0da22", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 29, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2", Pod:"coredns-7db6d8ff4d-7f6jb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.104.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali22682a95c44", MAC:"3e:77:08:44:59:b5", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.482172 containerd[1721]: 2025-03-17 17:30:23.474 [INFO][5068] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7f6jb" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-coredns--7db6d8ff4d--7f6jb-eth0" Mar 17 17:30:23.495654 systemd[1]: Started cri-containerd-80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2.scope - libcontainer container 80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2. Mar 17 17:30:23.523783 containerd[1721]: time="2025-03-17T17:30:23.523593446Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:23.526439 containerd[1721]: time="2025-03-17T17:30:23.526047804Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:23.526439 containerd[1721]: time="2025-03-17T17:30:23.526167684Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.528197 containerd[1721]: time="2025-03-17T17:30:23.527916402Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.548590 containerd[1721]: time="2025-03-17T17:30:23.547311425Z" level=info msg="CreateContainer within sandbox \"ad5252eaadfb7ca079ae379eb3ad2748ad3c1cab017b440093bb9cdf336f2378\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"78ad85fabd3b64a50ecfd46301dfc3cd1623d91a012569e2e8d3a6c9e5d0a0a9\"" Mar 17 17:30:23.552586 systemd[1]: Started cri-containerd-7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2.scope - libcontainer container 7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2. Mar 17 17:30:23.555573 containerd[1721]: time="2025-03-17T17:30:23.551865261Z" level=info msg="StartContainer for \"78ad85fabd3b64a50ecfd46301dfc3cd1623d91a012569e2e8d3a6c9e5d0a0a9\"" Mar 17 17:30:23.568602 containerd[1721]: time="2025-03-17T17:30:23.567982966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-678db5fc46-n7khj,Uid:e6a80aad-f4cd-48fc-a1af-2b0ba4e8f34a,Namespace:calico-apiserver,Attempt:4,} returns sandbox id \"80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2\"" Mar 17 17:30:23.578740 systemd-networkd[1445]: cali90874372c73: Link UP Mar 17 17:30:23.579181 systemd-networkd[1445]: cali90874372c73: Gained carrier Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.020 [INFO][5051] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.059 [INFO][5051] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0 csi-node-driver- calico-system feb1e339-9a1a-480e-9e83-ea79ab0971ae 601 0 2025-03-17 17:30:05 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4152.2.2-a-6c46d54d7c csi-node-driver-vjh47 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali90874372c73 [] []}} ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Namespace="calico-system" Pod="csi-node-driver-vjh47" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.059 [INFO][5051] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Namespace="calico-system" Pod="csi-node-driver-vjh47" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.205 [INFO][5117] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" HandleID="k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.239 [INFO][5117] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" HandleID="k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400028dbd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4152.2.2-a-6c46d54d7c", "pod":"csi-node-driver-vjh47", "timestamp":"2025-03-17 17:30:23.205895893 +0000 UTC"}, Hostname:"ci-4152.2.2-a-6c46d54d7c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.240 [INFO][5117] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.429 [INFO][5117] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.429 [INFO][5117] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-6c46d54d7c' Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.433 [INFO][5117] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.450 [INFO][5117] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.471 [INFO][5117] ipam/ipam.go 489: Trying affinity for 192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.477 [INFO][5117] ipam/ipam.go 155: Attempting to load block cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.489 [INFO][5117] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.489 [INFO][5117] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.104.0/26 handle="k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.493 [INFO][5117] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb Mar 17 17:30:23.627062 
containerd[1721]: 2025-03-17 17:30:23.512 [INFO][5117] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.104.0/26 handle="k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.528 [INFO][5117] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.104.5/26] block=192.168.104.0/26 handle="k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.528 [INFO][5117] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.104.5/26] handle="k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.528 [INFO][5117] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:30:23.627062 containerd[1721]: 2025-03-17 17:30:23.529 [INFO][5117] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.5/26] IPv6=[] ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" HandleID="k8s-pod-network.34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" Mar 17 17:30:23.627605 containerd[1721]: 2025-03-17 17:30:23.549 [INFO][5051] cni-plugin/k8s.go 386: Populated endpoint ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Namespace="calico-system" Pod="csi-node-driver-vjh47" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", 
UID:"feb1e339-9a1a-480e-9e83-ea79ab0971ae", ResourceVersion:"601", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"", Pod:"csi-node-driver-vjh47", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.104.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali90874372c73", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.627605 containerd[1721]: 2025-03-17 17:30:23.553 [INFO][5051] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.104.5/32] ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Namespace="calico-system" Pod="csi-node-driver-vjh47" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" Mar 17 17:30:23.627605 containerd[1721]: 2025-03-17 17:30:23.554 [INFO][5051] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali90874372c73 ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Namespace="calico-system" Pod="csi-node-driver-vjh47" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" Mar 17 17:30:23.627605 containerd[1721]: 2025-03-17 17:30:23.583 
[INFO][5051] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Namespace="calico-system" Pod="csi-node-driver-vjh47" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" Mar 17 17:30:23.627605 containerd[1721]: 2025-03-17 17:30:23.586 [INFO][5051] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Namespace="calico-system" Pod="csi-node-driver-vjh47" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"feb1e339-9a1a-480e-9e83-ea79ab0971ae", ResourceVersion:"601", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb", Pod:"csi-node-driver-vjh47", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.104.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali90874372c73", MAC:"1a:56:50:29:dd:85", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.627605 containerd[1721]: 2025-03-17 17:30:23.623 [INFO][5051] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb" Namespace="calico-system" Pod="csi-node-driver-vjh47" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-csi--node--driver--vjh47-eth0" Mar 17 17:30:23.631913 systemd[1]: Started cri-containerd-78ad85fabd3b64a50ecfd46301dfc3cd1623d91a012569e2e8d3a6c9e5d0a0a9.scope - libcontainer container 78ad85fabd3b64a50ecfd46301dfc3cd1623d91a012569e2e8d3a6c9e5d0a0a9. Mar 17 17:30:23.655276 containerd[1721]: time="2025-03-17T17:30:23.655217088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7f6jb,Uid:bce6fc98-b3ee-43d8-88b1-252f92d0da22,Namespace:kube-system,Attempt:5,} returns sandbox id \"7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2\"" Mar 17 17:30:23.665479 containerd[1721]: time="2025-03-17T17:30:23.665426278Z" level=info msg="CreateContainer within sandbox \"7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:30:23.686486 containerd[1721]: time="2025-03-17T17:30:23.686343819Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:23.686486 containerd[1721]: time="2025-03-17T17:30:23.686428339Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:23.686486 containerd[1721]: time="2025-03-17T17:30:23.686440259Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.687080 containerd[1721]: time="2025-03-17T17:30:23.686683059Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.741260 systemd-networkd[1445]: cali4ce30177311: Link UP Mar 17 17:30:23.746370 systemd-networkd[1445]: cali4ce30177311: Gained carrier Mar 17 17:30:23.764863 systemd[1]: Started cri-containerd-34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb.scope - libcontainer container 34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb. Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.119 [INFO][5073] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.169 [INFO][5073] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0 calico-kube-controllers-7b9c99b997- calico-system 4354ea3a-d606-47b2-8073-479d0b804cd3 684 0 2025-03-17 17:30:05 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7b9c99b997 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4152.2.2-a-6c46d54d7c calico-kube-controllers-7b9c99b997-26lfs eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4ce30177311 [] []}} ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Namespace="calico-system" Pod="calico-kube-controllers-7b9c99b997-26lfs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.169 [INFO][5073] cni-plugin/k8s.go 77: Extracted identifiers for 
CmdAddK8s ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Namespace="calico-system" Pod="calico-kube-controllers-7b9c99b997-26lfs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.257 [INFO][5129] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" HandleID="k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.277 [INFO][5129] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" HandleID="k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000330390), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4152.2.2-a-6c46d54d7c", "pod":"calico-kube-controllers-7b9c99b997-26lfs", "timestamp":"2025-03-17 17:30:23.257442606 +0000 UTC"}, Hostname:"ci-4152.2.2-a-6c46d54d7c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.277 [INFO][5129] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.528 [INFO][5129] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.531 [INFO][5129] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152.2.2-a-6c46d54d7c' Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.541 [INFO][5129] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.562 [INFO][5129] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.580 [INFO][5129] ipam/ipam.go 489: Trying affinity for 192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.596 [INFO][5129] ipam/ipam.go 155: Attempting to load block cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.602 [INFO][5129] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.104.0/26 host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.603 [INFO][5129] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.104.0/26 handle="k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.624 [INFO][5129] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.663 [INFO][5129] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.104.0/26 handle="k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.718 [INFO][5129] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.104.6/26] block=192.168.104.0/26 handle="k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.721 [INFO][5129] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.104.6/26] handle="k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" host="ci-4152.2.2-a-6c46d54d7c" Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.722 [INFO][5129] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:30:23.785006 containerd[1721]: 2025-03-17 17:30:23.722 [INFO][5129] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.6/26] IPv6=[] ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" HandleID="k8s-pod-network.6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Workload="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" Mar 17 17:30:23.804946 containerd[1721]: 2025-03-17 17:30:23.733 [INFO][5073] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Namespace="calico-system" Pod="calico-kube-controllers-7b9c99b997-26lfs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0", GenerateName:"calico-kube-controllers-7b9c99b997-", Namespace:"calico-system", SelfLink:"", UID:"4354ea3a-d606-47b2-8073-479d0b804cd3", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b9c99b997", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"", Pod:"calico-kube-controllers-7b9c99b997-26lfs", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.104.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4ce30177311", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.804946 containerd[1721]: 2025-03-17 17:30:23.734 [INFO][5073] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.104.6/32] ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Namespace="calico-system" Pod="calico-kube-controllers-7b9c99b997-26lfs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" Mar 17 17:30:23.804946 containerd[1721]: 2025-03-17 17:30:23.734 [INFO][5073] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4ce30177311 ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Namespace="calico-system" Pod="calico-kube-controllers-7b9c99b997-26lfs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" Mar 17 17:30:23.804946 containerd[1721]: 2025-03-17 17:30:23.750 [INFO][5073] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Namespace="calico-system" Pod="calico-kube-controllers-7b9c99b997-26lfs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" Mar 17 17:30:23.804946 containerd[1721]: 2025-03-17 17:30:23.752 [INFO][5073] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Namespace="calico-system" Pod="calico-kube-controllers-7b9c99b997-26lfs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0", GenerateName:"calico-kube-controllers-7b9c99b997-", Namespace:"calico-system", SelfLink:"", UID:"4354ea3a-d606-47b2-8073-479d0b804cd3", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b9c99b997", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152.2.2-a-6c46d54d7c", ContainerID:"6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c", Pod:"calico-kube-controllers-7b9c99b997-26lfs", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.104.6/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4ce30177311", MAC:"3e:ee:a4:8d:25:4b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:23.804946 containerd[1721]: 2025-03-17 17:30:23.769 [INFO][5073] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c" Namespace="calico-system" Pod="calico-kube-controllers-7b9c99b997-26lfs" WorkloadEndpoint="ci--4152.2.2--a--6c46d54d7c-k8s-calico--kube--controllers--7b9c99b997--26lfs-eth0" Mar 17 17:30:23.841897 containerd[1721]: time="2025-03-17T17:30:23.841774159Z" level=info msg="StartContainer for \"78ad85fabd3b64a50ecfd46301dfc3cd1623d91a012569e2e8d3a6c9e5d0a0a9\" returns successfully" Mar 17 17:30:23.883593 containerd[1721]: time="2025-03-17T17:30:23.883480522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vjh47,Uid:feb1e339-9a1a-480e-9e83-ea79ab0971ae,Namespace:calico-system,Attempt:4,} returns sandbox id \"34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb\"" Mar 17 17:30:23.913543 containerd[1721]: time="2025-03-17T17:30:23.913420615Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:23.913543 containerd[1721]: time="2025-03-17T17:30:23.913495655Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:23.913543 containerd[1721]: time="2025-03-17T17:30:23.913512414Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.913884 containerd[1721]: time="2025-03-17T17:30:23.913822214Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:23.917312 containerd[1721]: time="2025-03-17T17:30:23.917263771Z" level=info msg="CreateContainer within sandbox \"7f6e657d4a5a526ce9151caf1fc471fbb3966187e6bfb66a0824d556fdc371b2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"69f31edae05f9ae502c0883a9e0fc1e9cdcf114369589392cc7e3b5285d5cee0\"" Mar 17 17:30:23.918821 containerd[1721]: time="2025-03-17T17:30:23.917962650Z" level=info msg="StartContainer for \"69f31edae05f9ae502c0883a9e0fc1e9cdcf114369589392cc7e3b5285d5cee0\"" Mar 17 17:30:23.979050 systemd[1]: Started cri-containerd-6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c.scope - libcontainer container 6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c. Mar 17 17:30:23.987517 systemd[1]: Started cri-containerd-69f31edae05f9ae502c0883a9e0fc1e9cdcf114369589392cc7e3b5285d5cee0.scope - libcontainer container 69f31edae05f9ae502c0883a9e0fc1e9cdcf114369589392cc7e3b5285d5cee0. 
Mar 17 17:30:24.117025 containerd[1721]: time="2025-03-17T17:30:24.116969031Z" level=info msg="StartContainer for \"69f31edae05f9ae502c0883a9e0fc1e9cdcf114369589392cc7e3b5285d5cee0\" returns successfully" Mar 17 17:30:24.150523 containerd[1721]: time="2025-03-17T17:30:24.150462321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9c99b997-26lfs,Uid:4354ea3a-d606-47b2-8073-479d0b804cd3,Namespace:calico-system,Attempt:4,} returns sandbox id \"6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c\"" Mar 17 17:30:24.369327 kernel: bpftool[5647]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 17 17:30:24.492205 systemd-networkd[1445]: cali3a5f4b423c4: Gained IPv6LL Mar 17 17:30:24.563696 systemd-networkd[1445]: vxlan.calico: Link UP Mar 17 17:30:24.563703 systemd-networkd[1445]: vxlan.calico: Gained carrier Mar 17 17:30:24.683401 systemd-networkd[1445]: cali22682a95c44: Gained IPv6LL Mar 17 17:30:24.683896 systemd-networkd[1445]: cali1066f6b9340: Gained IPv6LL Mar 17 17:30:24.747367 systemd-networkd[1445]: cali90874372c73: Gained IPv6LL Mar 17 17:30:24.761367 kubelet[3397]: I0317 17:30:24.760868 3397 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:24.829686 kubelet[3397]: I0317 17:30:24.829610 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-26p8t" podStartSLOduration=29.829589908 podStartE2EDuration="29.829589908s" podCreationTimestamp="2025-03-17 17:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:30:24.802366653 +0000 UTC m=+43.494373664" watchObservedRunningTime="2025-03-17 17:30:24.829589908 +0000 UTC m=+43.521596919" Mar 17 17:30:25.067408 systemd-networkd[1445]: cali01ddba2567c: Gained IPv6LL Mar 17 17:30:25.486033 kubelet[3397]: I0317 17:30:25.485960 3397 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-7f6jb" podStartSLOduration=30.485941276 podStartE2EDuration="30.485941276s" podCreationTimestamp="2025-03-17 17:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:30:24.829999668 +0000 UTC m=+43.522006679" watchObservedRunningTime="2025-03-17 17:30:25.485941276 +0000 UTC m=+44.177948287" Mar 17 17:30:25.515481 systemd-networkd[1445]: cali4ce30177311: Gained IPv6LL Mar 17 17:30:26.140630 containerd[1721]: time="2025-03-17T17:30:26.140571005Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:26.148423 containerd[1721]: time="2025-03-17T17:30:26.148349798Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 17 17:30:26.153276 containerd[1721]: time="2025-03-17T17:30:26.153189354Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:26.162194 containerd[1721]: time="2025-03-17T17:30:26.161346827Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:26.162194 containerd[1721]: time="2025-03-17T17:30:26.162073586Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 2.734057134s" Mar 17 17:30:26.162194 containerd[1721]: 
time="2025-03-17T17:30:26.162101506Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 17 17:30:26.164320 containerd[1721]: time="2025-03-17T17:30:26.164286544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:30:26.166255 containerd[1721]: time="2025-03-17T17:30:26.166190422Z" level=info msg="CreateContainer within sandbox \"2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 17:30:26.221030 containerd[1721]: time="2025-03-17T17:30:26.220988413Z" level=info msg="CreateContainer within sandbox \"2d923085b0fbafd11af7b0aa024ed4ccadce0d95a000e551f54499738ff0028b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"56bb0bc47b2d8088f6762507d73eb7bbaa6604ec83d9e5977a5181cc41a91188\"" Mar 17 17:30:26.222054 containerd[1721]: time="2025-03-17T17:30:26.222019252Z" level=info msg="StartContainer for \"56bb0bc47b2d8088f6762507d73eb7bbaa6604ec83d9e5977a5181cc41a91188\"" Mar 17 17:30:26.253436 systemd[1]: Started cri-containerd-56bb0bc47b2d8088f6762507d73eb7bbaa6604ec83d9e5977a5181cc41a91188.scope - libcontainer container 56bb0bc47b2d8088f6762507d73eb7bbaa6604ec83d9e5977a5181cc41a91188. 
Mar 17 17:30:26.575531 containerd[1721]: time="2025-03-17T17:30:26.575485573Z" level=info msg="StartContainer for \"56bb0bc47b2d8088f6762507d73eb7bbaa6604ec83d9e5977a5181cc41a91188\" returns successfully" Mar 17 17:30:26.603368 systemd-networkd[1445]: vxlan.calico: Gained IPv6LL Mar 17 17:30:27.382861 containerd[1721]: time="2025-03-17T17:30:27.382811285Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:27.389997 containerd[1721]: time="2025-03-17T17:30:27.389942358Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 17 17:30:27.393200 containerd[1721]: time="2025-03-17T17:30:27.393117355Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 1.228661771s" Mar 17 17:30:27.393200 containerd[1721]: time="2025-03-17T17:30:27.393197995Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 17 17:30:27.394963 containerd[1721]: time="2025-03-17T17:30:27.394884274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 17 17:30:27.397658 containerd[1721]: time="2025-03-17T17:30:27.397573951Z" level=info msg="CreateContainer within sandbox \"80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 17:30:27.483590 containerd[1721]: time="2025-03-17T17:30:27.483543994Z" level=info msg="CreateContainer within sandbox 
\"80214c1d4c7777d950403cf859a3b07d2908975338a00955d2ad4f6d3a7f10d2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"61e4c1d3a1d6b8d045b0b8021218284e3c9de853d75c0b96b60afeb09a48b5c3\"" Mar 17 17:30:27.485503 containerd[1721]: time="2025-03-17T17:30:27.484224073Z" level=info msg="StartContainer for \"61e4c1d3a1d6b8d045b0b8021218284e3c9de853d75c0b96b60afeb09a48b5c3\"" Mar 17 17:30:27.533838 systemd[1]: Started cri-containerd-61e4c1d3a1d6b8d045b0b8021218284e3c9de853d75c0b96b60afeb09a48b5c3.scope - libcontainer container 61e4c1d3a1d6b8d045b0b8021218284e3c9de853d75c0b96b60afeb09a48b5c3. Mar 17 17:30:27.704201 containerd[1721]: time="2025-03-17T17:30:27.704031835Z" level=info msg="StartContainer for \"61e4c1d3a1d6b8d045b0b8021218284e3c9de853d75c0b96b60afeb09a48b5c3\" returns successfully" Mar 17 17:30:27.810297 kubelet[3397]: I0317 17:30:27.808808 3397 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:27.830080 kubelet[3397]: I0317 17:30:27.829771 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-678db5fc46-9s6qs" podStartSLOduration=21.088180514 podStartE2EDuration="23.829752401s" podCreationTimestamp="2025-03-17 17:30:04 +0000 UTC" firstStartedPulling="2025-03-17 17:30:23.421529378 +0000 UTC m=+42.113536389" lastFinishedPulling="2025-03-17 17:30:26.163101265 +0000 UTC m=+44.855108276" observedRunningTime="2025-03-17 17:30:26.812361159 +0000 UTC m=+45.504368170" watchObservedRunningTime="2025-03-17 17:30:27.829752401 +0000 UTC m=+46.521759412" Mar 17 17:30:29.066603 kubelet[3397]: I0317 17:30:29.066051 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-678db5fc46-n7khj" podStartSLOduration=21.243960215 podStartE2EDuration="25.066032047s" podCreationTimestamp="2025-03-17 17:30:04 +0000 UTC" firstStartedPulling="2025-03-17 17:30:23.572677442 +0000 UTC m=+42.264684453" 
lastFinishedPulling="2025-03-17 17:30:27.394749274 +0000 UTC m=+46.086756285" observedRunningTime="2025-03-17 17:30:27.830059801 +0000 UTC m=+46.522066812" watchObservedRunningTime="2025-03-17 17:30:29.066032047 +0000 UTC m=+47.758039058" Mar 17 17:30:29.177686 containerd[1721]: time="2025-03-17T17:30:29.176642508Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:29.182364 containerd[1721]: time="2025-03-17T17:30:29.182186023Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 17 17:30:29.191260 containerd[1721]: time="2025-03-17T17:30:29.189830856Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:29.198126 containerd[1721]: time="2025-03-17T17:30:29.198051968Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:29.199347 containerd[1721]: time="2025-03-17T17:30:29.198783808Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.803830814s" Mar 17 17:30:29.199347 containerd[1721]: time="2025-03-17T17:30:29.198821728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 17 17:30:29.201608 containerd[1721]: time="2025-03-17T17:30:29.201357045Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 17 17:30:29.203157 containerd[1721]: time="2025-03-17T17:30:29.202991284Z" level=info msg="CreateContainer within sandbox \"34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 17:30:29.255196 containerd[1721]: time="2025-03-17T17:30:29.255088717Z" level=info msg="CreateContainer within sandbox \"34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"c4a21b23f175ff6097ca004e82d229678a4f51413bcc816e9ca1d8f610f5218d\"" Mar 17 17:30:29.256492 containerd[1721]: time="2025-03-17T17:30:29.256444596Z" level=info msg="StartContainer for \"c4a21b23f175ff6097ca004e82d229678a4f51413bcc816e9ca1d8f610f5218d\"" Mar 17 17:30:29.293463 systemd[1]: Started cri-containerd-c4a21b23f175ff6097ca004e82d229678a4f51413bcc816e9ca1d8f610f5218d.scope - libcontainer container c4a21b23f175ff6097ca004e82d229678a4f51413bcc816e9ca1d8f610f5218d. 
Mar 17 17:30:29.336649 containerd[1721]: time="2025-03-17T17:30:29.335824164Z" level=info msg="StartContainer for \"c4a21b23f175ff6097ca004e82d229678a4f51413bcc816e9ca1d8f610f5218d\" returns successfully" Mar 17 17:30:31.201372 containerd[1721]: time="2025-03-17T17:30:31.201312565Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:31.206595 containerd[1721]: time="2025-03-17T17:30:31.206539240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 17 17:30:31.212459 containerd[1721]: time="2025-03-17T17:30:31.212404875Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:31.224979 containerd[1721]: time="2025-03-17T17:30:31.224909143Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:31.226279 containerd[1721]: time="2025-03-17T17:30:31.225761143Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 2.024363538s" Mar 17 17:30:31.226279 containerd[1721]: time="2025-03-17T17:30:31.225795503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 17 17:30:31.226952 containerd[1721]: time="2025-03-17T17:30:31.226910582Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 17 17:30:31.246123 containerd[1721]: time="2025-03-17T17:30:31.246079324Z" level=info msg="CreateContainer within sandbox \"6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 17:30:31.315413 containerd[1721]: time="2025-03-17T17:30:31.315360982Z" level=info msg="CreateContainer within sandbox \"6b24a2d9b04a6554249fc21bb78de1fe7c7f97d5f602e2965814e807b5b6f62c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f680233b4b5d0dcdc49f7742ac1bfd68da8761904f7fb0c76d2c6dface7ffa12\"" Mar 17 17:30:31.316777 containerd[1721]: time="2025-03-17T17:30:31.316393981Z" level=info msg="StartContainer for \"f680233b4b5d0dcdc49f7742ac1bfd68da8761904f7fb0c76d2c6dface7ffa12\"" Mar 17 17:30:31.345431 systemd[1]: Started cri-containerd-f680233b4b5d0dcdc49f7742ac1bfd68da8761904f7fb0c76d2c6dface7ffa12.scope - libcontainer container f680233b4b5d0dcdc49f7742ac1bfd68da8761904f7fb0c76d2c6dface7ffa12. 
Mar 17 17:30:31.384950 containerd[1721]: time="2025-03-17T17:30:31.384861399Z" level=info msg="StartContainer for \"f680233b4b5d0dcdc49f7742ac1bfd68da8761904f7fb0c76d2c6dface7ffa12\" returns successfully" Mar 17 17:30:31.847704 kubelet[3397]: I0317 17:30:31.847637 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7b9c99b997-26lfs" podStartSLOduration=19.781003593 podStartE2EDuration="26.847618783s" podCreationTimestamp="2025-03-17 17:30:05 +0000 UTC" firstStartedPulling="2025-03-17 17:30:24.160028152 +0000 UTC m=+42.852035163" lastFinishedPulling="2025-03-17 17:30:31.226643382 +0000 UTC m=+49.918650353" observedRunningTime="2025-03-17 17:30:31.844628985 +0000 UTC m=+50.536635996" watchObservedRunningTime="2025-03-17 17:30:31.847618783 +0000 UTC m=+50.539625794" Mar 17 17:30:32.987269 containerd[1721]: time="2025-03-17T17:30:32.986816597Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:32.996702 containerd[1721]: time="2025-03-17T17:30:32.996548388Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 17 17:30:33.006439 containerd[1721]: time="2025-03-17T17:30:33.006399659Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:33.013286 containerd[1721]: time="2025-03-17T17:30:33.013184973Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:33.014192 containerd[1721]: time="2025-03-17T17:30:33.013733133Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" 
with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.786771671s" Mar 17 17:30:33.014192 containerd[1721]: time="2025-03-17T17:30:33.013768893Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 17 17:30:33.016364 containerd[1721]: time="2025-03-17T17:30:33.016124051Z" level=info msg="CreateContainer within sandbox \"34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 17:30:33.063583 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount963743100.mount: Deactivated successfully. Mar 17 17:30:33.079337 containerd[1721]: time="2025-03-17T17:30:33.079287474Z" level=info msg="CreateContainer within sandbox \"34d8043b3625b4cfd0f2290c3b15744aaf0ccb36f419f7cdbba6deba616584cb\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c861792f7dfbe51f58b9123dc2b0f926b3f9c3cdc6b6ef191892a58d505c3b93\"" Mar 17 17:30:33.080069 containerd[1721]: time="2025-03-17T17:30:33.079975713Z" level=info msg="StartContainer for \"c861792f7dfbe51f58b9123dc2b0f926b3f9c3cdc6b6ef191892a58d505c3b93\"" Mar 17 17:30:33.124461 systemd[1]: Started cri-containerd-c861792f7dfbe51f58b9123dc2b0f926b3f9c3cdc6b6ef191892a58d505c3b93.scope - libcontainer container c861792f7dfbe51f58b9123dc2b0f926b3f9c3cdc6b6ef191892a58d505c3b93. 
Mar 17 17:30:33.167716 containerd[1721]: time="2025-03-17T17:30:33.167647274Z" level=info msg="StartContainer for \"c861792f7dfbe51f58b9123dc2b0f926b3f9c3cdc6b6ef191892a58d505c3b93\" returns successfully" Mar 17 17:30:33.519322 kubelet[3397]: I0317 17:30:33.519281 3397 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 17:30:33.522342 kubelet[3397]: I0317 17:30:33.522314 3397 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 17:30:33.855421 kubelet[3397]: I0317 17:30:33.855268 3397 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-vjh47" podStartSLOduration=19.727152402 podStartE2EDuration="28.855247375s" podCreationTimestamp="2025-03-17 17:30:05 +0000 UTC" firstStartedPulling="2025-03-17 17:30:23.886468759 +0000 UTC m=+42.578475770" lastFinishedPulling="2025-03-17 17:30:33.014563732 +0000 UTC m=+51.706570743" observedRunningTime="2025-03-17 17:30:33.854480456 +0000 UTC m=+52.546487427" watchObservedRunningTime="2025-03-17 17:30:33.855247375 +0000 UTC m=+52.547254386" Mar 17 17:30:41.434944 containerd[1721]: time="2025-03-17T17:30:41.434905398Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\"" Mar 17 17:30:41.436374 containerd[1721]: time="2025-03-17T17:30:41.435015518Z" level=info msg="TearDown network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" successfully" Mar 17 17:30:41.436374 containerd[1721]: time="2025-03-17T17:30:41.435025798Z" level=info msg="StopPodSandbox for \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" returns successfully" Mar 17 17:30:41.436374 containerd[1721]: time="2025-03-17T17:30:41.435415718Z" level=info msg="RemovePodSandbox for 
\"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\"" Mar 17 17:30:41.436374 containerd[1721]: time="2025-03-17T17:30:41.435440798Z" level=info msg="Forcibly stopping sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\"" Mar 17 17:30:41.436374 containerd[1721]: time="2025-03-17T17:30:41.435509678Z" level=info msg="TearDown network for sandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" successfully" Mar 17 17:30:41.454749 containerd[1721]: time="2025-03-17T17:30:41.454702459Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:30:41.454868 containerd[1721]: time="2025-03-17T17:30:41.454777419Z" level=info msg="RemovePodSandbox \"b9858658b820cb234a2008c12d49cf9a1758d3a381b53aebce5c9d2167163927\" returns successfully" Mar 17 17:30:41.455616 containerd[1721]: time="2025-03-17T17:30:41.455446378Z" level=info msg="StopPodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\"" Mar 17 17:30:41.455616 containerd[1721]: time="2025-03-17T17:30:41.455552418Z" level=info msg="TearDown network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" successfully" Mar 17 17:30:41.455616 containerd[1721]: time="2025-03-17T17:30:41.455562578Z" level=info msg="StopPodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" returns successfully" Mar 17 17:30:41.457275 containerd[1721]: time="2025-03-17T17:30:41.455970498Z" level=info msg="RemovePodSandbox for \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\"" Mar 17 17:30:41.457275 containerd[1721]: time="2025-03-17T17:30:41.455995978Z" level=info msg="Forcibly stopping sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\"" Mar 17 
17:30:41.457275 containerd[1721]: time="2025-03-17T17:30:41.456055898Z" level=info msg="TearDown network for sandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" successfully" Mar 17 17:30:41.467110 containerd[1721]: time="2025-03-17T17:30:41.466770487Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:30:41.467110 containerd[1721]: time="2025-03-17T17:30:41.466840047Z" level=info msg="RemovePodSandbox \"025d4d8d3daf24ec9f25f37e86e48559348f2f8adc88923aa5430a4d2804f9dc\" returns successfully" Mar 17 17:30:41.467267 containerd[1721]: time="2025-03-17T17:30:41.467246127Z" level=info msg="StopPodSandbox for \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\"" Mar 17 17:30:41.467511 containerd[1721]: time="2025-03-17T17:30:41.467336406Z" level=info msg="TearDown network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\" successfully" Mar 17 17:30:41.467511 containerd[1721]: time="2025-03-17T17:30:41.467354086Z" level=info msg="StopPodSandbox for \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\" returns successfully" Mar 17 17:30:41.468283 containerd[1721]: time="2025-03-17T17:30:41.468123086Z" level=info msg="RemovePodSandbox for \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\"" Mar 17 17:30:41.468283 containerd[1721]: time="2025-03-17T17:30:41.468149166Z" level=info msg="Forcibly stopping sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\"" Mar 17 17:30:41.468283 containerd[1721]: time="2025-03-17T17:30:41.468218686Z" level=info msg="TearDown network for sandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\" successfully" Mar 17 17:30:41.482498 containerd[1721]: time="2025-03-17T17:30:41.482342392Z" 
level=warning msg="Failed to get podSandbox status for container event for sandboxID \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:30:41.482498 containerd[1721]: time="2025-03-17T17:30:41.482409472Z" level=info msg="RemovePodSandbox \"85753bc2129bef97fbf776a9bce4a047a21b9c5dd80adecc1f3c0bb9a6d62217\" returns successfully" Mar 17 17:30:41.483180 containerd[1721]: time="2025-03-17T17:30:41.482934711Z" level=info msg="StopPodSandbox for \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\"" Mar 17 17:30:41.483180 containerd[1721]: time="2025-03-17T17:30:41.483022271Z" level=info msg="TearDown network for sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\" successfully" Mar 17 17:30:41.483180 containerd[1721]: time="2025-03-17T17:30:41.483031631Z" level=info msg="StopPodSandbox for \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\" returns successfully" Mar 17 17:30:41.483516 containerd[1721]: time="2025-03-17T17:30:41.483427271Z" level=info msg="RemovePodSandbox for \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\"" Mar 17 17:30:41.483605 containerd[1721]: time="2025-03-17T17:30:41.483588630Z" level=info msg="Forcibly stopping sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\"" Mar 17 17:30:41.483714 containerd[1721]: time="2025-03-17T17:30:41.483700270Z" level=info msg="TearDown network for sandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\" successfully" Mar 17 17:30:41.495471 containerd[1721]: time="2025-03-17T17:30:41.495421099Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.495989 containerd[1721]: time="2025-03-17T17:30:41.495822658Z" level=info msg="RemovePodSandbox \"579bf830bf5706559fe1c08118f3c3ebe86b79700a155553d4b6824b9b6969e5\" returns successfully" Mar 17 17:30:41.497307 containerd[1721]: time="2025-03-17T17:30:41.497143137Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\"" Mar 17 17:30:41.497506 containerd[1721]: time="2025-03-17T17:30:41.497449537Z" level=info msg="TearDown network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" successfully" Mar 17 17:30:41.497506 containerd[1721]: time="2025-03-17T17:30:41.497468777Z" level=info msg="StopPodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" returns successfully" Mar 17 17:30:41.497962 containerd[1721]: time="2025-03-17T17:30:41.497900336Z" level=info msg="RemovePodSandbox for \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\"" Mar 17 17:30:41.498223 containerd[1721]: time="2025-03-17T17:30:41.498072456Z" level=info msg="Forcibly stopping sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\"" Mar 17 17:30:41.499001 containerd[1721]: time="2025-03-17T17:30:41.498868255Z" level=info msg="TearDown network for sandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" successfully" Mar 17 17:30:41.512421 containerd[1721]: time="2025-03-17T17:30:41.512215802Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.512421 containerd[1721]: time="2025-03-17T17:30:41.512302122Z" level=info msg="RemovePodSandbox \"32428e95999019a1efa21165aaf119891c9af64025a30539792663eecb00fdf0\" returns successfully" Mar 17 17:30:41.513138 containerd[1721]: time="2025-03-17T17:30:41.512973161Z" level=info msg="StopPodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\"" Mar 17 17:30:41.513138 containerd[1721]: time="2025-03-17T17:30:41.513070041Z" level=info msg="TearDown network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" successfully" Mar 17 17:30:41.513138 containerd[1721]: time="2025-03-17T17:30:41.513079361Z" level=info msg="StopPodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" returns successfully" Mar 17 17:30:41.513580 containerd[1721]: time="2025-03-17T17:30:41.513502801Z" level=info msg="RemovePodSandbox for \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\"" Mar 17 17:30:41.513580 containerd[1721]: time="2025-03-17T17:30:41.513529521Z" level=info msg="Forcibly stopping sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\"" Mar 17 17:30:41.513712 containerd[1721]: time="2025-03-17T17:30:41.513604041Z" level=info msg="TearDown network for sandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" successfully" Mar 17 17:30:41.525162 containerd[1721]: time="2025-03-17T17:30:41.525121229Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.525287 containerd[1721]: time="2025-03-17T17:30:41.525190909Z" level=info msg="RemovePodSandbox \"1433c8eeb5a9c12c665eb3c43b218b1c85d18ccdd49f546596c42b1cd1658f62\" returns successfully" Mar 17 17:30:41.525955 containerd[1721]: time="2025-03-17T17:30:41.525680509Z" level=info msg="StopPodSandbox for \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\"" Mar 17 17:30:41.525955 containerd[1721]: time="2025-03-17T17:30:41.525785029Z" level=info msg="TearDown network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\" successfully" Mar 17 17:30:41.525955 containerd[1721]: time="2025-03-17T17:30:41.525796029Z" level=info msg="StopPodSandbox for \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\" returns successfully" Mar 17 17:30:41.526511 containerd[1721]: time="2025-03-17T17:30:41.526493988Z" level=info msg="RemovePodSandbox for \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\"" Mar 17 17:30:41.526672 containerd[1721]: time="2025-03-17T17:30:41.526576868Z" level=info msg="Forcibly stopping sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\"" Mar 17 17:30:41.526789 containerd[1721]: time="2025-03-17T17:30:41.526725548Z" level=info msg="TearDown network for sandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\" successfully" Mar 17 17:30:41.545718 containerd[1721]: time="2025-03-17T17:30:41.545667809Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.545862 containerd[1721]: time="2025-03-17T17:30:41.545737289Z" level=info msg="RemovePodSandbox \"2f997126cf57b4e6e114e3aca6f2824887ced66eeebc53e0298c5ccb1b620d1d\" returns successfully" Mar 17 17:30:41.546290 containerd[1721]: time="2025-03-17T17:30:41.546243729Z" level=info msg="StopPodSandbox for \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\"" Mar 17 17:30:41.546368 containerd[1721]: time="2025-03-17T17:30:41.546345569Z" level=info msg="TearDown network for sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\" successfully" Mar 17 17:30:41.546368 containerd[1721]: time="2025-03-17T17:30:41.546359329Z" level=info msg="StopPodSandbox for \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\" returns successfully" Mar 17 17:30:41.547002 containerd[1721]: time="2025-03-17T17:30:41.546977288Z" level=info msg="RemovePodSandbox for \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\"" Mar 17 17:30:41.547039 containerd[1721]: time="2025-03-17T17:30:41.547006808Z" level=info msg="Forcibly stopping sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\"" Mar 17 17:30:41.547086 containerd[1721]: time="2025-03-17T17:30:41.547068688Z" level=info msg="TearDown network for sandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\" successfully" Mar 17 17:30:41.559296 containerd[1721]: time="2025-03-17T17:30:41.559251996Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.559391 containerd[1721]: time="2025-03-17T17:30:41.559329316Z" level=info msg="RemovePodSandbox \"5b79f932fd2b92df31e12e28ad675f1d1cd9234183b706aa068998e98b774c5e\" returns successfully" Mar 17 17:30:41.559935 containerd[1721]: time="2025-03-17T17:30:41.559851195Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\"" Mar 17 17:30:41.560013 containerd[1721]: time="2025-03-17T17:30:41.559949715Z" level=info msg="TearDown network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" successfully" Mar 17 17:30:41.560013 containerd[1721]: time="2025-03-17T17:30:41.559960395Z" level=info msg="StopPodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" returns successfully" Mar 17 17:30:41.560491 containerd[1721]: time="2025-03-17T17:30:41.560334395Z" level=info msg="RemovePodSandbox for \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\"" Mar 17 17:30:41.560491 containerd[1721]: time="2025-03-17T17:30:41.560358995Z" level=info msg="Forcibly stopping sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\"" Mar 17 17:30:41.560998 containerd[1721]: time="2025-03-17T17:30:41.560646394Z" level=info msg="TearDown network for sandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" successfully" Mar 17 17:30:41.576926 containerd[1721]: time="2025-03-17T17:30:41.576883378Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.577115 containerd[1721]: time="2025-03-17T17:30:41.577099938Z" level=info msg="RemovePodSandbox \"d337151ac78054fa9d38888d920caf2c845543c32b046369fb0bb099614212f8\" returns successfully" Mar 17 17:30:41.577589 containerd[1721]: time="2025-03-17T17:30:41.577562218Z" level=info msg="StopPodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\"" Mar 17 17:30:41.577683 containerd[1721]: time="2025-03-17T17:30:41.577662658Z" level=info msg="TearDown network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" successfully" Mar 17 17:30:41.577683 containerd[1721]: time="2025-03-17T17:30:41.577679738Z" level=info msg="StopPodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" returns successfully" Mar 17 17:30:41.578053 containerd[1721]: time="2025-03-17T17:30:41.578027257Z" level=info msg="RemovePodSandbox for \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\"" Mar 17 17:30:41.578111 containerd[1721]: time="2025-03-17T17:30:41.578053457Z" level=info msg="Forcibly stopping sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\"" Mar 17 17:30:41.578208 containerd[1721]: time="2025-03-17T17:30:41.578168297Z" level=info msg="TearDown network for sandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" successfully" Mar 17 17:30:41.589389 containerd[1721]: time="2025-03-17T17:30:41.589325246Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.589586 containerd[1721]: time="2025-03-17T17:30:41.589417566Z" level=info msg="RemovePodSandbox \"6430ee98373b9887d226b80ddcb6ae146978cb52ff488d3b140861058a4dc507\" returns successfully" Mar 17 17:30:41.590420 containerd[1721]: time="2025-03-17T17:30:41.590263325Z" level=info msg="StopPodSandbox for \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\"" Mar 17 17:30:41.590420 containerd[1721]: time="2025-03-17T17:30:41.590359285Z" level=info msg="TearDown network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\" successfully" Mar 17 17:30:41.590420 containerd[1721]: time="2025-03-17T17:30:41.590368125Z" level=info msg="StopPodSandbox for \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\" returns successfully" Mar 17 17:30:41.590913 containerd[1721]: time="2025-03-17T17:30:41.590869445Z" level=info msg="RemovePodSandbox for \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\"" Mar 17 17:30:41.590913 containerd[1721]: time="2025-03-17T17:30:41.590894485Z" level=info msg="Forcibly stopping sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\"" Mar 17 17:30:41.591026 containerd[1721]: time="2025-03-17T17:30:41.590962285Z" level=info msg="TearDown network for sandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\" successfully" Mar 17 17:30:41.613266 containerd[1721]: time="2025-03-17T17:30:41.613169943Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.613402 containerd[1721]: time="2025-03-17T17:30:41.613298303Z" level=info msg="RemovePodSandbox \"de06a253647c922b43dccd4398a59bce233a02cbb1e82412d391168cb8518eef\" returns successfully" Mar 17 17:30:41.614404 containerd[1721]: time="2025-03-17T17:30:41.614375381Z" level=info msg="StopPodSandbox for \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\"" Mar 17 17:30:41.614814 containerd[1721]: time="2025-03-17T17:30:41.614786141Z" level=info msg="TearDown network for sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\" successfully" Mar 17 17:30:41.614889 containerd[1721]: time="2025-03-17T17:30:41.614811661Z" level=info msg="StopPodSandbox for \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\" returns successfully" Mar 17 17:30:41.615667 containerd[1721]: time="2025-03-17T17:30:41.615634180Z" level=info msg="RemovePodSandbox for \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\"" Mar 17 17:30:41.615730 containerd[1721]: time="2025-03-17T17:30:41.615667940Z" level=info msg="Forcibly stopping sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\"" Mar 17 17:30:41.615756 containerd[1721]: time="2025-03-17T17:30:41.615730380Z" level=info msg="TearDown network for sandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\" successfully" Mar 17 17:30:41.633089 containerd[1721]: time="2025-03-17T17:30:41.633039123Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.633204 containerd[1721]: time="2025-03-17T17:30:41.633146603Z" level=info msg="RemovePodSandbox \"1abbaeb6b718dac43b67be136896752cfd3d2e9b54e5a9c52f93aface23c335f\" returns successfully" Mar 17 17:30:41.633580 containerd[1721]: time="2025-03-17T17:30:41.633553843Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" Mar 17 17:30:41.633682 containerd[1721]: time="2025-03-17T17:30:41.633660122Z" level=info msg="TearDown network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" successfully" Mar 17 17:30:41.633682 containerd[1721]: time="2025-03-17T17:30:41.633677842Z" level=info msg="StopPodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" returns successfully" Mar 17 17:30:41.634094 containerd[1721]: time="2025-03-17T17:30:41.634072162Z" level=info msg="RemovePodSandbox for \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" Mar 17 17:30:41.634140 containerd[1721]: time="2025-03-17T17:30:41.634097722Z" level=info msg="Forcibly stopping sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\"" Mar 17 17:30:41.634183 containerd[1721]: time="2025-03-17T17:30:41.634163722Z" level=info msg="TearDown network for sandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" successfully" Mar 17 17:30:41.648166 containerd[1721]: time="2025-03-17T17:30:41.648107668Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.648284 containerd[1721]: time="2025-03-17T17:30:41.648241348Z" level=info msg="RemovePodSandbox \"25b6bc95c0d46a22827ab6eef7a30416328b6d6ba48b5b0fec697f6679f7ff13\" returns successfully" Mar 17 17:30:41.648834 containerd[1721]: time="2025-03-17T17:30:41.648810307Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\"" Mar 17 17:30:41.648935 containerd[1721]: time="2025-03-17T17:30:41.648914987Z" level=info msg="TearDown network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" successfully" Mar 17 17:30:41.648935 containerd[1721]: time="2025-03-17T17:30:41.648929827Z" level=info msg="StopPodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" returns successfully" Mar 17 17:30:41.649363 containerd[1721]: time="2025-03-17T17:30:41.649331587Z" level=info msg="RemovePodSandbox for \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\"" Mar 17 17:30:41.649363 containerd[1721]: time="2025-03-17T17:30:41.649359907Z" level=info msg="Forcibly stopping sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\"" Mar 17 17:30:41.649439 containerd[1721]: time="2025-03-17T17:30:41.649425467Z" level=info msg="TearDown network for sandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" successfully" Mar 17 17:30:41.668903 containerd[1721]: time="2025-03-17T17:30:41.668821168Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.669025 containerd[1721]: time="2025-03-17T17:30:41.668945928Z" level=info msg="RemovePodSandbox \"bd8fd72d5086caed30f7c26e105b22a7ab0bded2c07736d055c677114545924b\" returns successfully" Mar 17 17:30:41.669545 containerd[1721]: time="2025-03-17T17:30:41.669510607Z" level=info msg="StopPodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\"" Mar 17 17:30:41.669637 containerd[1721]: time="2025-03-17T17:30:41.669617167Z" level=info msg="TearDown network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" successfully" Mar 17 17:30:41.669637 containerd[1721]: time="2025-03-17T17:30:41.669632287Z" level=info msg="StopPodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" returns successfully" Mar 17 17:30:41.669987 containerd[1721]: time="2025-03-17T17:30:41.669961527Z" level=info msg="RemovePodSandbox for \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\"" Mar 17 17:30:41.670039 containerd[1721]: time="2025-03-17T17:30:41.669997807Z" level=info msg="Forcibly stopping sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\"" Mar 17 17:30:41.670066 containerd[1721]: time="2025-03-17T17:30:41.670057807Z" level=info msg="TearDown network for sandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" successfully" Mar 17 17:30:41.687912 containerd[1721]: time="2025-03-17T17:30:41.687606989Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.687912 containerd[1721]: time="2025-03-17T17:30:41.687683149Z" level=info msg="RemovePodSandbox \"3f75787253e726c5d994194a1bc1ce5aed51a49532e00826795b24768e619995\" returns successfully" Mar 17 17:30:41.688514 containerd[1721]: time="2025-03-17T17:30:41.688325029Z" level=info msg="StopPodSandbox for \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\"" Mar 17 17:30:41.688514 containerd[1721]: time="2025-03-17T17:30:41.688453028Z" level=info msg="TearDown network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\" successfully" Mar 17 17:30:41.688514 containerd[1721]: time="2025-03-17T17:30:41.688465628Z" level=info msg="StopPodSandbox for \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\" returns successfully" Mar 17 17:30:41.688995 containerd[1721]: time="2025-03-17T17:30:41.688711428Z" level=info msg="RemovePodSandbox for \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\"" Mar 17 17:30:41.688995 containerd[1721]: time="2025-03-17T17:30:41.688734108Z" level=info msg="Forcibly stopping sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\"" Mar 17 17:30:41.688995 containerd[1721]: time="2025-03-17T17:30:41.688789108Z" level=info msg="TearDown network for sandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\" successfully" Mar 17 17:30:41.707076 containerd[1721]: time="2025-03-17T17:30:41.706214571Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.707076 containerd[1721]: time="2025-03-17T17:30:41.706949890Z" level=info msg="RemovePodSandbox \"3248f2833476846d5b4add52b98e9f4600253e201512bb2c5537872a3e910d2b\" returns successfully" Mar 17 17:30:41.707587 containerd[1721]: time="2025-03-17T17:30:41.707408770Z" level=info msg="StopPodSandbox for \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\"" Mar 17 17:30:41.707587 containerd[1721]: time="2025-03-17T17:30:41.707501010Z" level=info msg="TearDown network for sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\" successfully" Mar 17 17:30:41.707587 containerd[1721]: time="2025-03-17T17:30:41.707511370Z" level=info msg="StopPodSandbox for \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\" returns successfully" Mar 17 17:30:41.707946 containerd[1721]: time="2025-03-17T17:30:41.707817289Z" level=info msg="RemovePodSandbox for \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\"" Mar 17 17:30:41.707946 containerd[1721]: time="2025-03-17T17:30:41.707844009Z" level=info msg="Forcibly stopping sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\"" Mar 17 17:30:41.707946 containerd[1721]: time="2025-03-17T17:30:41.707896689Z" level=info msg="TearDown network for sandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\" successfully" Mar 17 17:30:41.726097 containerd[1721]: time="2025-03-17T17:30:41.726041151Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.726410 containerd[1721]: time="2025-03-17T17:30:41.726123391Z" level=info msg="RemovePodSandbox \"aef5971a28572605356823ca3ce5944ca204287dcf05436193da5d9d74512bde\" returns successfully" Mar 17 17:30:41.726838 containerd[1721]: time="2025-03-17T17:30:41.726700951Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" Mar 17 17:30:41.726838 containerd[1721]: time="2025-03-17T17:30:41.726804111Z" level=info msg="TearDown network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" successfully" Mar 17 17:30:41.726838 containerd[1721]: time="2025-03-17T17:30:41.726814151Z" level=info msg="StopPodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" returns successfully" Mar 17 17:30:41.727098 containerd[1721]: time="2025-03-17T17:30:41.727069350Z" level=info msg="RemovePodSandbox for \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" Mar 17 17:30:41.727134 containerd[1721]: time="2025-03-17T17:30:41.727101110Z" level=info msg="Forcibly stopping sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\"" Mar 17 17:30:41.727197 containerd[1721]: time="2025-03-17T17:30:41.727178750Z" level=info msg="TearDown network for sandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" successfully" Mar 17 17:30:41.863715 containerd[1721]: time="2025-03-17T17:30:41.863668656Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.863834 containerd[1721]: time="2025-03-17T17:30:41.863744736Z" level=info msg="RemovePodSandbox \"aaa2c217545667f6b85cfad7d71ac1de870a9f2df843feb8ded39e47820b3f47\" returns successfully" Mar 17 17:30:41.864161 containerd[1721]: time="2025-03-17T17:30:41.864138295Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\"" Mar 17 17:30:41.864499 containerd[1721]: time="2025-03-17T17:30:41.864419695Z" level=info msg="TearDown network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" successfully" Mar 17 17:30:41.864499 containerd[1721]: time="2025-03-17T17:30:41.864438255Z" level=info msg="StopPodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" returns successfully" Mar 17 17:30:41.864857 containerd[1721]: time="2025-03-17T17:30:41.864832374Z" level=info msg="RemovePodSandbox for \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\"" Mar 17 17:30:41.864898 containerd[1721]: time="2025-03-17T17:30:41.864860854Z" level=info msg="Forcibly stopping sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\"" Mar 17 17:30:41.864942 containerd[1721]: time="2025-03-17T17:30:41.864924694Z" level=info msg="TearDown network for sandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" successfully" Mar 17 17:30:41.876791 containerd[1721]: time="2025-03-17T17:30:41.876733563Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.876888 containerd[1721]: time="2025-03-17T17:30:41.876814843Z" level=info msg="RemovePodSandbox \"6051f78a44a7fb803fcd7f99d7429e72fe2f9cf1c54c41b5c1db80d555635ef5\" returns successfully" Mar 17 17:30:41.877374 containerd[1721]: time="2025-03-17T17:30:41.877347402Z" level=info msg="StopPodSandbox for \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\"" Mar 17 17:30:41.877475 containerd[1721]: time="2025-03-17T17:30:41.877455242Z" level=info msg="TearDown network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" successfully" Mar 17 17:30:41.877475 containerd[1721]: time="2025-03-17T17:30:41.877470442Z" level=info msg="StopPodSandbox for \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" returns successfully" Mar 17 17:30:41.877879 containerd[1721]: time="2025-03-17T17:30:41.877819802Z" level=info msg="RemovePodSandbox for \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\"" Mar 17 17:30:41.877879 containerd[1721]: time="2025-03-17T17:30:41.877844842Z" level=info msg="Forcibly stopping sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\"" Mar 17 17:30:41.878615 containerd[1721]: time="2025-03-17T17:30:41.878010561Z" level=info msg="TearDown network for sandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" successfully" Mar 17 17:30:41.890948 containerd[1721]: time="2025-03-17T17:30:41.890902749Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.891220 containerd[1721]: time="2025-03-17T17:30:41.890977429Z" level=info msg="RemovePodSandbox \"7398e0ace4597fc22d42778e8f8c95b7d22f8aa9472d94ce86c6896494885745\" returns successfully" Mar 17 17:30:41.891659 containerd[1721]: time="2025-03-17T17:30:41.891550348Z" level=info msg="StopPodSandbox for \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\"" Mar 17 17:30:41.891781 containerd[1721]: time="2025-03-17T17:30:41.891754868Z" level=info msg="TearDown network for sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\" successfully" Mar 17 17:30:41.891926 containerd[1721]: time="2025-03-17T17:30:41.891813988Z" level=info msg="StopPodSandbox for \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\" returns successfully" Mar 17 17:30:41.892856 containerd[1721]: time="2025-03-17T17:30:41.892095828Z" level=info msg="RemovePodSandbox for \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\"" Mar 17 17:30:41.892856 containerd[1721]: time="2025-03-17T17:30:41.892127068Z" level=info msg="Forcibly stopping sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\"" Mar 17 17:30:41.892856 containerd[1721]: time="2025-03-17T17:30:41.892197187Z" level=info msg="TearDown network for sandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\" successfully" Mar 17 17:30:41.916513 containerd[1721]: time="2025-03-17T17:30:41.916467004Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.916756 containerd[1721]: time="2025-03-17T17:30:41.916739923Z" level=info msg="RemovePodSandbox \"2aac308e3243f9b70958c5b4367d6ae3e07d8937e57b808962a549b78207fa80\" returns successfully" Mar 17 17:30:41.917453 containerd[1721]: time="2025-03-17T17:30:41.917431043Z" level=info msg="StopPodSandbox for \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\"" Mar 17 17:30:41.917760 containerd[1721]: time="2025-03-17T17:30:41.917659602Z" level=info msg="TearDown network for sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\" successfully" Mar 17 17:30:41.917760 containerd[1721]: time="2025-03-17T17:30:41.917676642Z" level=info msg="StopPodSandbox for \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\" returns successfully" Mar 17 17:30:41.918415 containerd[1721]: time="2025-03-17T17:30:41.918019522Z" level=info msg="RemovePodSandbox for \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\"" Mar 17 17:30:41.918415 containerd[1721]: time="2025-03-17T17:30:41.918056682Z" level=info msg="Forcibly stopping sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\"" Mar 17 17:30:41.918415 containerd[1721]: time="2025-03-17T17:30:41.918129282Z" level=info msg="TearDown network for sandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\" successfully" Mar 17 17:30:41.931108 containerd[1721]: time="2025-03-17T17:30:41.931044989Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.931430 containerd[1721]: time="2025-03-17T17:30:41.931337789Z" level=info msg="RemovePodSandbox \"1a40fa07b6698ffc511dfc77f40723cae9a227e2eca414a7d249078501030744\" returns successfully" Mar 17 17:30:41.931884 containerd[1721]: time="2025-03-17T17:30:41.931700189Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\"" Mar 17 17:30:41.931884 containerd[1721]: time="2025-03-17T17:30:41.931802388Z" level=info msg="TearDown network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" successfully" Mar 17 17:30:41.931884 containerd[1721]: time="2025-03-17T17:30:41.931813428Z" level=info msg="StopPodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" returns successfully" Mar 17 17:30:41.932269 containerd[1721]: time="2025-03-17T17:30:41.932220388Z" level=info msg="RemovePodSandbox for \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\"" Mar 17 17:30:41.932461 containerd[1721]: time="2025-03-17T17:30:41.932346988Z" level=info msg="Forcibly stopping sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\"" Mar 17 17:30:41.932461 containerd[1721]: time="2025-03-17T17:30:41.932412188Z" level=info msg="TearDown network for sandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" successfully" Mar 17 17:30:41.946381 containerd[1721]: time="2025-03-17T17:30:41.946220054Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.947120 containerd[1721]: time="2025-03-17T17:30:41.946741894Z" level=info msg="RemovePodSandbox \"3c9f1216a9e3911e66c8eb7e2c7f34c0e20a80deabb43f9592e0dc62a58f87da\" returns successfully" Mar 17 17:30:41.947341 containerd[1721]: time="2025-03-17T17:30:41.947193573Z" level=info msg="StopPodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\"" Mar 17 17:30:41.947341 containerd[1721]: time="2025-03-17T17:30:41.947316133Z" level=info msg="TearDown network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" successfully" Mar 17 17:30:41.947341 containerd[1721]: time="2025-03-17T17:30:41.947328533Z" level=info msg="StopPodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" returns successfully" Mar 17 17:30:41.948187 containerd[1721]: time="2025-03-17T17:30:41.947975172Z" level=info msg="RemovePodSandbox for \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\"" Mar 17 17:30:41.948187 containerd[1721]: time="2025-03-17T17:30:41.948004572Z" level=info msg="Forcibly stopping sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\"" Mar 17 17:30:41.948187 containerd[1721]: time="2025-03-17T17:30:41.948062092Z" level=info msg="TearDown network for sandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" successfully" Mar 17 17:30:41.963243 containerd[1721]: time="2025-03-17T17:30:41.963178277Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.963363 containerd[1721]: time="2025-03-17T17:30:41.963286077Z" level=info msg="RemovePodSandbox \"6e620b7edb0090ec8bd3d3f840272314cb068a1a791262df0d3fd575bee825bc\" returns successfully" Mar 17 17:30:41.963958 containerd[1721]: time="2025-03-17T17:30:41.963832717Z" level=info msg="StopPodSandbox for \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\"" Mar 17 17:30:41.964060 containerd[1721]: time="2025-03-17T17:30:41.964045677Z" level=info msg="TearDown network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\" successfully" Mar 17 17:30:41.964318 containerd[1721]: time="2025-03-17T17:30:41.964102717Z" level=info msg="StopPodSandbox for \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\" returns successfully" Mar 17 17:30:41.965006 containerd[1721]: time="2025-03-17T17:30:41.964492596Z" level=info msg="RemovePodSandbox for \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\"" Mar 17 17:30:41.965006 containerd[1721]: time="2025-03-17T17:30:41.964517076Z" level=info msg="Forcibly stopping sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\"" Mar 17 17:30:41.965006 containerd[1721]: time="2025-03-17T17:30:41.964580636Z" level=info msg="TearDown network for sandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\" successfully" Mar 17 17:30:41.977401 containerd[1721]: time="2025-03-17T17:30:41.977362823Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.977682 containerd[1721]: time="2025-03-17T17:30:41.977598183Z" level=info msg="RemovePodSandbox \"6a50d70519b706d6d01623dd28011da427a62e41ab92d0e85355d8e163953d0b\" returns successfully" Mar 17 17:30:41.978382 containerd[1721]: time="2025-03-17T17:30:41.978196863Z" level=info msg="StopPodSandbox for \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\"" Mar 17 17:30:41.978382 containerd[1721]: time="2025-03-17T17:30:41.978332903Z" level=info msg="TearDown network for sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\" successfully" Mar 17 17:30:41.978382 containerd[1721]: time="2025-03-17T17:30:41.978344783Z" level=info msg="StopPodSandbox for \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\" returns successfully" Mar 17 17:30:41.978865 containerd[1721]: time="2025-03-17T17:30:41.978815222Z" level=info msg="RemovePodSandbox for \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\"" Mar 17 17:30:41.978865 containerd[1721]: time="2025-03-17T17:30:41.978843542Z" level=info msg="Forcibly stopping sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\"" Mar 17 17:30:41.978941 containerd[1721]: time="2025-03-17T17:30:41.978908782Z" level=info msg="TearDown network for sandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\" successfully" Mar 17 17:30:41.997062 containerd[1721]: time="2025-03-17T17:30:41.997014804Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:30:41.997141 containerd[1721]: time="2025-03-17T17:30:41.997091604Z" level=info msg="RemovePodSandbox \"9d638f215e08e0d171bfa1484eaa3133d1322a8777f908255be048cc20436a18\" returns successfully" Mar 17 17:30:47.567819 systemd[1]: run-containerd-runc-k8s.io-f680233b4b5d0dcdc49f7742ac1bfd68da8761904f7fb0c76d2c6dface7ffa12-runc.km7GLm.mount: Deactivated successfully. Mar 17 17:30:52.807481 kubelet[3397]: I0317 17:30:52.807178 3397 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:31:45.501456 systemd[1]: run-containerd-runc-k8s.io-f680233b4b5d0dcdc49f7742ac1bfd68da8761904f7fb0c76d2c6dface7ffa12-runc.tEJP7W.mount: Deactivated successfully. Mar 17 17:32:09.240518 systemd[1]: Started sshd@7-10.200.20.22:22-10.200.16.10:51780.service - OpenSSH per-connection server daemon (10.200.16.10:51780). Mar 17 17:32:09.691441 sshd[6258]: Accepted publickey for core from 10.200.16.10 port 51780 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:09.693337 sshd-session[6258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:09.697475 systemd-logind[1696]: New session 10 of user core. Mar 17 17:32:09.705606 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 17 17:32:10.110318 sshd[6260]: Connection closed by 10.200.16.10 port 51780 Mar 17 17:32:10.110889 sshd-session[6258]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:10.114223 systemd[1]: sshd@7-10.200.20.22:22-10.200.16.10:51780.service: Deactivated successfully. Mar 17 17:32:10.115910 systemd[1]: session-10.scope: Deactivated successfully. Mar 17 17:32:10.116547 systemd-logind[1696]: Session 10 logged out. Waiting for processes to exit. Mar 17 17:32:10.117671 systemd-logind[1696]: Removed session 10. Mar 17 17:32:15.192019 systemd[1]: Started sshd@8-10.200.20.22:22-10.200.16.10:51796.service - OpenSSH per-connection server daemon (10.200.16.10:51796). 
Mar 17 17:32:15.647777 sshd[6272]: Accepted publickey for core from 10.200.16.10 port 51796 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:15.649091 sshd-session[6272]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:15.652953 systemd-logind[1696]: New session 11 of user core. Mar 17 17:32:15.657408 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 17 17:32:16.033759 sshd[6294]: Connection closed by 10.200.16.10 port 51796 Mar 17 17:32:16.034370 sshd-session[6272]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:16.037825 systemd[1]: sshd@8-10.200.20.22:22-10.200.16.10:51796.service: Deactivated successfully. Mar 17 17:32:16.039949 systemd[1]: session-11.scope: Deactivated successfully. Mar 17 17:32:16.041064 systemd-logind[1696]: Session 11 logged out. Waiting for processes to exit. Mar 17 17:32:16.042085 systemd-logind[1696]: Removed session 11. Mar 17 17:32:21.121509 systemd[1]: Started sshd@9-10.200.20.22:22-10.200.16.10:38828.service - OpenSSH per-connection server daemon (10.200.16.10:38828). Mar 17 17:32:21.567195 sshd[6306]: Accepted publickey for core from 10.200.16.10 port 38828 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:21.568589 sshd-session[6306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:21.572669 systemd-logind[1696]: New session 12 of user core. Mar 17 17:32:21.579429 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 17 17:32:21.960283 sshd[6308]: Connection closed by 10.200.16.10 port 38828 Mar 17 17:32:21.961603 sshd-session[6306]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:21.964306 systemd[1]: sshd@9-10.200.20.22:22-10.200.16.10:38828.service: Deactivated successfully. Mar 17 17:32:21.966616 systemd[1]: session-12.scope: Deactivated successfully. Mar 17 17:32:21.968180 systemd-logind[1696]: Session 12 logged out. 
Waiting for processes to exit. Mar 17 17:32:21.969717 systemd-logind[1696]: Removed session 12. Mar 17 17:32:22.045469 systemd[1]: Started sshd@10-10.200.20.22:22-10.200.16.10:38840.service - OpenSSH per-connection server daemon (10.200.16.10:38840). Mar 17 17:32:22.492737 sshd[6320]: Accepted publickey for core from 10.200.16.10 port 38840 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:22.494147 sshd-session[6320]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:22.498473 systemd-logind[1696]: New session 13 of user core. Mar 17 17:32:22.502404 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 17 17:32:22.915809 sshd[6322]: Connection closed by 10.200.16.10 port 38840 Mar 17 17:32:22.916685 sshd-session[6320]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:22.919564 systemd-logind[1696]: Session 13 logged out. Waiting for processes to exit. Mar 17 17:32:22.919824 systemd[1]: sshd@10-10.200.20.22:22-10.200.16.10:38840.service: Deactivated successfully. Mar 17 17:32:22.922067 systemd[1]: session-13.scope: Deactivated successfully. Mar 17 17:32:22.924066 systemd-logind[1696]: Removed session 13. Mar 17 17:32:23.005072 systemd[1]: Started sshd@11-10.200.20.22:22-10.200.16.10:38852.service - OpenSSH per-connection server daemon (10.200.16.10:38852). Mar 17 17:32:23.455147 sshd[6331]: Accepted publickey for core from 10.200.16.10 port 38852 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:23.456506 sshd-session[6331]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:23.460841 systemd-logind[1696]: New session 14 of user core. Mar 17 17:32:23.465429 systemd[1]: Started session-14.scope - Session 14 of User core. 
Mar 17 17:32:23.843733 sshd[6333]: Connection closed by 10.200.16.10 port 38852 Mar 17 17:32:23.843644 sshd-session[6331]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:23.846578 systemd-logind[1696]: Session 14 logged out. Waiting for processes to exit. Mar 17 17:32:23.847168 systemd[1]: sshd@11-10.200.20.22:22-10.200.16.10:38852.service: Deactivated successfully. Mar 17 17:32:23.849704 systemd[1]: session-14.scope: Deactivated successfully. Mar 17 17:32:23.851536 systemd-logind[1696]: Removed session 14. Mar 17 17:32:28.933626 systemd[1]: Started sshd@12-10.200.20.22:22-10.200.16.10:46268.service - OpenSSH per-connection server daemon (10.200.16.10:46268). Mar 17 17:32:29.432137 sshd[6373]: Accepted publickey for core from 10.200.16.10 port 46268 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:29.433560 sshd-session[6373]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:29.437894 systemd-logind[1696]: New session 15 of user core. Mar 17 17:32:29.443420 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 17 17:32:29.858282 sshd[6375]: Connection closed by 10.200.16.10 port 46268 Mar 17 17:32:29.858813 sshd-session[6373]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:29.862707 systemd[1]: sshd@12-10.200.20.22:22-10.200.16.10:46268.service: Deactivated successfully. Mar 17 17:32:29.864772 systemd[1]: session-15.scope: Deactivated successfully. Mar 17 17:32:29.866279 systemd-logind[1696]: Session 15 logged out. Waiting for processes to exit. Mar 17 17:32:29.868677 systemd-logind[1696]: Removed session 15. Mar 17 17:32:34.941483 systemd[1]: Started sshd@13-10.200.20.22:22-10.200.16.10:46282.service - OpenSSH per-connection server daemon (10.200.16.10:46282). 
Mar 17 17:32:35.387709 sshd[6386]: Accepted publickey for core from 10.200.16.10 port 46282 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:35.388960 sshd-session[6386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:35.392538 systemd-logind[1696]: New session 16 of user core. Mar 17 17:32:35.401388 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 17 17:32:35.771017 sshd[6388]: Connection closed by 10.200.16.10 port 46282 Mar 17 17:32:35.771606 sshd-session[6386]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:35.774880 systemd[1]: sshd@13-10.200.20.22:22-10.200.16.10:46282.service: Deactivated successfully. Mar 17 17:32:35.777788 systemd[1]: session-16.scope: Deactivated successfully. Mar 17 17:32:35.778576 systemd-logind[1696]: Session 16 logged out. Waiting for processes to exit. Mar 17 17:32:35.779441 systemd-logind[1696]: Removed session 16. Mar 17 17:32:40.851333 systemd[1]: Started sshd@14-10.200.20.22:22-10.200.16.10:41680.service - OpenSSH per-connection server daemon (10.200.16.10:41680). Mar 17 17:32:41.304880 sshd[6399]: Accepted publickey for core from 10.200.16.10 port 41680 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:41.306273 sshd-session[6399]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:41.310140 systemd-logind[1696]: New session 17 of user core. Mar 17 17:32:41.314410 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 17 17:32:41.688340 sshd[6401]: Connection closed by 10.200.16.10 port 41680 Mar 17 17:32:41.689261 sshd-session[6399]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:41.693711 systemd[1]: sshd@14-10.200.20.22:22-10.200.16.10:41680.service: Deactivated successfully. Mar 17 17:32:41.696669 systemd[1]: session-17.scope: Deactivated successfully. Mar 17 17:32:41.697279 systemd-logind[1696]: Session 17 logged out. 
Waiting for processes to exit. Mar 17 17:32:41.698340 systemd-logind[1696]: Removed session 17. Mar 17 17:32:41.793499 systemd[1]: Started sshd@15-10.200.20.22:22-10.200.16.10:41696.service - OpenSSH per-connection server daemon (10.200.16.10:41696). Mar 17 17:32:42.240462 sshd[6414]: Accepted publickey for core from 10.200.16.10 port 41696 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:42.241948 sshd-session[6414]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:42.245920 systemd-logind[1696]: New session 18 of user core. Mar 17 17:32:42.253406 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 17 17:32:42.724796 sshd[6416]: Connection closed by 10.200.16.10 port 41696 Mar 17 17:32:42.725155 sshd-session[6414]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:42.729088 systemd[1]: sshd@15-10.200.20.22:22-10.200.16.10:41696.service: Deactivated successfully. Mar 17 17:32:42.731717 systemd[1]: session-18.scope: Deactivated successfully. Mar 17 17:32:42.733857 systemd-logind[1696]: Session 18 logged out. Waiting for processes to exit. Mar 17 17:32:42.734905 systemd-logind[1696]: Removed session 18. Mar 17 17:32:42.819503 systemd[1]: Started sshd@16-10.200.20.22:22-10.200.16.10:41704.service - OpenSSH per-connection server daemon (10.200.16.10:41704). Mar 17 17:32:43.317485 sshd[6425]: Accepted publickey for core from 10.200.16.10 port 41704 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:43.319395 sshd-session[6425]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:43.323293 systemd-logind[1696]: New session 19 of user core. Mar 17 17:32:43.333461 systemd[1]: Started session-19.scope - Session 19 of User core. 
Mar 17 17:32:45.403261 sshd[6427]: Connection closed by 10.200.16.10 port 41704 Mar 17 17:32:45.404168 sshd-session[6425]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:45.407971 systemd[1]: sshd@16-10.200.20.22:22-10.200.16.10:41704.service: Deactivated successfully. Mar 17 17:32:45.410500 systemd[1]: session-19.scope: Deactivated successfully. Mar 17 17:32:45.411481 systemd-logind[1696]: Session 19 logged out. Waiting for processes to exit. Mar 17 17:32:45.413147 systemd-logind[1696]: Removed session 19. Mar 17 17:32:45.496045 systemd[1]: Started sshd@17-10.200.20.22:22-10.200.16.10:41720.service - OpenSSH per-connection server daemon (10.200.16.10:41720). Mar 17 17:32:45.990378 sshd[6448]: Accepted publickey for core from 10.200.16.10 port 41720 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:45.991754 sshd-session[6448]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:45.995760 systemd-logind[1696]: New session 20 of user core. Mar 17 17:32:46.003416 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 17 17:32:46.526352 sshd[6466]: Connection closed by 10.200.16.10 port 41720 Mar 17 17:32:46.526948 sshd-session[6448]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:46.530543 systemd-logind[1696]: Session 20 logged out. Waiting for processes to exit. Mar 17 17:32:46.530698 systemd[1]: sshd@17-10.200.20.22:22-10.200.16.10:41720.service: Deactivated successfully. Mar 17 17:32:46.533133 systemd[1]: session-20.scope: Deactivated successfully. Mar 17 17:32:46.535570 systemd-logind[1696]: Removed session 20. Mar 17 17:32:46.613263 systemd[1]: Started sshd@18-10.200.20.22:22-10.200.16.10:41736.service - OpenSSH per-connection server daemon (10.200.16.10:41736). 
Mar 17 17:32:47.100698 sshd[6475]: Accepted publickey for core from 10.200.16.10 port 41736 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:47.102032 sshd-session[6475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:47.106607 systemd-logind[1696]: New session 21 of user core. Mar 17 17:32:47.113408 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 17 17:32:47.521121 sshd[6477]: Connection closed by 10.200.16.10 port 41736 Mar 17 17:32:47.521806 sshd-session[6475]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:47.525733 systemd[1]: sshd@18-10.200.20.22:22-10.200.16.10:41736.service: Deactivated successfully. Mar 17 17:32:47.527822 systemd[1]: session-21.scope: Deactivated successfully. Mar 17 17:32:47.529574 systemd-logind[1696]: Session 21 logged out. Waiting for processes to exit. Mar 17 17:32:47.531193 systemd-logind[1696]: Removed session 21. Mar 17 17:32:52.606469 systemd[1]: Started sshd@19-10.200.20.22:22-10.200.16.10:34768.service - OpenSSH per-connection server daemon (10.200.16.10:34768). Mar 17 17:32:53.053852 sshd[6510]: Accepted publickey for core from 10.200.16.10 port 34768 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc Mar 17 17:32:53.055484 sshd-session[6510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:32:53.059364 systemd-logind[1696]: New session 22 of user core. Mar 17 17:32:53.065449 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 17 17:32:53.433586 sshd[6512]: Connection closed by 10.200.16.10 port 34768 Mar 17 17:32:53.434360 sshd-session[6510]: pam_unix(sshd:session): session closed for user core Mar 17 17:32:53.437609 systemd-logind[1696]: Session 22 logged out. Waiting for processes to exit. Mar 17 17:32:53.438209 systemd[1]: sshd@19-10.200.20.22:22-10.200.16.10:34768.service: Deactivated successfully. 
Mar 17 17:32:53.440153 systemd[1]: session-22.scope: Deactivated successfully.
Mar 17 17:32:53.441370 systemd-logind[1696]: Removed session 22.
Mar 17 17:32:58.527749 systemd[1]: Started sshd@20-10.200.20.22:22-10.200.16.10:55238.service - OpenSSH per-connection server daemon (10.200.16.10:55238).
Mar 17 17:32:59.015736 sshd[6548]: Accepted publickey for core from 10.200.16.10 port 55238 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:32:59.017521 sshd-session[6548]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:32:59.021374 systemd-logind[1696]: New session 23 of user core.
Mar 17 17:32:59.028386 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 17 17:32:59.438743 sshd[6550]: Connection closed by 10.200.16.10 port 55238
Mar 17 17:32:59.439327 sshd-session[6548]: pam_unix(sshd:session): session closed for user core
Mar 17 17:32:59.442191 systemd-logind[1696]: Session 23 logged out. Waiting for processes to exit.
Mar 17 17:32:59.442347 systemd[1]: sshd@20-10.200.20.22:22-10.200.16.10:55238.service: Deactivated successfully.
Mar 17 17:32:59.443909 systemd[1]: session-23.scope: Deactivated successfully.
Mar 17 17:32:59.446172 systemd-logind[1696]: Removed session 23.
Mar 17 17:33:04.519932 systemd[1]: Started sshd@21-10.200.20.22:22-10.200.16.10:55242.service - OpenSSH per-connection server daemon (10.200.16.10:55242).
Mar 17 17:33:04.969736 sshd[6562]: Accepted publickey for core from 10.200.16.10 port 55242 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:33:04.971036 sshd-session[6562]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:33:04.975282 systemd-logind[1696]: New session 24 of user core.
Mar 17 17:33:04.980381 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 17 17:33:05.350994 sshd[6564]: Connection closed by 10.200.16.10 port 55242
Mar 17 17:33:05.351584 sshd-session[6562]: pam_unix(sshd:session): session closed for user core
Mar 17 17:33:05.354904 systemd[1]: sshd@21-10.200.20.22:22-10.200.16.10:55242.service: Deactivated successfully.
Mar 17 17:33:05.356850 systemd[1]: session-24.scope: Deactivated successfully.
Mar 17 17:33:05.357800 systemd-logind[1696]: Session 24 logged out. Waiting for processes to exit.
Mar 17 17:33:05.358943 systemd-logind[1696]: Removed session 24.
Mar 17 17:33:10.431760 systemd[1]: Started sshd@22-10.200.20.22:22-10.200.16.10:44336.service - OpenSSH per-connection server daemon (10.200.16.10:44336).
Mar 17 17:33:10.879919 sshd[6583]: Accepted publickey for core from 10.200.16.10 port 44336 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:33:10.881328 sshd-session[6583]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:33:10.888867 systemd-logind[1696]: New session 25 of user core.
Mar 17 17:33:10.895397 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 17 17:33:11.266847 sshd[6588]: Connection closed by 10.200.16.10 port 44336
Mar 17 17:33:11.266405 sshd-session[6583]: pam_unix(sshd:session): session closed for user core
Mar 17 17:33:11.270004 systemd[1]: sshd@22-10.200.20.22:22-10.200.16.10:44336.service: Deactivated successfully.
Mar 17 17:33:11.270054 systemd-logind[1696]: Session 25 logged out. Waiting for processes to exit.
Mar 17 17:33:11.271841 systemd[1]: session-25.scope: Deactivated successfully.
Mar 17 17:33:11.272792 systemd-logind[1696]: Removed session 25.
Mar 17 17:33:16.351494 systemd[1]: Started sshd@23-10.200.20.22:22-10.200.16.10:44348.service - OpenSSH per-connection server daemon (10.200.16.10:44348).
Mar 17 17:33:16.796861 sshd[6618]: Accepted publickey for core from 10.200.16.10 port 44348 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:33:16.798210 sshd-session[6618]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:33:16.802165 systemd-logind[1696]: New session 26 of user core.
Mar 17 17:33:16.813375 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 17 17:33:17.180732 sshd[6620]: Connection closed by 10.200.16.10 port 44348
Mar 17 17:33:17.181665 sshd-session[6618]: pam_unix(sshd:session): session closed for user core
Mar 17 17:33:17.185111 systemd[1]: sshd@23-10.200.20.22:22-10.200.16.10:44348.service: Deactivated successfully.
Mar 17 17:33:17.187003 systemd[1]: session-26.scope: Deactivated successfully.
Mar 17 17:33:17.187750 systemd-logind[1696]: Session 26 logged out. Waiting for processes to exit.
Mar 17 17:33:17.188658 systemd-logind[1696]: Removed session 26.
Mar 17 17:33:22.262247 systemd[1]: Started sshd@24-10.200.20.22:22-10.200.16.10:57288.service - OpenSSH per-connection server daemon (10.200.16.10:57288).
Mar 17 17:33:22.711604 sshd[6634]: Accepted publickey for core from 10.200.16.10 port 57288 ssh2: RSA SHA256:Vv+Gx/xgYWEBj55H1UdRAcw683xVG5W8/4UU5IxNHAc
Mar 17 17:33:22.712829 sshd-session[6634]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:33:22.716740 systemd-logind[1696]: New session 27 of user core.
Mar 17 17:33:22.723369 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 17 17:33:23.095586 sshd[6636]: Connection closed by 10.200.16.10 port 57288
Mar 17 17:33:23.096146 sshd-session[6634]: pam_unix(sshd:session): session closed for user core
Mar 17 17:33:23.099775 systemd-logind[1696]: Session 27 logged out. Waiting for processes to exit.
Mar 17 17:33:23.099946 systemd[1]: sshd@24-10.200.20.22:22-10.200.16.10:57288.service: Deactivated successfully.
Mar 17 17:33:23.101504 systemd[1]: session-27.scope: Deactivated successfully.
Mar 17 17:33:23.102486 systemd-logind[1696]: Removed session 27.