Jan 13 20:16:52.891374 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 13 20:16:52.891400 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Mon Jan 13 18:56:28 -00 2025
Jan 13 20:16:52.891410 kernel: KASLR enabled
Jan 13 20:16:52.891415 kernel: efi: EFI v2.7 by EDK II
Jan 13 20:16:52.891421 kernel: efi: SMBIOS 3.0=0x135ed0000 MEMATTR=0x133c6b018 ACPI 2.0=0x132430018 RNG=0x13243e918 MEMRESERVE=0x132357218
Jan 13 20:16:52.891427 kernel: random: crng init done
Jan 13 20:16:52.891434 kernel: secureboot: Secure boot disabled
Jan 13 20:16:52.891440 kernel: ACPI: Early table checksum verification disabled
Jan 13 20:16:52.891446 kernel: ACPI: RSDP 0x0000000132430018 000024 (v02 BOCHS )
Jan 13 20:16:52.891452 kernel: ACPI: XSDT 0x000000013243FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Jan 13 20:16:52.891460 kernel: ACPI: FACP 0x000000013243FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891466 kernel: ACPI: DSDT 0x0000000132437518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891472 kernel: ACPI: APIC 0x000000013243FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891477 kernel: ACPI: PPTT 0x000000013243FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891484 kernel: ACPI: GTDT 0x000000013243D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891492 kernel: ACPI: MCFG 0x000000013243FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891499 kernel: ACPI: SPCR 0x000000013243E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891505 kernel: ACPI: DBG2 0x000000013243E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891511 kernel: ACPI: IORT 0x000000013243E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 20:16:52.891518 kernel: ACPI: BGRT 0x000000013243E798 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 13 20:16:52.891524 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 13 20:16:52.891530 kernel: NUMA: Failed to initialise from firmware
Jan 13 20:16:52.891536 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Jan 13 20:16:52.891542 kernel: NUMA: NODE_DATA [mem 0x13981f800-0x139824fff]
Jan 13 20:16:52.891549 kernel: Zone ranges:
Jan 13 20:16:52.891556 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jan 13 20:16:52.891566 kernel: DMA32 empty
Jan 13 20:16:52.891573 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Jan 13 20:16:52.891579 kernel: Movable zone start for each node
Jan 13 20:16:52.891585 kernel: Early memory node ranges
Jan 13 20:16:52.891591 kernel: node 0: [mem 0x0000000040000000-0x000000013243ffff]
Jan 13 20:16:52.891598 kernel: node 0: [mem 0x0000000132440000-0x000000013272ffff]
Jan 13 20:16:52.891604 kernel: node 0: [mem 0x0000000132730000-0x0000000135bfffff]
Jan 13 20:16:52.891610 kernel: node 0: [mem 0x0000000135c00000-0x0000000135fdffff]
Jan 13 20:16:52.891616 kernel: node 0: [mem 0x0000000135fe0000-0x0000000139ffffff]
Jan 13 20:16:52.891622 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Jan 13 20:16:52.891628 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Jan 13 20:16:52.891636 kernel: psci: probing for conduit method from ACPI.
Jan 13 20:16:52.891643 kernel: psci: PSCIv1.1 detected in firmware.
Jan 13 20:16:52.891649 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 13 20:16:52.891658 kernel: psci: Trusted OS migration not required
Jan 13 20:16:52.891665 kernel: psci: SMC Calling Convention v1.1
Jan 13 20:16:52.891671 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 13 20:16:52.891680 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976
Jan 13 20:16:52.891686 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096
Jan 13 20:16:52.891693 kernel: pcpu-alloc: [0] 0 [0] 1
Jan 13 20:16:52.891699 kernel: Detected PIPT I-cache on CPU0
Jan 13 20:16:52.891898 kernel: CPU features: detected: GIC system register CPU interface
Jan 13 20:16:52.891905 kernel: CPU features: detected: Hardware dirty bit management
Jan 13 20:16:52.891912 kernel: CPU features: detected: Spectre-v4
Jan 13 20:16:52.891918 kernel: CPU features: detected: Spectre-BHB
Jan 13 20:16:52.891925 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 13 20:16:52.891932 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 13 20:16:52.891938 kernel: CPU features: detected: ARM erratum 1418040
Jan 13 20:16:52.891950 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 13 20:16:52.891957 kernel: alternatives: applying boot alternatives
Jan 13 20:16:52.891965 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=9798117b3b15ef802e3d618077f87253cc08e0d5280b8fe28b307e7558b7ebcc
Jan 13 20:16:52.891972 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 13 20:16:52.891978 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 13 20:16:52.891985 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 13 20:16:52.891991 kernel: Fallback order for Node 0: 0
Jan 13 20:16:52.891998 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Jan 13 20:16:52.892004 kernel: Policy zone: Normal
Jan 13 20:16:52.892011 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 13 20:16:52.892017 kernel: software IO TLB: area num 2.
Jan 13 20:16:52.892026 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Jan 13 20:16:52.892032 kernel: Memory: 3881016K/4096000K available (10304K kernel code, 2184K rwdata, 8092K rodata, 39936K init, 897K bss, 214984K reserved, 0K cma-reserved)
Jan 13 20:16:52.892039 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 13 20:16:52.892045 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 13 20:16:52.892053 kernel: rcu: RCU event tracing is enabled.
Jan 13 20:16:52.892059 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 13 20:16:52.892066 kernel: Trampoline variant of Tasks RCU enabled.
Jan 13 20:16:52.892072 kernel: Tracing variant of Tasks RCU enabled.
Jan 13 20:16:52.892079 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 13 20:16:52.892085 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 13 20:16:52.892103 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 13 20:16:52.892112 kernel: GICv3: 256 SPIs implemented
Jan 13 20:16:52.892119 kernel: GICv3: 0 Extended SPIs implemented
Jan 13 20:16:52.892125 kernel: Root IRQ handler: gic_handle_irq
Jan 13 20:16:52.892132 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 13 20:16:52.892138 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 13 20:16:52.892145 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 13 20:16:52.892152 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Jan 13 20:16:52.892158 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Jan 13 20:16:52.892165 kernel: GICv3: using LPI property table @0x00000001000e0000
Jan 13 20:16:52.892171 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Jan 13 20:16:52.892178 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 13 20:16:52.892186 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 13 20:16:52.892193 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 13 20:16:52.892199 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 13 20:16:52.892206 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 13 20:16:52.892213 kernel: Console: colour dummy device 80x25
Jan 13 20:16:52.892220 kernel: ACPI: Core revision 20230628
Jan 13 20:16:52.892227 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 13 20:16:52.892234 kernel: pid_max: default: 32768 minimum: 301
Jan 13 20:16:52.892241 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 13 20:16:52.892248 kernel: landlock: Up and running.
Jan 13 20:16:52.892256 kernel: SELinux: Initializing.
Jan 13 20:16:52.892263 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 13 20:16:52.892269 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 13 20:16:52.892276 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 13 20:16:52.892283 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 13 20:16:52.892290 kernel: rcu: Hierarchical SRCU implementation.
Jan 13 20:16:52.892297 kernel: rcu: Max phase no-delay instances is 400.
Jan 13 20:16:52.892304 kernel: Platform MSI: ITS@0x8080000 domain created
Jan 13 20:16:52.892310 kernel: PCI/MSI: ITS@0x8080000 domain created
Jan 13 20:16:52.892318 kernel: Remapping and enabling EFI services.
Jan 13 20:16:52.892325 kernel: smp: Bringing up secondary CPUs ...
Jan 13 20:16:52.892332 kernel: Detected PIPT I-cache on CPU1
Jan 13 20:16:52.892338 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 13 20:16:52.892345 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Jan 13 20:16:52.892352 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 13 20:16:52.892359 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 13 20:16:52.892366 kernel: smp: Brought up 1 node, 2 CPUs
Jan 13 20:16:52.892372 kernel: SMP: Total of 2 processors activated.
Jan 13 20:16:52.892379 kernel: CPU features: detected: 32-bit EL0 Support
Jan 13 20:16:52.892387 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 13 20:16:52.892394 kernel: CPU features: detected: Common not Private translations
Jan 13 20:16:52.892407 kernel: CPU features: detected: CRC32 instructions
Jan 13 20:16:52.892415 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 13 20:16:52.892422 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 13 20:16:52.892429 kernel: CPU features: detected: LSE atomic instructions
Jan 13 20:16:52.892437 kernel: CPU features: detected: Privileged Access Never
Jan 13 20:16:52.892444 kernel: CPU features: detected: RAS Extension Support
Jan 13 20:16:52.892451 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 13 20:16:52.892460 kernel: CPU: All CPU(s) started at EL1
Jan 13 20:16:52.892471 kernel: alternatives: applying system-wide alternatives
Jan 13 20:16:52.892479 kernel: devtmpfs: initialized
Jan 13 20:16:52.892486 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 13 20:16:52.892494 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 13 20:16:52.892501 kernel: pinctrl core: initialized pinctrl subsystem
Jan 13 20:16:52.892508 kernel: SMBIOS 3.0.0 present.
Jan 13 20:16:52.892515 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Jan 13 20:16:52.892525 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 13 20:16:52.892532 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Jan 13 20:16:52.892539 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 13 20:16:52.892546 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 13 20:16:52.892553 kernel: audit: initializing netlink subsys (disabled)
Jan 13 20:16:52.892560 kernel: audit: type=2000 audit(0.012:1): state=initialized audit_enabled=0 res=1
Jan 13 20:16:52.892567 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 13 20:16:52.892575 kernel: cpuidle: using governor menu
Jan 13 20:16:52.892582 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 13 20:16:52.892591 kernel: ASID allocator initialised with 32768 entries
Jan 13 20:16:52.892598 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 13 20:16:52.892605 kernel: Serial: AMBA PL011 UART driver
Jan 13 20:16:52.892612 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 13 20:16:52.892619 kernel: Modules: 0 pages in range for non-PLT usage
Jan 13 20:16:52.892627 kernel: Modules: 508880 pages in range for PLT usage
Jan 13 20:16:52.892634 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 13 20:16:52.892641 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 13 20:16:52.892648 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 13 20:16:52.892657 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 13 20:16:52.892664 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 13 20:16:52.892671 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 13 20:16:52.892678 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 13 20:16:52.892685 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 13 20:16:52.892692 kernel: ACPI: Added _OSI(Module Device)
Jan 13 20:16:52.892700 kernel: ACPI: Added _OSI(Processor Device)
Jan 13 20:16:52.892735 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Jan 13 20:16:52.892743 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 13 20:16:52.892753 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 13 20:16:52.892760 kernel: ACPI: Interpreter enabled
Jan 13 20:16:52.892767 kernel: ACPI: Using GIC for interrupt routing
Jan 13 20:16:52.892775 kernel: ACPI: MCFG table detected, 1 entries
Jan 13 20:16:52.892783 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jan 13 20:16:52.892790 kernel: printk: console [ttyAMA0] enabled
Jan 13 20:16:52.892797 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 13 20:16:52.892971 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 13 20:16:52.893047 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 13 20:16:52.893160 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 13 20:16:52.893227 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jan 13 20:16:52.893289 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jan 13 20:16:52.893298 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jan 13 20:16:52.893305 kernel: PCI host bridge to bus 0000:00
Jan 13 20:16:52.893376 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jan 13 20:16:52.893439 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jan 13 20:16:52.893500 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jan 13 20:16:52.893559 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 13 20:16:52.893646 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Jan 13 20:16:52.893748 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Jan 13 20:16:52.893817 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Jan 13 20:16:52.893882 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Jan 13 20:16:52.893963 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.894030 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Jan 13 20:16:52.894117 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.894186 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Jan 13 20:16:52.894265 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.894332 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Jan 13 20:16:52.894411 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.894476 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Jan 13 20:16:52.894549 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.894613 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Jan 13 20:16:52.894693 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.894778 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Jan 13 20:16:52.894854 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.894918 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Jan 13 20:16:52.894990 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.895052 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Jan 13 20:16:52.895138 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Jan 13 20:16:52.895205 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Jan 13 20:16:52.895284 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Jan 13 20:16:52.895349 kernel: pci 0000:00:04.0: reg 0x10: [io 0x8200-0x8207]
Jan 13 20:16:52.895424 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Jan 13 20:16:52.895491 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Jan 13 20:16:52.895558 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 13 20:16:52.895635 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Jan 13 20:16:52.895739 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Jan 13 20:16:52.895815 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Jan 13 20:16:52.895888 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Jan 13 20:16:52.895956 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Jan 13 20:16:52.896022 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Jan 13 20:16:52.896138 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Jan 13 20:16:52.896221 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Jan 13 20:16:52.896301 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Jan 13 20:16:52.896368 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Jan 13 20:16:52.896443 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Jan 13 20:16:52.896508 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Jan 13 20:16:52.896575 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 13 20:16:52.896647 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Jan 13 20:16:52.896752 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Jan 13 20:16:52.896837 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Jan 13 20:16:52.896907 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Jan 13 20:16:52.896975 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jan 13 20:16:52.897038 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jan 13 20:16:52.897117 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jan 13 20:16:52.897188 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jan 13 20:16:52.897268 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jan 13 20:16:52.897332 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jan 13 20:16:52.897399 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 13 20:16:52.897462 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jan 13 20:16:52.897528 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jan 13 20:16:52.897594 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 13 20:16:52.897657 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jan 13 20:16:52.897755 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jan 13 20:16:52.897824 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 13 20:16:52.897887 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jan 13 20:16:52.897949 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Jan 13 20:16:52.898016 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 13 20:16:52.898078 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jan 13 20:16:52.898208 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jan 13 20:16:52.898279 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 13 20:16:52.898350 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Jan 13 20:16:52.898414 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Jan 13 20:16:52.898481 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 13 20:16:52.898545 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jan 13 20:16:52.898608 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jan 13 20:16:52.898675 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 13 20:16:52.898890 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jan 13 20:16:52.898968 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jan 13 20:16:52.899033 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Jan 13 20:16:52.899118 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Jan 13 20:16:52.899189 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Jan 13 20:16:52.899251 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Jan 13 20:16:52.899317 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Jan 13 20:16:52.899380 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Jan 13 20:16:52.899449 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Jan 13 20:16:52.899512 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Jan 13 20:16:52.899575 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Jan 13 20:16:52.899637 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Jan 13 20:16:52.899701 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Jan 13 20:16:52.899791 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jan 13 20:16:52.899855 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Jan 13 20:16:52.899922 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Jan 13 20:16:52.899986 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Jan 13 20:16:52.900052 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Jan 13 20:16:52.900131 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Jan 13 20:16:52.900197 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Jan 13 20:16:52.900265 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Jan 13 20:16:52.900328 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Jan 13 20:16:52.900395 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Jan 13 20:16:52.900457 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Jan 13 20:16:52.900519 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Jan 13 20:16:52.900582 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Jan 13 20:16:52.900650 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Jan 13 20:16:52.900728 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Jan 13 20:16:52.900791 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Jan 13 20:16:52.900853 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Jan 13 20:16:52.900925 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Jan 13 20:16:52.900988 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Jan 13 20:16:52.901051 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Jan 13 20:16:52.901157 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Jan 13 20:16:52.901233 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Jan 13 20:16:52.901297 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Jan 13 20:16:52.901362 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Jan 13 20:16:52.901425 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Jan 13 20:16:52.901494 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Jan 13 20:16:52.901572 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Jan 13 20:16:52.901643 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Jan 13 20:16:52.901813 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Jan 13 20:16:52.901900 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 13 20:16:52.901976 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Jan 13 20:16:52.902051 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jan 13 20:16:52.902140 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jan 13 20:16:52.902204 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Jan 13 20:16:52.902266 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Jan 13 20:16:52.902335 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Jan 13 20:16:52.902400 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jan 13 20:16:52.902466 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jan 13 20:16:52.902528 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Jan 13 20:16:52.902590 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Jan 13 20:16:52.902660 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 13 20:16:52.902863 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Jan 13 20:16:52.902934 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jan 13 20:16:52.902995 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jan 13 20:16:52.903055 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Jan 13 20:16:52.906970 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Jan 13 20:16:52.907105 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Jan 13 20:16:52.907184 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jan 13 20:16:52.907251 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jan 13 20:16:52.907314 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Jan 13 20:16:52.907376 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Jan 13 20:16:52.907450 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Jan 13 20:16:52.907519 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jan 13 20:16:52.907593 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jan 13 20:16:52.907657 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Jan 13 20:16:52.907759 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Jan 13 20:16:52.907844 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Jan 13 20:16:52.907916 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Jan 13 20:16:52.907984 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jan 13 20:16:52.908048 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jan 13 20:16:52.908129 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Jan 13 20:16:52.908201 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jan 13 20:16:52.908276 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Jan 13 20:16:52.908342 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Jan 13 20:16:52.908408 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Jan 13 20:16:52.908474 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jan 13 20:16:52.908537 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jan 13 20:16:52.908601 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Jan 13 20:16:52.908667 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Jan 13 20:16:52.908760 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jan 13 20:16:52.908826 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jan 13 20:16:52.908891 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Jan 13 20:16:52.908953 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Jan 13 20:16:52.909020 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jan 13 20:16:52.909097 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Jan 13 20:16:52.909172 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Jan 13 20:16:52.909242 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Jan 13 20:16:52.909312 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Jan 13 20:16:52.909393 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Jan 13 20:16:52.910857 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Jan 13 20:16:52.910962 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jan 13 20:16:52.911022 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Jan 13 20:16:52.911079 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Jan 13 20:16:52.911178 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Jan 13 20:16:52.911247 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Jan 13 20:16:52.911305 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Jan 13 20:16:52.911380 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Jan 13 20:16:52.911439 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Jan 13 20:16:52.911504 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Jan 13 20:16:52.911575 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Jan 13 20:16:52.911646 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Jan 13 20:16:52.911734 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Jan 13 20:16:52.911827 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Jan 13 20:16:52.911896 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Jan 13 20:16:52.911955 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Jan 13 20:16:52.912025 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Jan 13 20:16:52.912085 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Jan 13 20:16:52.912160 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jan 13 20:16:52.912229 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Jan 13 20:16:52.912289 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Jan 13 20:16:52.912352 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Jan 13 20:16:52.912420 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Jan 13 20:16:52.912478 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Jan 13 20:16:52.912538 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Jan 13 20:16:52.912605 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Jan 13 20:16:52.912666 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Jan 13 20:16:52.913643 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Jan 13 20:16:52.913669 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Jan 13 20:16:52.913678 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Jan 13 20:16:52.913686 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Jan 13 20:16:52.913693 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Jan 13 20:16:52.913701 kernel: iommu: Default domain type: Translated
Jan 13 20:16:52.913723 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Jan 13 20:16:52.913731 kernel: efivars: Registered efivars operations
Jan 13 20:16:52.913738 kernel: vgaarb: loaded
Jan 13 20:16:52.913746 kernel: clocksource: Switched to clocksource arch_sys_counter
Jan 13 20:16:52.913756 kernel: VFS: Disk quotas dquot_6.6.0
Jan 13 20:16:52.913764 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 13 20:16:52.913771 kernel: pnp: PnP ACPI init
Jan 13 20:16:52.913869 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Jan 13 20:16:52.913882 kernel: pnp: PnP ACPI: found 1 devices
Jan 13 20:16:52.913890 kernel: NET: Registered PF_INET protocol family
Jan 13 20:16:52.913897 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 13 20:16:52.913905 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Jan 13 20:16:52.913913 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 13 20:16:52.913923 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 13 20:16:52.913931 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Jan 13 20:16:52.913940 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Jan 13 20:16:52.913948 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jan 13 20:16:52.913955 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jan 13 20:16:52.913963 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 13 20:16:52.914041 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Jan 13 20:16:52.914053 kernel: PCI: CLS 0 bytes, default 64
Jan 13 20:16:52.914063 kernel: kvm [1]: HYP mode not available
Jan 13 20:16:52.914070 kernel: Initialise system trusted keyrings
Jan 13 20:16:52.914078 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Jan 13 20:16:52.914096 kernel: Key type asymmetric registered
Jan 13 20:16:52.914106 kernel: Asymmetric key parser 'x509' registered
Jan 13 20:16:52.914113 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Jan 13 20:16:52.914121 kernel: io scheduler mq-deadline registered
Jan 13 20:16:52.914129 kernel: io scheduler kyber registered
Jan 13 20:16:52.914136 kernel: io scheduler bfq registered
Jan 13 20:16:52.914145 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Jan 13 20:16:52.914224 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
Jan 13 20:16:52.914292 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
Jan 13 20:16:52.914357 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 20:16:52.914425 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
Jan 13 20:16:52.914500 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
Jan 13 20:16:52.914571 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 20:16:52.914645 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52
Jan 13 20:16:52.914724 kernel: pcieport 0000:00:02.2:
AER: enabled with IRQ 52 Jan 13 20:16:52.914791 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 20:16:52.914860 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Jan 13 20:16:52.914926 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Jan 13 20:16:52.914989 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 20:16:52.915062 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Jan 13 20:16:52.915144 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Jan 13 20:16:52.915211 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 20:16:52.915279 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Jan 13 20:16:52.915344 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Jan 13 20:16:52.915408 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 20:16:52.915483 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Jan 13 20:16:52.915548 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Jan 13 20:16:52.915612 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 20:16:52.915682 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Jan 13 20:16:52.915789 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Jan 13 20:16:52.915859 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 20:16:52.915873 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 13 20:16:52.915941 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 
Jan 13 20:16:52.916010 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58
Jan 13 20:16:52.916075 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 20:16:52.916119 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Jan 13 20:16:52.916129 kernel: ACPI: button: Power Button [PWRB]
Jan 13 20:16:52.916137 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Jan 13 20:16:52.916230 kernel: virtio-pci 0000:03:00.0: enabling device (0000 -> 0002)
Jan 13 20:16:52.916308 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
Jan 13 20:16:52.916380 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Jan 13 20:16:52.916392 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 13 20:16:52.916399 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Jan 13 20:16:52.916466 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001)
Jan 13 20:16:52.916491 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A
Jan 13 20:16:52.916499 kernel: thunder_xcv, ver 1.0
Jan 13 20:16:52.916510 kernel: thunder_bgx, ver 1.0
Jan 13 20:16:52.916518 kernel: nicpf, ver 1.0
Jan 13 20:16:52.916525 kernel: nicvf, ver 1.0
Jan 13 20:16:52.916614 kernel: rtc-efi rtc-efi.0: registered as rtc0
Jan 13 20:16:52.916679 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-01-13T20:16:52 UTC (1736799412)
Jan 13 20:16:52.916689 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 13 20:16:52.916697 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available
Jan 13 20:16:52.916759 kernel: watchdog: Delayed init of the lockup detector failed: -19
Jan 13 20:16:52.916772 kernel: watchdog: Hard watchdog permanently disabled
Jan 13 20:16:52.916781 kernel: NET: Registered PF_INET6 protocol family
Jan 13 20:16:52.916788 kernel: Segment Routing with IPv6
Jan 13 20:16:52.916796 kernel: In-situ OAM (IOAM) with IPv6
Jan 13 20:16:52.916805 kernel: NET: Registered PF_PACKET protocol family
Jan 13 20:16:52.916813 kernel: Key type dns_resolver registered
Jan 13 20:16:52.916821 kernel: registered taskstats version 1
Jan 13 20:16:52.916828 kernel: Loading compiled-in X.509 certificates
Jan 13 20:16:52.916836 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 46cb4d1b22f3a5974766fe7d7b651e2f296d4fe0'
Jan 13 20:16:52.916845 kernel: Key type .fscrypt registered
Jan 13 20:16:52.916852 kernel: Key type fscrypt-provisioning registered
Jan 13 20:16:52.916860 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 13 20:16:52.916868 kernel: ima: Allocated hash algorithm: sha1
Jan 13 20:16:52.916875 kernel: ima: No architecture policies found
Jan 13 20:16:52.916883 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Jan 13 20:16:52.916891 kernel: clk: Disabling unused clocks
Jan 13 20:16:52.916898 kernel: Freeing unused kernel memory: 39936K
Jan 13 20:16:52.916906 kernel: Run /init as init process
Jan 13 20:16:52.916915 kernel: with arguments:
Jan 13 20:16:52.916923 kernel: /init
Jan 13 20:16:52.916931 kernel: with environment:
Jan 13 20:16:52.916938 kernel: HOME=/
Jan 13 20:16:52.916946 kernel: TERM=linux
Jan 13 20:16:52.916953 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Jan 13 20:16:52.916964 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 13 20:16:52.916973 systemd[1]: Detected virtualization kvm.
Jan 13 20:16:52.916990 systemd[1]: Detected architecture arm64.
Jan 13 20:16:52.916998 systemd[1]: Running in initrd.
Jan 13 20:16:52.917007 systemd[1]: No hostname configured, using default hostname.
Jan 13 20:16:52.917016 systemd[1]: Hostname set to .
Jan 13 20:16:52.917024 systemd[1]: Initializing machine ID from VM UUID.
Jan 13 20:16:52.917033 systemd[1]: Queued start job for default target initrd.target.
Jan 13 20:16:52.917041 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:16:52.917050 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:16:52.917060 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 13 20:16:52.917068 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 13 20:16:52.917077 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 13 20:16:52.917122 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 13 20:16:52.917133 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 13 20:16:52.917142 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 13 20:16:52.917150 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:16:52.917161 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:16:52.917169 systemd[1]: Reached target paths.target - Path Units.
Jan 13 20:16:52.917177 systemd[1]: Reached target slices.target - Slice Units.
Jan 13 20:16:52.917185 systemd[1]: Reached target swap.target - Swaps.
Jan 13 20:16:52.917193 systemd[1]: Reached target timers.target - Timer Units.
Jan 13 20:16:52.917201 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 20:16:52.917209 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 20:16:52.917217 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 13 20:16:52.917227 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 13 20:16:52.917235 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:16:52.917243 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:16:52.917251 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:16:52.917260 systemd[1]: Reached target sockets.target - Socket Units.
Jan 13 20:16:52.917268 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 13 20:16:52.917276 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 13 20:16:52.917284 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 13 20:16:52.917292 systemd[1]: Starting systemd-fsck-usr.service...
Jan 13 20:16:52.917302 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 13 20:16:52.917310 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 13 20:16:52.917318 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:16:52.917326 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 13 20:16:52.917334 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:16:52.917344 systemd[1]: Finished systemd-fsck-usr.service.
Jan 13 20:16:52.917388 systemd-journald[236]: Collecting audit messages is disabled.
Jan 13 20:16:52.917409 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 13 20:16:52.917419 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 13 20:16:52.917427 kernel: Bridge firewalling registered
Jan 13 20:16:52.917435 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:16:52.917445 systemd-journald[236]: Journal started
Jan 13 20:16:52.917471 systemd-journald[236]: Runtime Journal (/run/log/journal/f00f97c8d9104e05a00ff76a63f7ba10) is 8.0M, max 76.5M, 68.5M free.
Jan 13 20:16:52.884686 systemd-modules-load[237]: Inserted module 'overlay'
Jan 13 20:16:52.911162 systemd-modules-load[237]: Inserted module 'br_netfilter'
Jan 13 20:16:52.925774 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 13 20:16:52.927211 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 13 20:16:52.927791 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:16:52.928874 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 20:16:52.935282 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:16:52.949004 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 20:16:52.954989 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 13 20:16:52.967540 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 13 20:16:52.971665 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:16:52.975020 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 13 20:16:52.979145 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:16:52.984312 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:16:52.991960 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 13 20:16:53.011070 dracut-cmdline[271]: dracut-dracut-053
Jan 13 20:16:53.017011 dracut-cmdline[271]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=9798117b3b15ef802e3d618077f87253cc08e0d5280b8fe28b307e7558b7ebcc
Jan 13 20:16:53.033404 systemd-resolved[275]: Positive Trust Anchors:
Jan 13 20:16:53.033425 systemd-resolved[275]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 20:16:53.033456 systemd-resolved[275]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 13 20:16:53.040954 systemd-resolved[275]: Defaulting to hostname 'linux'.
Jan 13 20:16:53.042981 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 13 20:16:53.043645 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:16:53.117766 kernel: SCSI subsystem initialized
Jan 13 20:16:53.122750 kernel: Loading iSCSI transport class v2.0-870.
Jan 13 20:16:53.131382 kernel: iscsi: registered transport (tcp)
Jan 13 20:16:53.145757 kernel: iscsi: registered transport (qla4xxx)
Jan 13 20:16:53.145824 kernel: QLogic iSCSI HBA Driver
Jan 13 20:16:53.190355 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:16:53.196986 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 13 20:16:53.216013 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 13 20:16:53.216131 kernel: device-mapper: uevent: version 1.0.3
Jan 13 20:16:53.216146 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 13 20:16:53.269775 kernel: raid6: neonx8 gen() 15468 MB/s
Jan 13 20:16:53.286787 kernel: raid6: neonx4 gen() 15234 MB/s
Jan 13 20:16:53.303807 kernel: raid6: neonx2 gen() 12965 MB/s
Jan 13 20:16:53.320849 kernel: raid6: neonx1 gen() 10393 MB/s
Jan 13 20:16:53.337791 kernel: raid6: int64x8 gen() 6675 MB/s
Jan 13 20:16:53.356170 kernel: raid6: int64x4 gen() 7189 MB/s
Jan 13 20:16:53.371764 kernel: raid6: int64x2 gen() 6077 MB/s
Jan 13 20:16:53.388758 kernel: raid6: int64x1 gen() 5033 MB/s
Jan 13 20:16:53.388837 kernel: raid6: using algorithm neonx8 gen() 15468 MB/s
Jan 13 20:16:53.405789 kernel: raid6: .... xor() 11865 MB/s, rmw enabled
Jan 13 20:16:53.405883 kernel: raid6: using neon recovery algorithm
Jan 13 20:16:53.410735 kernel: xor: measuring software checksum speed
Jan 13 20:16:53.410780 kernel: 8regs : 21584 MB/sec
Jan 13 20:16:53.411844 kernel: 32regs : 19017 MB/sec
Jan 13 20:16:53.411875 kernel: arm64_neon : 27766 MB/sec
Jan 13 20:16:53.411892 kernel: xor: using function: arm64_neon (27766 MB/sec)
Jan 13 20:16:53.464747 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 13 20:16:53.478508 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:16:53.483955 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:16:53.508205 systemd-udevd[457]: Using default interface naming scheme 'v255'.
Jan 13 20:16:53.511729 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:16:53.522374 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 13 20:16:53.539496 dracut-pre-trigger[464]: rd.md=0: removing MD RAID activation
Jan 13 20:16:53.581747 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:16:53.588945 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 13 20:16:53.643185 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:16:53.651442 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 13 20:16:53.671673 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:16:53.673539 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:16:53.675419 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:16:53.676124 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 13 20:16:53.685063 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 13 20:16:53.702234 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:16:53.784884 kernel: scsi host0: Virtio SCSI HBA
Jan 13 20:16:53.788363 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Jan 13 20:16:53.788438 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Jan 13 20:16:53.806724 kernel: ACPI: bus type USB registered
Jan 13 20:16:53.807946 kernel: usbcore: registered new interface driver usbfs
Jan 13 20:16:53.807987 kernel: usbcore: registered new interface driver hub
Jan 13 20:16:53.810762 kernel: usbcore: registered new device driver usb
Jan 13 20:16:53.813996 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:16:53.814873 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:16:53.817170 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 20:16:53.817745 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:16:53.817913 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:16:53.819756 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:16:53.827010 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 20:16:53.842034 kernel: sr 0:0:0:0: Power-on or device reset occurred
Jan 13 20:16:53.857826 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Jan 13 20:16:53.857944 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Jan 13 20:16:53.858043 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 13 20:16:53.858054 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Jan 13 20:16:53.858198 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Jan 13 20:16:53.858292 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Jan 13 20:16:53.858381 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Jan 13 20:16:53.858459 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Jan 13 20:16:53.858535 kernel: hub 1-0:1.0: USB hub found
Jan 13 20:16:53.858639 kernel: hub 1-0:1.0: 4 ports detected
Jan 13 20:16:53.859169 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Jan 13 20:16:53.859338 kernel: hub 2-0:1.0: USB hub found
Jan 13 20:16:53.859445 kernel: hub 2-0:1.0: 4 ports detected
Jan 13 20:16:53.859531 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 13 20:16:53.859732 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:16:53.863919 kernel: sd 0:0:0:1: Power-on or device reset occurred
Jan 13 20:16:53.874867 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Jan 13 20:16:53.875001 kernel: sd 0:0:0:1: [sda] Write Protect is off
Jan 13 20:16:53.875111 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Jan 13 20:16:53.875211 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Jan 13 20:16:53.875308 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 13 20:16:53.875319 kernel: GPT:17805311 != 80003071
Jan 13 20:16:53.875328 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 13 20:16:53.875338 kernel: GPT:17805311 != 80003071
Jan 13 20:16:53.875351 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 13 20:16:53.875361 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 13 20:16:53.875376 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Jan 13 20:16:53.869953 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 20:16:53.885263 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:16:53.929759 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (503)
Jan 13 20:16:53.933772 kernel: BTRFS: device fsid 2be7cc1c-29d4-4496-b29b-8561323213d2 devid 1 transid 38 /dev/sda3 scanned by (udev-worker) (502)
Jan 13 20:16:53.937148 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Jan 13 20:16:53.949539 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Jan 13 20:16:53.958274 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Jan 13 20:16:53.959901 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Jan 13 20:16:53.966691 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Jan 13 20:16:53.976005 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 13 20:16:53.990578 disk-uuid[575]: Primary Header is updated.
Jan 13 20:16:53.990578 disk-uuid[575]: Secondary Entries is updated.
Jan 13 20:16:53.990578 disk-uuid[575]: Secondary Header is updated.
Jan 13 20:16:54.094780 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Jan 13 20:16:54.339235 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Jan 13 20:16:54.475046 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Jan 13 20:16:54.475112 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Jan 13 20:16:54.476731 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Jan 13 20:16:54.533041 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Jan 13 20:16:54.533386 kernel: usbcore: registered new interface driver usbhid
Jan 13 20:16:54.533404 kernel: usbhid: USB HID core driver
Jan 13 20:16:55.007729 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 13 20:16:55.009439 disk-uuid[576]: The operation has completed successfully.
Jan 13 20:16:55.081898 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 13 20:16:55.083738 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 13 20:16:55.097151 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jan 13 20:16:55.103748 sh[584]: Success
Jan 13 20:16:55.123781 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Jan 13 20:16:55.188532 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jan 13 20:16:55.199388 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jan 13 20:16:55.200291 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jan 13 20:16:55.234788 kernel: BTRFS info (device dm-0): first mount of filesystem 2be7cc1c-29d4-4496-b29b-8561323213d2
Jan 13 20:16:55.234878 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Jan 13 20:16:55.236180 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Jan 13 20:16:55.236221 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 13 20:16:55.236721 kernel: BTRFS info (device dm-0): using free space tree
Jan 13 20:16:55.243731 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Jan 13 20:16:55.245571 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jan 13 20:16:55.246785 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jan 13 20:16:55.253152 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 13 20:16:55.258001 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 13 20:16:55.271677 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 13 20:16:55.271768 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 13 20:16:55.271784 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:16:55.279209 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:16:55.279283 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 13 20:16:55.292990 systemd[1]: mnt-oem.mount: Deactivated successfully.
Jan 13 20:16:55.297887 kernel: BTRFS info (device sda6): last unmount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 13 20:16:55.309247 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 13 20:16:55.317952 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 13 20:16:55.414191 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:16:55.428797 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 13 20:16:55.436333 ignition[668]: Ignition 2.20.0
Jan 13 20:16:55.436343 ignition[668]: Stage: fetch-offline
Jan 13 20:16:55.438355 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:16:55.436384 ignition[668]: no configs at "/usr/lib/ignition/base.d"
Jan 13 20:16:55.436393 ignition[668]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jan 13 20:16:55.436556 ignition[668]: parsed url from cmdline: ""
Jan 13 20:16:55.436559 ignition[668]: no config URL provided
Jan 13 20:16:55.436563 ignition[668]: reading system config file "/usr/lib/ignition/user.ign"
Jan 13 20:16:55.436570 ignition[668]: no config at "/usr/lib/ignition/user.ign"
Jan 13 20:16:55.436574 ignition[668]: failed to fetch config: resource requires networking
Jan 13 20:16:55.436778 ignition[668]: Ignition finished successfully
Jan 13 20:16:55.454791 systemd-networkd[772]: lo: Link UP
Jan 13 20:16:55.454805 systemd-networkd[772]: lo: Gained carrier
Jan 13 20:16:55.457988 systemd-networkd[772]: Enumeration completed
Jan 13 20:16:55.458932 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 13 20:16:55.461611 systemd-networkd[772]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 13 20:16:55.461628 systemd-networkd[772]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 13 20:16:55.462514 systemd[1]: Reached target network.target - Network.
Jan 13 20:16:55.463924 systemd-networkd[772]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 13 20:16:55.463933 systemd-networkd[772]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 13 20:16:55.464799 systemd-networkd[772]: eth0: Link UP
Jan 13 20:16:55.464802 systemd-networkd[772]: eth0: Gained carrier
Jan 13 20:16:55.464811 systemd-networkd[772]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 13 20:16:55.469361 systemd-networkd[772]: eth1: Link UP
Jan 13 20:16:55.469364 systemd-networkd[772]: eth1: Gained carrier
Jan 13 20:16:55.469375 systemd-networkd[772]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 13 20:16:55.471950 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jan 13 20:16:55.480784 systemd-networkd[772]: eth0: DHCPv4 address 138.199.153.203/32, gateway 172.31.1.1 acquired from 172.31.1.1
Jan 13 20:16:55.488355 ignition[775]: Ignition 2.20.0
Jan 13 20:16:55.488366 ignition[775]: Stage: fetch
Jan 13 20:16:55.488662 ignition[775]: no configs at "/usr/lib/ignition/base.d"
Jan 13 20:16:55.488674 ignition[775]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jan 13 20:16:55.488862 ignition[775]: parsed url from cmdline: ""
Jan 13 20:16:55.488866 ignition[775]: no config URL provided
Jan 13 20:16:55.488872 ignition[775]: reading system config file "/usr/lib/ignition/user.ign"
Jan 13 20:16:55.488881 ignition[775]: no config at "/usr/lib/ignition/user.ign"
Jan 13 20:16:55.488971 ignition[775]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Jan 13 20:16:55.495644 ignition[775]: GET result: OK
Jan 13 20:16:55.496548 ignition[775]: parsing config with SHA512: d6a247195694b5fc7415ee0ffe34fbf537b455d5c701b74a667c963822dae5a7ae1203ffaff4165b79779a655a152155023e86fd1eb8e3f04fef8a499eead693
Jan 13 20:16:55.503408 unknown[775]: fetched base config from "system"
Jan 13 20:16:55.503799 ignition[775]: fetch: fetch complete
Jan 13 20:16:55.503421 unknown[775]: fetched base config from "system"
Jan 13 20:16:55.503804 ignition[775]: fetch: fetch passed
Jan 13 20:16:55.503427 unknown[775]: fetched user config from "hetzner"
Jan 13 20:16:55.503858 ignition[775]: Ignition finished successfully
Jan 13 20:16:55.507393 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jan 13 20:16:55.513981 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 20:16:55.522799 systemd-networkd[772]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jan 13 20:16:55.530671 ignition[782]: Ignition 2.20.0 Jan 13 20:16:55.530681 ignition[782]: Stage: kargs Jan 13 20:16:55.530897 ignition[782]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:16:55.530908 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 13 20:16:55.535558 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 20:16:55.532995 ignition[782]: kargs: kargs passed Jan 13 20:16:55.533066 ignition[782]: Ignition finished successfully Jan 13 20:16:55.541014 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 20:16:55.553877 ignition[788]: Ignition 2.20.0 Jan 13 20:16:55.553888 ignition[788]: Stage: disks Jan 13 20:16:55.554102 ignition[788]: no configs at "/usr/lib/ignition/base.d" Jan 13 20:16:55.554115 ignition[788]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 13 20:16:55.556179 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 20:16:55.555097 ignition[788]: disks: disks passed Jan 13 20:16:55.557391 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 20:16:55.555156 ignition[788]: Ignition finished successfully Jan 13 20:16:55.558281 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 20:16:55.559135 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:16:55.561378 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:16:55.562200 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:16:55.568121 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Jan 13 20:16:55.600919 systemd-fsck[797]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Jan 13 20:16:55.608825 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jan 13 20:16:55.615930 systemd[1]: Mounting sysroot.mount - /sysroot...
Jan 13 20:16:55.662735 kernel: EXT4-fs (sda9): mounted filesystem f9a95e53-2d63-4443-b523-cb2108fb48f6 r/w with ordered data mode. Quota mode: none.
Jan 13 20:16:55.663875 systemd[1]: Mounted sysroot.mount - /sysroot.
Jan 13 20:16:55.665968 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jan 13 20:16:55.676920 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 20:16:55.681917 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jan 13 20:16:55.686929 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Jan 13 20:16:55.688978 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jan 13 20:16:55.689021 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:16:55.695570 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jan 13 20:16:55.699732 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (805)
Jan 13 20:16:55.702633 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 13 20:16:55.702712 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 13 20:16:55.702744 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:16:55.707078 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:16:55.707148 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 13 20:16:55.708005 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jan 13 20:16:55.713646 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:16:55.773774 coreos-metadata[807]: Jan 13 20:16:55.773 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Jan 13 20:16:55.776425 initrd-setup-root[832]: cut: /sysroot/etc/passwd: No such file or directory
Jan 13 20:16:55.777314 coreos-metadata[807]: Jan 13 20:16:55.776 INFO Fetch successful
Jan 13 20:16:55.777314 coreos-metadata[807]: Jan 13 20:16:55.776 INFO wrote hostname ci-4186-1-0-7-a3f46aeb9c to /sysroot/etc/hostname
Jan 13 20:16:55.779959 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jan 13 20:16:55.784568 initrd-setup-root[840]: cut: /sysroot/etc/group: No such file or directory
Jan 13 20:16:55.790301 initrd-setup-root[847]: cut: /sysroot/etc/shadow: No such file or directory
Jan 13 20:16:55.794779 initrd-setup-root[854]: cut: /sysroot/etc/gshadow: No such file or directory
Jan 13 20:16:55.904009 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 13 20:16:55.914006 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 13 20:16:55.919379 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 13 20:16:55.928745 kernel: BTRFS info (device sda6): last unmount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 13 20:16:55.955184 ignition[922]: INFO : Ignition 2.20.0
Jan 13 20:16:55.955184 ignition[922]: INFO : Stage: mount
Jan 13 20:16:55.956538 ignition[922]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:16:55.956538 ignition[922]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jan 13 20:16:55.955211 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 13 20:16:55.959045 ignition[922]: INFO : mount: mount passed
Jan 13 20:16:55.959045 ignition[922]: INFO : Ignition finished successfully
Jan 13 20:16:55.959323 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 13 20:16:55.965877 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 13 20:16:56.234682 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 13 20:16:56.242042 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 20:16:56.254025 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (934)
Jan 13 20:16:56.255825 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 13 20:16:56.255864 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 13 20:16:56.257488 kernel: BTRFS info (device sda6): using free space tree
Jan 13 20:16:56.261166 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 20:16:56.261242 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 13 20:16:56.264479 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 20:16:56.287163 ignition[951]: INFO : Ignition 2.20.0
Jan 13 20:16:56.287163 ignition[951]: INFO : Stage: files
Jan 13 20:16:56.287163 ignition[951]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:16:56.287163 ignition[951]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jan 13 20:16:56.290370 ignition[951]: DEBUG : files: compiled without relabeling support, skipping
Jan 13 20:16:56.290370 ignition[951]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 13 20:16:56.290370 ignition[951]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 13 20:16:56.294499 ignition[951]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 13 20:16:56.295559 ignition[951]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 13 20:16:56.295559 ignition[951]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 13 20:16:56.294921 unknown[951]: wrote ssh authorized keys file for user: core
Jan 13 20:16:56.298635 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Jan 13 20:16:56.298635 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Jan 13 20:16:56.371945 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 13 20:16:56.594273 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Jan 13 20:16:56.594273 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Jan 13 20:16:56.596667 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1
Jan 13 20:16:56.823239 systemd-networkd[772]: eth1: Gained IPv6LL
Jan 13 20:16:57.013289 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 13 20:16:57.362304 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Jan 13 20:16:57.362304 ignition[951]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 20:16:57.366891 ignition[951]: INFO : files: files passed
Jan 13 20:16:57.366891 ignition[951]: INFO : Ignition finished successfully
Jan 13 20:16:57.368617 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 13 20:16:57.377903 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 13 20:16:57.381930 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 13 20:16:57.388203 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 13 20:16:57.389890 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 13 20:16:57.399765 systemd-networkd[772]: eth0: Gained IPv6LL
Jan 13 20:16:57.406366 initrd-setup-root-after-ignition[980]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:16:57.406366 initrd-setup-root-after-ignition[980]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:16:57.408954 initrd-setup-root-after-ignition[984]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 20:16:57.411549 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:16:57.412949 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 13 20:16:57.418935 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 13 20:16:57.464503 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 13 20:16:57.464664 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 13 20:16:57.466271 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 13 20:16:57.467279 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 13 20:16:57.468478 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 13 20:16:57.469932 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 13 20:16:57.498632 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:16:57.502947 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 13 20:16:57.519941 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 13 20:16:57.521552 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:16:57.522431 systemd[1]: Stopped target timers.target - Timer Units.
Jan 13 20:16:57.523602 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 13 20:16:57.523768 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 20:16:57.525628 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 13 20:16:57.526433 systemd[1]: Stopped target basic.target - Basic System.
Jan 13 20:16:57.527686 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 13 20:16:57.529294 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 20:16:57.530462 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 13 20:16:57.531593 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 13 20:16:57.532746 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 20:16:57.534010 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 13 20:16:57.534994 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 13 20:16:57.536163 systemd[1]: Stopped target swap.target - Swaps.
Jan 13 20:16:57.537157 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 13 20:16:57.537304 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 20:16:57.538592 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 13 20:16:57.539343 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:16:57.540487 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 13 20:16:57.540937 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:16:57.541699 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 13 20:16:57.541866 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 13 20:16:57.543534 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 13 20:16:57.543680 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 20:16:57.544999 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 13 20:16:57.545169 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 13 20:16:57.546234 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Jan 13 20:16:57.546335 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jan 13 20:16:57.554006 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 13 20:16:57.557384 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 13 20:16:57.558038 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 13 20:16:57.558218 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 20:16:57.560148 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 13 20:16:57.560273 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 20:16:57.575589 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 13 20:16:57.575743 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 13 20:16:57.583409 ignition[1004]: INFO : Ignition 2.20.0
Jan 13 20:16:57.583409 ignition[1004]: INFO : Stage: umount
Jan 13 20:16:57.583409 ignition[1004]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 20:16:57.583409 ignition[1004]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jan 13 20:16:57.583409 ignition[1004]: INFO : umount: umount passed
Jan 13 20:16:57.583409 ignition[1004]: INFO : Ignition finished successfully
Jan 13 20:16:57.585183 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 13 20:16:57.586789 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 13 20:16:57.587921 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 13 20:16:57.587981 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 13 20:16:57.590944 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 13 20:16:57.591015 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 13 20:16:57.592047 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jan 13 20:16:57.592119 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jan 13 20:16:57.592944 systemd[1]: Stopped target network.target - Network.
Jan 13 20:16:57.594456 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 13 20:16:57.594528 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 20:16:57.596828 systemd[1]: Stopped target paths.target - Path Units.
Jan 13 20:16:57.597563 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 13 20:16:57.601100 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:16:57.601981 systemd[1]: Stopped target slices.target - Slice Units.
Jan 13 20:16:57.603030 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 13 20:16:57.625375 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 13 20:16:57.625422 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 20:16:57.626463 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 13 20:16:57.626499 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 20:16:57.628204 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 13 20:16:57.628270 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 13 20:16:57.629229 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 13 20:16:57.629270 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 13 20:16:57.631368 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 13 20:16:57.635612 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 13 20:16:57.638449 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 13 20:16:57.638774 systemd-networkd[772]: eth1: DHCPv6 lease lost
Jan 13 20:16:57.639400 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 13 20:16:57.639495 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 13 20:16:57.640685 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 13 20:16:57.641017 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 13 20:16:57.643051 systemd-networkd[772]: eth0: DHCPv6 lease lost
Jan 13 20:16:57.645258 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 13 20:16:57.645386 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 13 20:16:57.648189 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 13 20:16:57.648368 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 13 20:16:57.650858 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 13 20:16:57.650926 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:16:57.657895 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 13 20:16:57.658488 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 13 20:16:57.658570 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 20:16:57.664476 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 13 20:16:57.664630 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 13 20:16:57.665629 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 13 20:16:57.665687 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:16:57.667668 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 13 20:16:57.667769 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 20:16:57.671452 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 20:16:57.687566 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 13 20:16:57.687894 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 13 20:16:57.691348 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 13 20:16:57.692141 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 20:16:57.693535 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 13 20:16:57.693975 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:16:57.695875 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 13 20:16:57.695957 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:16:57.696872 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 13 20:16:57.696940 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 20:16:57.699025 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 13 20:16:57.699132 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 13 20:16:57.701328 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 20:16:57.701389 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 20:16:57.711006 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 13 20:16:57.711591 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 13 20:16:57.711664 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 20:16:57.714379 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 13 20:16:57.714461 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 20:16:57.715491 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 13 20:16:57.715550 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:16:57.716724 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 20:16:57.716771 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 20:16:57.724857 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 13 20:16:57.725015 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 13 20:16:57.726367 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 13 20:16:57.735037 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 13 20:16:57.746844 systemd[1]: Switching root.
Jan 13 20:16:57.783803 systemd-journald[236]: Journal stopped
Jan 13 20:16:58.847486 systemd-journald[236]: Received SIGTERM from PID 1 (systemd).
Jan 13 20:16:58.847580 kernel: SELinux: policy capability network_peer_controls=1
Jan 13 20:16:58.847600 kernel: SELinux: policy capability open_perms=1
Jan 13 20:16:58.847614 kernel: SELinux: policy capability extended_socket_class=1
Jan 13 20:16:58.847624 kernel: SELinux: policy capability always_check_network=0
Jan 13 20:16:58.847633 kernel: SELinux: policy capability cgroup_seclabel=1
Jan 13 20:16:58.847643 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jan 13 20:16:58.847653 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jan 13 20:16:58.847668 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jan 13 20:16:58.847677 kernel: audit: type=1403 audit(1736799418.005:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 13 20:16:58.847688 systemd[1]: Successfully loaded SELinux policy in 39.853ms.
Jan 13 20:16:58.848213 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 11.547ms.
Jan 13 20:16:58.848242 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 13 20:16:58.848254 systemd[1]: Detected virtualization kvm.
Jan 13 20:16:58.848265 systemd[1]: Detected architecture arm64.
Jan 13 20:16:58.848275 systemd[1]: Detected first boot.
Jan 13 20:16:58.848298 systemd[1]: Hostname set to .
Jan 13 20:16:58.848309 systemd[1]: Initializing machine ID from VM UUID.
Jan 13 20:16:58.848320 zram_generator::config[1046]: No configuration found.
Jan 13 20:16:58.848333 systemd[1]: Populated /etc with preset unit settings.
Jan 13 20:16:58.848344 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 13 20:16:58.848354 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 13 20:16:58.848369 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 13 20:16:58.848381 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 13 20:16:58.848391 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 13 20:16:58.848403 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 13 20:16:58.848414 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 13 20:16:58.848425 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 13 20:16:58.848435 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 13 20:16:58.848446 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 13 20:16:58.848456 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 13 20:16:58.848466 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 20:16:58.848479 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 20:16:58.848490 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 13 20:16:58.848503 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 13 20:16:58.848514 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 13 20:16:58.848525 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 13 20:16:58.848536 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Jan 13 20:16:58.848546 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 20:16:58.848557 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jan 13 20:16:58.848568 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jan 13 20:16:58.848580 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jan 13 20:16:58.848590 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 13 20:16:58.848601 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 20:16:58.848611 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 13 20:16:58.848622 systemd[1]: Reached target slices.target - Slice Units.
Jan 13 20:16:58.848632 systemd[1]: Reached target swap.target - Swaps.
Jan 13 20:16:58.848643 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 13 20:16:58.848658 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 13 20:16:58.848670 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 20:16:58.848682 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 13 20:16:58.848692 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 20:16:58.848815 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 13 20:16:58.848832 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 13 20:16:58.848842 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 13 20:16:58.848853 systemd[1]: Mounting media.mount - External Media Directory...
Jan 13 20:16:58.848863 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 13 20:16:58.848873 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 13 20:16:58.848887 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 13 20:16:58.848897 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 13 20:16:58.848908 systemd[1]: Reached target machines.target - Containers.
Jan 13 20:16:58.848918 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 13 20:16:58.848928 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 13 20:16:58.848939 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 13 20:16:58.848950 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 13 20:16:58.848960 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 20:16:58.848972 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 20:16:58.848986 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 20:16:58.848998 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 13 20:16:58.849009 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 20:16:58.849020 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 13 20:16:58.849032 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 13 20:16:58.849042 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jan 13 20:16:58.849053 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jan 13 20:16:58.849077 systemd[1]: Stopped systemd-fsck-usr.service.
Jan 13 20:16:58.849088 kernel: fuse: init (API version 7.39)
Jan 13 20:16:58.849099 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 13 20:16:58.849110 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 13 20:16:58.849121 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 13 20:16:58.849132 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 13 20:16:58.849144 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 13 20:16:58.849155 systemd[1]: verity-setup.service: Deactivated successfully.
Jan 13 20:16:58.849165 systemd[1]: Stopped verity-setup.service.
Jan 13 20:16:58.849175 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 13 20:16:58.849185 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 13 20:16:58.849195 systemd[1]: Mounted media.mount - External Media Directory.
Jan 13 20:16:58.849207 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 13 20:16:58.849220 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 13 20:16:58.849229 kernel: loop: module loaded
Jan 13 20:16:58.849239 kernel: ACPI: bus type drm_connector registered
Jan 13 20:16:58.849249 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 13 20:16:58.849259 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 20:16:58.849270 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 13 20:16:58.849280 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 13 20:16:58.849293 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 20:16:58.849303 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 20:16:58.849313 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 13 20:16:58.849324 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 13 20:16:58.849337 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 20:16:58.849347 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 20:16:58.849360 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 13 20:16:58.849372 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 13 20:16:58.849382 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 20:16:58.849392 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 20:16:58.849442 systemd-journald[1109]: Collecting audit messages is disabled.
Jan 13 20:16:58.849471 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 13 20:16:58.849482 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 13 20:16:58.849495 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 13 20:16:58.849506 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 13 20:16:58.849518 systemd-journald[1109]: Journal started Jan 13 20:16:58.849546 systemd-journald[1109]: Runtime Journal (/run/log/journal/f00f97c8d9104e05a00ff76a63f7ba10) is 8.0M, max 76.5M, 68.5M free. Jan 13 20:16:58.573546 systemd[1]: Queued start job for default target multi-user.target. Jan 13 20:16:58.597077 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 13 20:16:58.597517 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 13 20:16:58.857858 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 13 20:16:58.866864 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 13 20:16:58.866955 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 13 20:16:58.866970 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 20:16:58.870844 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 13 20:16:58.878401 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 13 20:16:58.885818 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 13 20:16:58.885924 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:16:58.900341 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 13 20:16:58.900416 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 20:16:58.905812 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Jan 13 20:16:58.905901 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 20:16:58.920011 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 20:16:58.927860 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 13 20:16:58.933884 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 20:16:58.933973 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 20:16:58.946179 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 13 20:16:58.949356 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 13 20:16:58.950319 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 13 20:16:58.953844 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 13 20:16:58.966371 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 20:16:58.972214 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 13 20:16:58.995571 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 13 20:16:59.002746 kernel: loop0: detected capacity change from 0 to 116784 Jan 13 20:16:59.013194 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 13 20:16:59.030010 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 13 20:16:59.037172 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 13 20:16:59.039338 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Jan 13 20:16:59.069730 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 13 20:16:59.070966 systemd-journald[1109]: Time spent on flushing to /var/log/journal/f00f97c8d9104e05a00ff76a63f7ba10 is 20.715ms for 1130 entries. Jan 13 20:16:59.070966 systemd-journald[1109]: System Journal (/var/log/journal/f00f97c8d9104e05a00ff76a63f7ba10) is 8.0M, max 584.8M, 576.8M free. Jan 13 20:16:59.100932 systemd-journald[1109]: Received client request to flush runtime journal. Jan 13 20:16:59.082187 systemd-tmpfiles[1143]: ACLs are not supported, ignoring. Jan 13 20:16:59.082199 systemd-tmpfiles[1143]: ACLs are not supported, ignoring. Jan 13 20:16:59.090603 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 13 20:16:59.093081 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 13 20:16:59.097445 udevadm[1171]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Jan 13 20:16:59.105970 kernel: loop1: detected capacity change from 0 to 8 Jan 13 20:16:59.107864 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 13 20:16:59.112370 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 20:16:59.121104 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 13 20:16:59.128959 kernel: loop2: detected capacity change from 0 to 194096 Jan 13 20:16:59.183568 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 13 20:16:59.191472 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 20:16:59.194769 kernel: loop3: detected capacity change from 0 to 113552 Jan 13 20:16:59.223452 systemd-tmpfiles[1184]: ACLs are not supported, ignoring. Jan 13 20:16:59.223470 systemd-tmpfiles[1184]: ACLs are not supported, ignoring. 
Jan 13 20:16:59.230744 kernel: loop4: detected capacity change from 0 to 116784 Jan 13 20:16:59.232395 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 20:16:59.246753 kernel: loop5: detected capacity change from 0 to 8 Jan 13 20:16:59.249785 kernel: loop6: detected capacity change from 0 to 194096 Jan 13 20:16:59.286201 kernel: loop7: detected capacity change from 0 to 113552 Jan 13 20:16:59.313261 (sd-merge)[1187]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Jan 13 20:16:59.315164 (sd-merge)[1187]: Merged extensions into '/usr'. Jan 13 20:16:59.324076 systemd[1]: Reloading requested from client PID 1142 ('systemd-sysext') (unit systemd-sysext.service)... Jan 13 20:16:59.324255 systemd[1]: Reloading... Jan 13 20:16:59.426742 zram_generator::config[1214]: No configuration found. Jan 13 20:16:59.585463 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:16:59.643623 systemd[1]: Reloading finished in 318 ms. Jan 13 20:16:59.651414 ldconfig[1138]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 13 20:16:59.669978 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 13 20:16:59.671652 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 13 20:16:59.683043 systemd[1]: Starting ensure-sysext.service... Jan 13 20:16:59.688175 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 20:16:59.705857 systemd[1]: Reloading requested from client PID 1251 ('systemctl') (unit ensure-sysext.service)... Jan 13 20:16:59.705879 systemd[1]: Reloading... 
Jan 13 20:16:59.736612 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 13 20:16:59.736903 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 13 20:16:59.737633 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 13 20:16:59.738347 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. Jan 13 20:16:59.738479 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. Jan 13 20:16:59.745531 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot. Jan 13 20:16:59.745544 systemd-tmpfiles[1252]: Skipping /boot Jan 13 20:16:59.763630 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot. Jan 13 20:16:59.763944 systemd-tmpfiles[1252]: Skipping /boot Jan 13 20:16:59.807758 zram_generator::config[1279]: No configuration found. Jan 13 20:16:59.922902 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 20:16:59.981359 systemd[1]: Reloading finished in 275 ms. Jan 13 20:17:00.004032 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 13 20:17:00.010384 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 20:17:00.026974 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 20:17:00.039328 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 13 20:17:00.047281 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 13 20:17:00.057997 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 13 20:17:00.067106 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 20:17:00.076044 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 13 20:17:00.082340 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 13 20:17:00.090368 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:17:00.096797 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:17:00.101061 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:17:00.102915 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:17:00.110238 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 13 20:17:00.112645 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:17:00.113788 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:17:00.123145 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 13 20:17:00.134526 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 13 20:17:00.137275 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 13 20:17:00.144108 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:17:00.145579 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:17:00.149241 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 13 20:17:00.150626 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Jan 13 20:17:00.152749 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 20:17:00.159808 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:17:00.160756 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:17:00.171280 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 13 20:17:00.185188 systemd-udevd[1328]: Using default interface naming scheme 'v255'. Jan 13 20:17:00.185964 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 13 20:17:00.192216 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:17:00.207318 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:17:00.208387 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:17:00.210430 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 13 20:17:00.212456 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 20:17:00.213191 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 20:17:00.215450 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:17:00.216801 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:17:00.227359 systemd[1]: Finished ensure-sysext.service. Jan 13 20:17:00.230470 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 13 20:17:00.233312 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 13 20:17:00.235383 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 20:17:00.236060 augenrules[1357]: No rules Jan 13 20:17:00.236253 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Jan 13 20:17:00.239302 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 20:17:00.239566 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 20:17:00.240985 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:17:00.241689 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:17:00.251990 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 20:17:00.252167 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 20:17:00.259039 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 13 20:17:00.261896 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 13 20:17:00.278571 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 20:17:00.294911 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 20:17:00.322315 systemd-resolved[1322]: Positive Trust Anchors: Jan 13 20:17:00.322673 systemd-resolved[1322]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 20:17:00.322788 systemd-resolved[1322]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 20:17:00.330308 systemd-resolved[1322]: Using system hostname 'ci-4186-1-0-7-a3f46aeb9c'. Jan 13 20:17:00.350480 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 13 20:17:00.351307 systemd[1]: Reached target time-set.target - System Time Set. Jan 13 20:17:00.354504 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 20:17:00.355367 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 20:17:00.393003 systemd-networkd[1377]: lo: Link UP Jan 13 20:17:00.393016 systemd-networkd[1377]: lo: Gained carrier Jan 13 20:17:00.395300 systemd-networkd[1377]: Enumeration completed Jan 13 20:17:00.395431 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 20:17:00.396184 systemd[1]: Reached target network.target - Network. Jan 13 20:17:00.412120 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 13 20:17:00.413216 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 13 20:17:00.440802 systemd-networkd[1377]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 13 20:17:00.440819 systemd-networkd[1377]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 13 20:17:00.441763 systemd-networkd[1377]: eth0: Link UP Jan 13 20:17:00.441774 systemd-networkd[1377]: eth0: Gained carrier Jan 13 20:17:00.441797 systemd-networkd[1377]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 13 20:17:00.473314 systemd-networkd[1377]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 13 20:17:00.473327 systemd-networkd[1377]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 13 20:17:00.474914 systemd-networkd[1377]: eth1: Link UP Jan 13 20:17:00.474923 systemd-networkd[1377]: eth1: Gained carrier Jan 13 20:17:00.474948 systemd-networkd[1377]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 13 20:17:00.494927 systemd-networkd[1377]: eth0: DHCPv4 address 138.199.153.203/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 13 20:17:00.497421 systemd-timesyncd[1375]: Network configuration changed, trying to establish connection. Jan 13 20:17:00.524875 systemd-networkd[1377]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jan 13 20:17:00.525481 systemd-timesyncd[1375]: Network configuration changed, trying to establish connection. Jan 13 20:17:00.526007 systemd-timesyncd[1375]: Network configuration changed, trying to establish connection. Jan 13 20:17:00.540745 kernel: mousedev: PS/2 mouse device common for all mice Jan 13 20:17:00.567763 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1393) Jan 13 20:17:00.624084 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. 
Jan 13 20:17:00.624245 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 13 20:17:00.628114 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 20:17:00.632294 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 20:17:00.640459 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 20:17:00.641599 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 20:17:00.641641 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 13 20:17:00.646187 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 20:17:00.648778 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 20:17:00.652622 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 20:17:00.653953 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 20:17:00.656890 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 20:17:00.658562 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 20:17:00.658825 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 20:17:00.663451 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Jan 13 20:17:00.675844 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Jan 13 20:17:00.675916 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 13 20:17:00.675938 kernel: [drm] features: -context_init Jan 13 20:17:00.678750 kernel: [drm] number of scanouts: 1 Jan 13 20:17:00.678821 kernel: [drm] number of cap sets: 0 Jan 13 20:17:00.688872 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Jan 13 20:17:00.707749 kernel: Console: switching to colour frame buffer device 160x50 Jan 13 20:17:00.709247 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 13 20:17:00.716361 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 13 20:17:00.726138 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 13 20:17:00.734101 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:17:00.744369 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 20:17:00.744597 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:17:00.753203 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 20:17:00.757750 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 13 20:17:00.829837 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 20:17:00.890228 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Jan 13 20:17:00.896995 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Jan 13 20:17:00.926898 lvm[1440]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 13 20:17:00.960400 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. 
Jan 13 20:17:00.962429 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 20:17:00.963337 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 20:17:00.964450 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 13 20:17:00.965421 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 13 20:17:00.966428 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 13 20:17:00.967157 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 13 20:17:00.967823 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 13 20:17:00.968451 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 13 20:17:00.968489 systemd[1]: Reached target paths.target - Path Units. Jan 13 20:17:00.968993 systemd[1]: Reached target timers.target - Timer Units. Jan 13 20:17:00.970791 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 13 20:17:00.973273 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 13 20:17:00.978917 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 13 20:17:00.981826 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 13 20:17:00.983336 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 13 20:17:00.984201 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 20:17:00.984836 systemd[1]: Reached target basic.target - Basic System. Jan 13 20:17:00.985526 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 13 20:17:00.985556 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. 
Jan 13 20:17:00.993530 systemd[1]: Starting containerd.service - containerd container runtime... Jan 13 20:17:00.998094 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 13 20:17:00.998841 lvm[1444]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 13 20:17:01.003002 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 13 20:17:01.011946 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 13 20:17:01.016888 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 13 20:17:01.017821 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 13 20:17:01.024009 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 13 20:17:01.027030 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 13 20:17:01.038115 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jan 13 20:17:01.039331 jq[1448]: false Jan 13 20:17:01.043997 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 13 20:17:01.048001 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 13 20:17:01.055984 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 13 20:17:01.057412 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 13 20:17:01.059230 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 13 20:17:01.065671 systemd[1]: Starting update-engine.service - Update Engine... 
Jan 13 20:17:01.067889 dbus-daemon[1447]: [system] SELinux support is enabled Jan 13 20:17:01.074883 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 13 20:17:01.085779 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 13 20:17:01.092790 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 13 20:17:01.097368 coreos-metadata[1446]: Jan 13 20:17:01.097 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jan 13 20:17:01.101438 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 13 20:17:01.103765 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 13 20:17:01.109432 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 13 20:17:01.112110 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 13 20:17:01.122748 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
Jan 13 20:17:01.124281 extend-filesystems[1449]: Found loop4 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found loop5 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found loop6 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found loop7 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found sda Jan 13 20:17:01.124281 extend-filesystems[1449]: Found sda1 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found sda2 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found sda3 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found usr Jan 13 20:17:01.124281 extend-filesystems[1449]: Found sda4 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found sda6 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found sda7 Jan 13 20:17:01.124281 extend-filesystems[1449]: Found sda9 Jan 13 20:17:01.124281 extend-filesystems[1449]: Checking size of /dev/sda9 Jan 13 20:17:01.122802 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 13 20:17:01.140415 coreos-metadata[1446]: Jan 13 20:17:01.130 INFO Fetch successful Jan 13 20:17:01.140415 coreos-metadata[1446]: Jan 13 20:17:01.130 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jan 13 20:17:01.140415 coreos-metadata[1446]: Jan 13 20:17:01.131 INFO Fetch successful Jan 13 20:17:01.123922 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 13 20:17:01.123944 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Jan 13 20:17:01.141629 (ntainerd)[1478]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 13 20:17:01.163905 jq[1460]: true Jan 13 20:17:01.164382 update_engine[1459]: I20250113 20:17:01.163589 1459 main.cc:92] Flatcar Update Engine starting Jan 13 20:17:01.171269 update_engine[1459]: I20250113 20:17:01.171216 1459 update_check_scheduler.cc:74] Next update check in 5m59s Jan 13 20:17:01.172671 systemd[1]: motdgen.service: Deactivated successfully. Jan 13 20:17:01.172921 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 13 20:17:01.173848 systemd[1]: Started update-engine.service - Update Engine. Jan 13 20:17:01.175774 tar[1469]: linux-arm64/helm Jan 13 20:17:01.179944 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 13 20:17:01.202315 extend-filesystems[1449]: Resized partition /dev/sda9 Jan 13 20:17:01.213578 extend-filesystems[1494]: resize2fs 1.47.1 (20-May-2024) Jan 13 20:17:01.227631 jq[1484]: true Jan 13 20:17:01.240740 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Jan 13 20:17:01.314493 systemd-logind[1457]: New seat seat0. Jan 13 20:17:01.333257 systemd-logind[1457]: Watching system buttons on /dev/input/event0 (Power Button) Jan 13 20:17:01.333280 systemd-logind[1457]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 13 20:17:01.334082 systemd[1]: Started systemd-logind.service - User Login Management. Jan 13 20:17:01.349424 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1398) Jan 13 20:17:01.361750 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 13 20:17:01.363281 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Jan 13 20:17:01.423808 bash[1521]: Updated "/home/core/.ssh/authorized_keys" Jan 13 20:17:01.425823 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jan 13 20:17:01.426882 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 13 20:17:01.445230 systemd[1]: Starting sshkeys.service... Jan 13 20:17:01.462251 extend-filesystems[1494]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 13 20:17:01.462251 extend-filesystems[1494]: old_desc_blocks = 1, new_desc_blocks = 5 Jan 13 20:17:01.462251 extend-filesystems[1494]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jan 13 20:17:01.469352 extend-filesystems[1449]: Resized filesystem in /dev/sda9 Jan 13 20:17:01.469352 extend-filesystems[1449]: Found sr0 Jan 13 20:17:01.471098 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 13 20:17:01.471282 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 13 20:17:01.481241 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 13 20:17:01.490232 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 13 20:17:01.548521 coreos-metadata[1528]: Jan 13 20:17:01.548 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jan 13 20:17:01.552940 coreos-metadata[1528]: Jan 13 20:17:01.551 INFO Fetch successful Jan 13 20:17:01.559199 unknown[1528]: wrote ssh authorized keys file for user: core Jan 13 20:17:01.561024 containerd[1478]: time="2025-01-13T20:17:01.560900440Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Jan 13 20:17:01.616622 update-ssh-keys[1534]: Updated "/home/core/.ssh/authorized_keys" Jan 13 20:17:01.617420 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Jan 13 20:17:01.622357 systemd[1]: Finished sshkeys.service. Jan 13 20:17:01.636393 containerd[1478]: time="2025-01-13T20:17:01.636328520Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:17:01.639958 containerd[1478]: time="2025-01-13T20:17:01.639870880Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:17:01.639958 containerd[1478]: time="2025-01-13T20:17:01.639920480Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Jan 13 20:17:01.639958 containerd[1478]: time="2025-01-13T20:17:01.639940840Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 13 20:17:01.640148 containerd[1478]: time="2025-01-13T20:17:01.640137360Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 13 20:17:01.640170 containerd[1478]: time="2025-01-13T20:17:01.640157080Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Jan 13 20:17:01.640270 containerd[1478]: time="2025-01-13T20:17:01.640228480Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:17:01.640270 containerd[1478]: time="2025-01-13T20:17:01.640248840Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:17:01.640445 containerd[1478]: time="2025-01-13T20:17:01.640417720Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." 
error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:17:01.640445 containerd[1478]: time="2025-01-13T20:17:01.640437360Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Jan 13 20:17:01.640526 containerd[1478]: time="2025-01-13T20:17:01.640450840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:17:01.640526 containerd[1478]: time="2025-01-13T20:17:01.640460120Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Jan 13 20:17:01.640573 containerd[1478]: time="2025-01-13T20:17:01.640528560Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:17:01.642184 containerd[1478]: time="2025-01-13T20:17:01.642081160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 13 20:17:01.642277 containerd[1478]: time="2025-01-13T20:17:01.642251000Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 20:17:01.642277 containerd[1478]: time="2025-01-13T20:17:01.642276200Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Jan 13 20:17:01.642407 containerd[1478]: time="2025-01-13T20:17:01.642383920Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Jan 13 20:17:01.642826 containerd[1478]: time="2025-01-13T20:17:01.642461160Z" level=info msg="metadata content store policy set" policy=shared Jan 13 20:17:01.653435 containerd[1478]: time="2025-01-13T20:17:01.653373480Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 13 20:17:01.653435 containerd[1478]: time="2025-01-13T20:17:01.653450800Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 13 20:17:01.653565 containerd[1478]: time="2025-01-13T20:17:01.653470400Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Jan 13 20:17:01.653565 containerd[1478]: time="2025-01-13T20:17:01.653486960Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 13 20:17:01.653565 containerd[1478]: time="2025-01-13T20:17:01.653504480Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Jan 13 20:17:01.653722 containerd[1478]: time="2025-01-13T20:17:01.653692080Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Jan 13 20:17:01.654983 containerd[1478]: time="2025-01-13T20:17:01.654944200Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 13 20:17:01.655423 containerd[1478]: time="2025-01-13T20:17:01.655258520Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Jan 13 20:17:01.655423 containerd[1478]: time="2025-01-13T20:17:01.655287600Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Jan 13 20:17:01.655423 containerd[1478]: time="2025-01-13T20:17:01.655309440Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." 
type=io.containerd.sandbox.controller.v1 Jan 13 20:17:01.656108 containerd[1478]: time="2025-01-13T20:17:01.655736440Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Jan 13 20:17:01.656108 containerd[1478]: time="2025-01-13T20:17:01.655758320Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 13 20:17:01.656108 containerd[1478]: time="2025-01-13T20:17:01.655774960Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Jan 13 20:17:01.656108 containerd[1478]: time="2025-01-13T20:17:01.655790320Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 13 20:17:01.656108 containerd[1478]: time="2025-01-13T20:17:01.655813000Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 13 20:17:01.656108 containerd[1478]: time="2025-01-13T20:17:01.655838800Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 13 20:17:01.656108 containerd[1478]: time="2025-01-13T20:17:01.655853160Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 13 20:17:01.656108 containerd[1478]: time="2025-01-13T20:17:01.655864280Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Jan 13 20:17:01.657759 containerd[1478]: time="2025-01-13T20:17:01.657735960Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657759 containerd[1478]: time="2025-01-13T20:17:01.657761200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." 
type=io.containerd.grpc.v1 Jan 13 20:17:01.657831 containerd[1478]: time="2025-01-13T20:17:01.657775640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657831 containerd[1478]: time="2025-01-13T20:17:01.657804520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657831 containerd[1478]: time="2025-01-13T20:17:01.657817000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657831 containerd[1478]: time="2025-01-13T20:17:01.657829120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657918 containerd[1478]: time="2025-01-13T20:17:01.657843200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657918 containerd[1478]: time="2025-01-13T20:17:01.657855480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657918 containerd[1478]: time="2025-01-13T20:17:01.657875560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657918 containerd[1478]: time="2025-01-13T20:17:01.657892960Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.657918 containerd[1478]: time="2025-01-13T20:17:01.657904600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.658351 containerd[1478]: time="2025-01-13T20:17:01.658024760Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.658351 containerd[1478]: time="2025-01-13T20:17:01.658084520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." 
type=io.containerd.grpc.v1 Jan 13 20:17:01.658351 containerd[1478]: time="2025-01-13T20:17:01.658106320Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 13 20:17:01.658351 containerd[1478]: time="2025-01-13T20:17:01.658139720Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.658351 containerd[1478]: time="2025-01-13T20:17:01.658163200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.658351 containerd[1478]: time="2025-01-13T20:17:01.658175600Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 13 20:17:01.658471 containerd[1478]: time="2025-01-13T20:17:01.658376720Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 13 20:17:01.658471 containerd[1478]: time="2025-01-13T20:17:01.658405440Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 13 20:17:01.658471 containerd[1478]: time="2025-01-13T20:17:01.658428400Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 13 20:17:01.658471 containerd[1478]: time="2025-01-13T20:17:01.658440320Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 13 20:17:01.658471 containerd[1478]: time="2025-01-13T20:17:01.658448880Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.658471 containerd[1478]: time="2025-01-13T20:17:01.658468400Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." 
type=io.containerd.nri.v1 Jan 13 20:17:01.658574 containerd[1478]: time="2025-01-13T20:17:01.658481800Z" level=info msg="NRI interface is disabled by configuration." Jan 13 20:17:01.658574 containerd[1478]: time="2025-01-13T20:17:01.658501600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Jan 13 20:17:01.661278 containerd[1478]: time="2025-01-13T20:17:01.660963480Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true 
SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 13 20:17:01.661278 containerd[1478]: time="2025-01-13T20:17:01.661037920Z" level=info msg="Connect containerd service" Jan 13 20:17:01.661278 containerd[1478]: time="2025-01-13T20:17:01.661132480Z" level=info msg="using legacy CRI server" Jan 13 20:17:01.661278 containerd[1478]: time="2025-01-13T20:17:01.661142240Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 13 20:17:01.661680 containerd[1478]: time="2025-01-13T20:17:01.661419240Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 13 20:17:01.666279 containerd[1478]: time="2025-01-13T20:17:01.664328880Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 13 20:17:01.666279 containerd[1478]: time="2025-01-13T20:17:01.664885080Z" level=info msg="Start subscribing containerd event" Jan 13 
20:17:01.666279 containerd[1478]: time="2025-01-13T20:17:01.664949200Z" level=info msg="Start recovering state" Jan 13 20:17:01.666279 containerd[1478]: time="2025-01-13T20:17:01.665028600Z" level=info msg="Start event monitor" Jan 13 20:17:01.666279 containerd[1478]: time="2025-01-13T20:17:01.665080160Z" level=info msg="Start snapshots syncer" Jan 13 20:17:01.666279 containerd[1478]: time="2025-01-13T20:17:01.665096320Z" level=info msg="Start cni network conf syncer for default" Jan 13 20:17:01.666279 containerd[1478]: time="2025-01-13T20:17:01.665104600Z" level=info msg="Start streaming server" Jan 13 20:17:01.666925 containerd[1478]: time="2025-01-13T20:17:01.666885520Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 13 20:17:01.666964 containerd[1478]: time="2025-01-13T20:17:01.666956960Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 13 20:17:01.667166 systemd[1]: Started containerd.service - containerd container runtime. Jan 13 20:17:01.667248 containerd[1478]: time="2025-01-13T20:17:01.667231480Z" level=info msg="containerd successfully booted in 0.112192s" Jan 13 20:17:01.678639 locksmithd[1488]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 13 20:17:01.814851 systemd-networkd[1377]: eth0: Gained IPv6LL Jan 13 20:17:01.815577 systemd-timesyncd[1375]: Network configuration changed, trying to establish connection. Jan 13 20:17:01.821134 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 13 20:17:01.823597 systemd[1]: Reached target network-online.target - Network is Online. Jan 13 20:17:01.833980 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:17:01.837491 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 13 20:17:01.883154 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Jan 13 20:17:01.937150 tar[1469]: linux-arm64/LICENSE Jan 13 20:17:01.937150 tar[1469]: linux-arm64/README.md Jan 13 20:17:01.956495 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 13 20:17:02.092697 sshd_keygen[1487]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 13 20:17:02.127652 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 13 20:17:02.137235 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 13 20:17:02.146726 systemd[1]: issuegen.service: Deactivated successfully. Jan 13 20:17:02.147765 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 13 20:17:02.157535 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 13 20:17:02.169578 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 13 20:17:02.180288 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 13 20:17:02.191230 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 13 20:17:02.192186 systemd[1]: Reached target getty.target - Login Prompts. Jan 13 20:17:02.454968 systemd-networkd[1377]: eth1: Gained IPv6LL Jan 13 20:17:02.455632 systemd-timesyncd[1375]: Network configuration changed, trying to establish connection. Jan 13 20:17:02.653857 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:17:02.655114 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 13 20:17:02.657793 systemd[1]: Startup finished in 796ms (kernel) + 5.304s (initrd) + 4.693s (userspace) = 10.794s. 
Jan 13 20:17:02.662795 (kubelet)[1576]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:17:02.672782 agetty[1570]: failed to open credentials directory Jan 13 20:17:02.674313 agetty[1569]: failed to open credentials directory Jan 13 20:17:03.358439 kubelet[1576]: E0113 20:17:03.358343 1576 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:17:03.360443 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:17:03.360658 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:17:13.606522 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 13 20:17:13.615127 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:17:13.748344 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:17:13.760420 (kubelet)[1597]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:17:13.830811 kubelet[1597]: E0113 20:17:13.830219 1597 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:17:13.836477 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:17:13.837261 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 13 20:17:23.855760 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 13 20:17:23.863106 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:17:23.985852 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:17:23.997319 (kubelet)[1613]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:17:24.059773 kubelet[1613]: E0113 20:17:24.059576 1613 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:17:24.062495 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:17:24.062881 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:17:32.225546 systemd-timesyncd[1375]: Contacted time server 129.250.35.250:123 (2.flatcar.pool.ntp.org). Jan 13 20:17:32.225638 systemd-timesyncd[1375]: Initial clock synchronization to Mon 2025-01-13 20:17:32.225332 UTC. Jan 13 20:17:32.225700 systemd-resolved[1322]: Clock change detected. Flushing caches. Jan 13 20:17:33.676367 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 13 20:17:33.684003 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:17:33.842960 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:17:33.843096 (kubelet)[1630]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:17:33.899653 kubelet[1630]: E0113 20:17:33.899567 1630 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:17:33.903056 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:17:33.903375 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:17:43.925565 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 13 20:17:43.930885 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:17:44.067207 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:17:44.081569 (kubelet)[1646]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:17:44.131183 kubelet[1646]: E0113 20:17:44.131064 1646 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:17:44.134567 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:17:44.134755 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:17:46.441709 update_engine[1459]: I20250113 20:17:46.440761 1459 update_attempter.cc:509] Updating boot flags... 
Jan 13 20:17:46.491633 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1663) Jan 13 20:17:46.553943 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1664) Jan 13 20:17:54.175454 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 13 20:17:54.184254 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:17:54.323935 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:17:54.329709 (kubelet)[1680]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:17:54.386339 kubelet[1680]: E0113 20:17:54.386290 1680 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:17:54.389455 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:17:54.389616 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:18:04.425398 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Jan 13 20:18:04.432043 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:18:04.571835 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 20:18:04.583047 (kubelet)[1697]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:18:04.632386 kubelet[1697]: E0113 20:18:04.632323 1697 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:18:04.635177 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:18:04.635370 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:18:14.675711 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. Jan 13 20:18:14.680920 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:18:14.808884 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:18:14.833384 (kubelet)[1712]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:18:14.891744 kubelet[1712]: E0113 20:18:14.891685 1712 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:18:14.894462 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:18:14.894842 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:18:24.925430 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. Jan 13 20:18:24.935937 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 13 20:18:25.070965 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:18:25.082548 (kubelet)[1729]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:18:25.131948 kubelet[1729]: E0113 20:18:25.131655 1729 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:18:25.135420 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:18:25.136016 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:18:35.175815 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. Jan 13 20:18:35.188998 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 20:18:35.306462 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 20:18:35.320065 (kubelet)[1745]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 20:18:35.368804 kubelet[1745]: E0113 20:18:35.368698 1745 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 20:18:35.373081 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 20:18:35.373579 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 20:18:45.425402 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. 
Jan 13 20:18:45.440181 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:18:45.557614 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:18:45.563651 (kubelet)[1761]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:18:45.618436 kubelet[1761]: E0113 20:18:45.618332 1761 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:18:45.621705 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:18:45.624884 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:18:47.549016 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Jan 13 20:18:47.561956 systemd[1]: Started sshd@0-138.199.153.203:22-139.178.89.65:52124.service - OpenSSH per-connection server daemon (139.178.89.65:52124).
Jan 13 20:18:48.574263 sshd[1770]: Accepted publickey for core from 139.178.89.65 port 52124 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:18:48.578843 sshd-session[1770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:18:48.590058 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Jan 13 20:18:48.597272 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Jan 13 20:18:48.602988 systemd-logind[1457]: New session 1 of user core.
Jan 13 20:18:48.614235 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Jan 13 20:18:48.623068 systemd[1]: Starting user@500.service - User Manager for UID 500...
Jan 13 20:18:48.627468 (systemd)[1774]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Jan 13 20:18:48.741823 systemd[1774]: Queued start job for default target default.target.
Jan 13 20:18:48.755221 systemd[1774]: Created slice app.slice - User Application Slice.
Jan 13 20:18:48.755298 systemd[1774]: Reached target paths.target - Paths.
Jan 13 20:18:48.755325 systemd[1774]: Reached target timers.target - Timers.
Jan 13 20:18:48.759810 systemd[1774]: Starting dbus.socket - D-Bus User Message Bus Socket...
Jan 13 20:18:48.774075 systemd[1774]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Jan 13 20:18:48.774149 systemd[1774]: Reached target sockets.target - Sockets.
Jan 13 20:18:48.774161 systemd[1774]: Reached target basic.target - Basic System.
Jan 13 20:18:48.774209 systemd[1774]: Reached target default.target - Main User Target.
Jan 13 20:18:48.774236 systemd[1774]: Startup finished in 137ms.
Jan 13 20:18:48.774831 systemd[1]: Started user@500.service - User Manager for UID 500.
Jan 13 20:18:48.783021 systemd[1]: Started session-1.scope - Session 1 of User core.
Jan 13 20:18:49.485409 systemd[1]: Started sshd@1-138.199.153.203:22-139.178.89.65:52138.service - OpenSSH per-connection server daemon (139.178.89.65:52138).
Jan 13 20:18:50.472629 sshd[1785]: Accepted publickey for core from 139.178.89.65 port 52138 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:18:50.475674 sshd-session[1785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:18:50.482851 systemd-logind[1457]: New session 2 of user core.
Jan 13 20:18:50.490026 systemd[1]: Started session-2.scope - Session 2 of User core.
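Editor's note on the kubelet crash loop recorded above: the unit exits because /var/lib/kubelet/config.yaml does not exist yet (the file is normally written during node bootstrap, e.g. by kubeadm), so systemd keeps restarting it and bumping the restart counter. The cadence of those restarts can be pulled straight out of the journal lines; a minimal sketch (the sample entries are copied from this log, the helper name is ours):

```python
import re

# Matches systemd's restart notice, e.g.
# "... systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7."
RESTART_RE = re.compile(r"restart counter is at (\d+)\.")

def restart_counters(lines):
    """Return the restart counters found in journal lines, in order of appearance."""
    return [int(m.group(1)) for line in lines if (m := RESTART_RE.search(line))]

sample = [
    "Jan 13 20:18:14.675711 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.",
    "Jan 13 20:18:24.925430 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.",
]
print(restart_counters(sample))  # [7, 8]
```

A gap or reset in the returned sequence would indicate the unit was manually stopped or its start-limit logic intervened; here the counters climb monotonically, consistent with a steady 10-second restart interval.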
Jan 13 20:18:51.158681 sshd[1787]: Connection closed by 139.178.89.65 port 52138
Jan 13 20:18:51.159823 sshd-session[1785]: pam_unix(sshd:session): session closed for user core
Jan 13 20:18:51.164345 systemd[1]: sshd@1-138.199.153.203:22-139.178.89.65:52138.service: Deactivated successfully.
Jan 13 20:18:51.166477 systemd[1]: session-2.scope: Deactivated successfully.
Jan 13 20:18:51.168459 systemd-logind[1457]: Session 2 logged out. Waiting for processes to exit.
Jan 13 20:18:51.170053 systemd-logind[1457]: Removed session 2.
Jan 13 20:18:51.327779 systemd[1]: Started sshd@2-138.199.153.203:22-139.178.89.65:45726.service - OpenSSH per-connection server daemon (139.178.89.65:45726).
Jan 13 20:18:52.334099 sshd[1792]: Accepted publickey for core from 139.178.89.65 port 45726 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:18:52.335288 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:18:52.341749 systemd-logind[1457]: New session 3 of user core.
Jan 13 20:18:52.350943 systemd[1]: Started session-3.scope - Session 3 of User core.
Jan 13 20:18:53.011706 sshd[1794]: Connection closed by 139.178.89.65 port 45726
Jan 13 20:18:53.012614 sshd-session[1792]: pam_unix(sshd:session): session closed for user core
Jan 13 20:18:53.018850 systemd[1]: sshd@2-138.199.153.203:22-139.178.89.65:45726.service: Deactivated successfully.
Jan 13 20:18:53.020550 systemd[1]: session-3.scope: Deactivated successfully.
Jan 13 20:18:53.021955 systemd-logind[1457]: Session 3 logged out. Waiting for processes to exit.
Jan 13 20:18:53.024140 systemd-logind[1457]: Removed session 3.
Jan 13 20:18:53.190034 systemd[1]: Started sshd@3-138.199.153.203:22-139.178.89.65:45740.service - OpenSSH per-connection server daemon (139.178.89.65:45740).
Jan 13 20:18:54.201261 sshd[1799]: Accepted publickey for core from 139.178.89.65 port 45740 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:18:54.203624 sshd-session[1799]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:18:54.212134 systemd-logind[1457]: New session 4 of user core.
Jan 13 20:18:54.217929 systemd[1]: Started session-4.scope - Session 4 of User core.
Jan 13 20:18:54.889121 sshd[1801]: Connection closed by 139.178.89.65 port 45740
Jan 13 20:18:54.889821 sshd-session[1799]: pam_unix(sshd:session): session closed for user core
Jan 13 20:18:54.893341 systemd[1]: sshd@3-138.199.153.203:22-139.178.89.65:45740.service: Deactivated successfully.
Jan 13 20:18:54.896033 systemd[1]: session-4.scope: Deactivated successfully.
Jan 13 20:18:54.898866 systemd-logind[1457]: Session 4 logged out. Waiting for processes to exit.
Jan 13 20:18:54.900520 systemd-logind[1457]: Removed session 4.
Jan 13 20:18:55.066110 systemd[1]: Started sshd@4-138.199.153.203:22-139.178.89.65:45750.service - OpenSSH per-connection server daemon (139.178.89.65:45750).
Jan 13 20:18:55.675942 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11.
Jan 13 20:18:55.688093 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:18:55.803831 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:18:55.805197 (kubelet)[1816]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:18:55.858990 kubelet[1816]: E0113 20:18:55.858942 1816 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:18:55.862272 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:18:55.862398 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:18:56.063138 sshd[1806]: Accepted publickey for core from 139.178.89.65 port 45750 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:18:56.064729 sshd-session[1806]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:18:56.071249 systemd-logind[1457]: New session 5 of user core.
Jan 13 20:18:56.076909 systemd[1]: Started session-5.scope - Session 5 of User core.
Jan 13 20:18:56.600200 sudo[1825]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Jan 13 20:18:56.600515 sudo[1825]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 20:18:56.616227 sudo[1825]: pam_unix(sudo:session): session closed for user root
Jan 13 20:18:56.777671 sshd[1824]: Connection closed by 139.178.89.65 port 45750
Jan 13 20:18:56.778871 sshd-session[1806]: pam_unix(sshd:session): session closed for user core
Jan 13 20:18:56.784806 systemd[1]: sshd@4-138.199.153.203:22-139.178.89.65:45750.service: Deactivated successfully.
Jan 13 20:18:56.787147 systemd[1]: session-5.scope: Deactivated successfully.
Jan 13 20:18:56.788154 systemd-logind[1457]: Session 5 logged out. Waiting for processes to exit.
Jan 13 20:18:56.789433 systemd-logind[1457]: Removed session 5.
Jan 13 20:18:56.950435 systemd[1]: Started sshd@5-138.199.153.203:22-139.178.89.65:45764.service - OpenSSH per-connection server daemon (139.178.89.65:45764).
Jan 13 20:18:57.962512 sshd[1830]: Accepted publickey for core from 139.178.89.65 port 45764 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:18:57.964853 sshd-session[1830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:18:57.970495 systemd-logind[1457]: New session 6 of user core.
Jan 13 20:18:57.976930 systemd[1]: Started session-6.scope - Session 6 of User core.
Jan 13 20:18:58.489273 sudo[1834]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Jan 13 20:18:58.489993 sudo[1834]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 20:18:58.494316 sudo[1834]: pam_unix(sudo:session): session closed for user root
Jan 13 20:18:58.501893 sudo[1833]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Jan 13 20:18:58.502249 sudo[1833]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 20:18:58.527069 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 13 20:18:58.560085 augenrules[1856]: No rules
Jan 13 20:18:58.561561 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 13 20:18:58.562778 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 13 20:18:58.564235 sudo[1833]: pam_unix(sudo:session): session closed for user root
Jan 13 20:18:58.726776 sshd[1832]: Connection closed by 139.178.89.65 port 45764
Jan 13 20:18:58.727282 sshd-session[1830]: pam_unix(sshd:session): session closed for user core
Jan 13 20:18:58.731348 systemd[1]: sshd@5-138.199.153.203:22-139.178.89.65:45764.service: Deactivated successfully.
Jan 13 20:18:58.733578 systemd[1]: session-6.scope: Deactivated successfully.
Jan 13 20:18:58.736215 systemd-logind[1457]: Session 6 logged out. Waiting for processes to exit.
Jan 13 20:18:58.737664 systemd-logind[1457]: Removed session 6.
Jan 13 20:18:58.913747 systemd[1]: Started sshd@6-138.199.153.203:22-139.178.89.65:45776.service - OpenSSH per-connection server daemon (139.178.89.65:45776).
Jan 13 20:18:59.903179 sshd[1864]: Accepted publickey for core from 139.178.89.65 port 45776 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:18:59.905554 sshd-session[1864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:18:59.911072 systemd-logind[1457]: New session 7 of user core.
Jan 13 20:18:59.917950 systemd[1]: Started session-7.scope - Session 7 of User core.
Jan 13 20:19:00.427105 sudo[1867]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Jan 13 20:19:00.427394 sudo[1867]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 20:19:00.774404 (dockerd)[1885]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Jan 13 20:19:00.775128 systemd[1]: Starting docker.service - Docker Application Container Engine...
Jan 13 20:19:01.031323 dockerd[1885]: time="2025-01-13T20:19:01.030780270Z" level=info msg="Starting up"
Jan 13 20:19:01.150729 dockerd[1885]: time="2025-01-13T20:19:01.150640741Z" level=info msg="Loading containers: start."
Jan 13 20:19:01.345673 kernel: Initializing XFRM netlink socket
Jan 13 20:19:01.455911 systemd-networkd[1377]: docker0: Link UP
Jan 13 20:19:01.496783 dockerd[1885]: time="2025-01-13T20:19:01.496731831Z" level=info msg="Loading containers: done."
Jan 13 20:19:01.519096 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3139492719-merged.mount: Deactivated successfully.
Jan 13 20:19:01.523917 dockerd[1885]: time="2025-01-13T20:19:01.523828678Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Jan 13 20:19:01.524098 dockerd[1885]: time="2025-01-13T20:19:01.523969598Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1
Jan 13 20:19:01.524233 dockerd[1885]: time="2025-01-13T20:19:01.524183638Z" level=info msg="Daemon has completed initialization"
Jan 13 20:19:01.580653 dockerd[1885]: time="2025-01-13T20:19:01.579950052Z" level=info msg="API listen on /run/docker.sock"
Jan 13 20:19:01.581034 systemd[1]: Started docker.service - Docker Application Container Engine.
Jan 13 20:19:02.791705 containerd[1478]: time="2025-01-13T20:19:02.791354474Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.8\""
Jan 13 20:19:03.440213 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4290727400.mount: Deactivated successfully.
Jan 13 20:19:04.369189 containerd[1478]: time="2025-01-13T20:19:04.369127552Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:04.370544 containerd[1478]: time="2025-01-13T20:19:04.370297512Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.8: active requests=0, bytes read=29864102"
Jan 13 20:19:04.371535 containerd[1478]: time="2025-01-13T20:19:04.371458273Z" level=info msg="ImageCreate event name:\"sha256:8202e87ffef091fe4f11dd113ff6f2ab16c70279775d224ddd8aa95e2dd0b966\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:04.376461 containerd[1478]: time="2025-01-13T20:19:04.376309154Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:f0e1b3de0c2e98e6c6abd73edf9d3b8e4d44460656cde0ebb92e2d9206961fcb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:04.377279 containerd[1478]: time="2025-01-13T20:19:04.377126434Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.8\" with image id \"sha256:8202e87ffef091fe4f11dd113ff6f2ab16c70279775d224ddd8aa95e2dd0b966\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:f0e1b3de0c2e98e6c6abd73edf9d3b8e4d44460656cde0ebb92e2d9206961fcb\", size \"29860810\" in 1.58572216s"
Jan 13 20:19:04.377279 containerd[1478]: time="2025-01-13T20:19:04.377175994Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.8\" returns image reference \"sha256:8202e87ffef091fe4f11dd113ff6f2ab16c70279775d224ddd8aa95e2dd0b966\""
Jan 13 20:19:04.405235 containerd[1478]: time="2025-01-13T20:19:04.405188080Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.8\""
Jan 13 20:19:05.663736 containerd[1478]: time="2025-01-13T20:19:05.663635220Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:05.665542 containerd[1478]: time="2025-01-13T20:19:05.665474941Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.8: active requests=0, bytes read=26900714"
Jan 13 20:19:05.666284 containerd[1478]: time="2025-01-13T20:19:05.666053341Z" level=info msg="ImageCreate event name:\"sha256:4b2191aa4d4d6ca9fbd7704b35401bfa6b0b90de75db22c425053e97fd5c8338\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:05.669620 containerd[1478]: time="2025-01-13T20:19:05.669023301Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:124f66b7e877eb5a80a40503057299bb60e6a5f2130905f4e3293dabf194c397\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:05.670442 containerd[1478]: time="2025-01-13T20:19:05.670392902Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.8\" with image id \"sha256:4b2191aa4d4d6ca9fbd7704b35401bfa6b0b90de75db22c425053e97fd5c8338\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:124f66b7e877eb5a80a40503057299bb60e6a5f2130905f4e3293dabf194c397\", size \"28303015\" in 1.265157782s"
Jan 13 20:19:05.670559 containerd[1478]: time="2025-01-13T20:19:05.670541702Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.8\" returns image reference \"sha256:4b2191aa4d4d6ca9fbd7704b35401bfa6b0b90de75db22c425053e97fd5c8338\""
Jan 13 20:19:05.694690 containerd[1478]: time="2025-01-13T20:19:05.694648387Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.8\""
Jan 13 20:19:05.926904 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12.
Jan 13 20:19:05.941220 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:19:06.086248 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:19:06.092091 (kubelet)[2157]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:19:06.146896 kubelet[2157]: E0113 20:19:06.146214 2157 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:19:06.149890 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:19:06.150173 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:19:06.778166 containerd[1478]: time="2025-01-13T20:19:06.776819634Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:06.780102 containerd[1478]: time="2025-01-13T20:19:06.780046075Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.8: active requests=0, bytes read=16164352"
Jan 13 20:19:06.781634 containerd[1478]: time="2025-01-13T20:19:06.781562635Z" level=info msg="ImageCreate event name:\"sha256:d43326c1723208785a33cdc1507082792eb041ca0d789c103c90180e31f65ca8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:06.786652 containerd[1478]: time="2025-01-13T20:19:06.786582236Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c8bdeac2590c99c1a77e33995423ddb6633ff90a82a2aa455442e0a8079ef8c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:06.788444 containerd[1478]: time="2025-01-13T20:19:06.788379716Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.8\" with image id \"sha256:d43326c1723208785a33cdc1507082792eb041ca0d789c103c90180e31f65ca8\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c8bdeac2590c99c1a77e33995423ddb6633ff90a82a2aa455442e0a8079ef8c7\", size \"17566671\" in 1.093430889s"
Jan 13 20:19:06.788444 containerd[1478]: time="2025-01-13T20:19:06.788434516Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.8\" returns image reference \"sha256:d43326c1723208785a33cdc1507082792eb041ca0d789c103c90180e31f65ca8\""
Jan 13 20:19:06.814582 containerd[1478]: time="2025-01-13T20:19:06.814542601Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.8\""
Jan 13 20:19:07.764753 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4031359355.mount: Deactivated successfully.
Jan 13 20:19:08.134839 containerd[1478]: time="2025-01-13T20:19:08.134393795Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:08.142734 containerd[1478]: time="2025-01-13T20:19:08.142575436Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.8: active requests=0, bytes read=25662037"
Jan 13 20:19:08.174988 containerd[1478]: time="2025-01-13T20:19:08.174888761Z" level=info msg="ImageCreate event name:\"sha256:4612aebc0675831aedbbde7cd56b85db91f1fdcf05ef923072961538ec497adb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:08.182630 containerd[1478]: time="2025-01-13T20:19:08.181348642Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:f6d6be9417e22af78905000ac4fd134896bacd2188ea63c7cac8edd7a5d7e9b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:08.183025 containerd[1478]: time="2025-01-13T20:19:08.182971283Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.8\" with image id \"sha256:4612aebc0675831aedbbde7cd56b85db91f1fdcf05ef923072961538ec497adb\", repo tag \"registry.k8s.io/kube-proxy:v1.30.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:f6d6be9417e22af78905000ac4fd134896bacd2188ea63c7cac8edd7a5d7e9b5\", size \"25661030\" in 1.368384602s"
Jan 13 20:19:08.183096 containerd[1478]: time="2025-01-13T20:19:08.183023803Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.8\" returns image reference \"sha256:4612aebc0675831aedbbde7cd56b85db91f1fdcf05ef923072961538ec497adb\""
Jan 13 20:19:08.209076 containerd[1478]: time="2025-01-13T20:19:08.208902247Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Jan 13 20:19:08.811464 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2889869349.mount: Deactivated successfully.
Jan 13 20:19:09.495915 containerd[1478]: time="2025-01-13T20:19:09.495855450Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:09.499258 containerd[1478]: time="2025-01-13T20:19:09.498791886Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485461"
Jan 13 20:19:09.500307 containerd[1478]: time="2025-01-13T20:19:09.500263864Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:09.506751 containerd[1478]: time="2025-01-13T20:19:09.506642662Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:09.508624 containerd[1478]: time="2025-01-13T20:19:09.508239642Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.299286035s"
Jan 13 20:19:09.508624 containerd[1478]: time="2025-01-13T20:19:09.508296482Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\""
Jan 13 20:19:09.536123 containerd[1478]: time="2025-01-13T20:19:09.536036903Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Jan 13 20:19:10.084950 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3314829350.mount: Deactivated successfully.
Jan 13 20:19:10.092657 containerd[1478]: time="2025-01-13T20:19:10.091666410Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:10.092657 containerd[1478]: time="2025-01-13T20:19:10.092644221Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268841"
Jan 13 20:19:10.093986 containerd[1478]: time="2025-01-13T20:19:10.093703394Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:10.097910 containerd[1478]: time="2025-01-13T20:19:10.097863084Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:10.099002 containerd[1478]: time="2025-01-13T20:19:10.098864016Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 562.646791ms"
Jan 13 20:19:10.099002 containerd[1478]: time="2025-01-13T20:19:10.098903656Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\""
Jan 13 20:19:10.123689 containerd[1478]: time="2025-01-13T20:19:10.123550790Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Jan 13 20:19:10.742409 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3229843092.mount: Deactivated successfully.
Jan 13 20:19:12.288521 containerd[1478]: time="2025-01-13T20:19:12.288466140Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:12.291277 containerd[1478]: time="2025-01-13T20:19:12.291212091Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=66191552"
Jan 13 20:19:12.293202 containerd[1478]: time="2025-01-13T20:19:12.292722428Z" level=info msg="ImageCreate event name:\"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:12.297808 containerd[1478]: time="2025-01-13T20:19:12.296456911Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:19:12.297808 containerd[1478]: time="2025-01-13T20:19:12.297648124Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"66189079\" in 2.17376429s"
Jan 13 20:19:12.297808 containerd[1478]: time="2025-01-13T20:19:12.297691925Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\""
Jan 13 20:19:16.176138 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13.
Jan 13 20:19:16.187166 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:19:16.329806 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:19:16.336134 (kubelet)[2354]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 13 20:19:16.389466 kubelet[2354]: E0113 20:19:16.389390 2354 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 13 20:19:16.391844 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 13 20:19:16.391970 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 13 20:19:18.543252 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:19:18.560211 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:19:18.586569 systemd[1]: Reloading requested from client PID 2368 ('systemctl') (unit session-7.scope)...
Jan 13 20:19:18.586590 systemd[1]: Reloading...
Jan 13 20:19:18.709925 zram_generator::config[2406]: No configuration found.
Jan 13 20:19:18.826843 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 20:19:18.910777 systemd[1]: Reloading finished in 323 ms.
Jan 13 20:19:18.967274 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:19:18.971651 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:19:18.976170 systemd[1]: kubelet.service: Deactivated successfully.
Jan 13 20:19:18.976491 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:19:18.984022 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:19:19.089091 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:19:19.100450 (kubelet)[2457]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 13 20:19:19.154540 kubelet[2457]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 13 20:19:19.154540 kubelet[2457]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 13 20:19:19.154540 kubelet[2457]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 13 20:19:19.154933 kubelet[2457]: I0113 20:19:19.154673 2457 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 13 20:19:20.152354 kubelet[2457]: I0113 20:19:20.152290 2457 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Jan 13 20:19:20.152354 kubelet[2457]: I0113 20:19:20.152335 2457 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 13 20:19:20.152837 kubelet[2457]: I0113 20:19:20.152700 2457 server.go:927] "Client rotation is on, will bootstrap in background"
Jan 13 20:19:20.172560 kubelet[2457]: E0113 20:19:20.172414 2457 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://138.199.153.203:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 138.199.153.203:6443: connect: connection refused
Jan 13 20:19:20.172560 kubelet[2457]: I0113 20:19:20.172483 2457 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 13 20:19:20.184175 kubelet[2457]: I0113 20:19:20.183821 2457 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jan 13 20:19:20.187181 kubelet[2457]: I0113 20:19:20.186026 2457 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 13 20:19:20.187181 kubelet[2457]: I0113 20:19:20.186100 2457 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4186-1-0-7-a3f46aeb9c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Jan 13 20:19:20.187181 kubelet[2457]: I0113 20:19:20.186496 2457 topology_manager.go:138] "Creating topology manager with none policy" Jan
13 20:19:20.187181 kubelet[2457]: I0113 20:19:20.186516 2457 container_manager_linux.go:301] "Creating device plugin manager" Jan 13 20:19:20.187564 kubelet[2457]: I0113 20:19:20.186852 2457 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:19:20.188685 kubelet[2457]: I0113 20:19:20.188574 2457 kubelet.go:400] "Attempting to sync node with API server" Jan 13 20:19:20.188814 kubelet[2457]: I0113 20:19:20.188795 2457 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 20:19:20.189081 kubelet[2457]: I0113 20:19:20.189068 2457 kubelet.go:312] "Adding apiserver pod source" Jan 13 20:19:20.189254 kubelet[2457]: I0113 20:19:20.189239 2457 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 20:19:20.190713 kubelet[2457]: W0113 20:19:20.190652 2457 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://138.199.153.203:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186-1-0-7-a3f46aeb9c&limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:20.190888 kubelet[2457]: E0113 20:19:20.190869 2457 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://138.199.153.203:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186-1-0-7-a3f46aeb9c&limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:20.191606 kubelet[2457]: I0113 20:19:20.191563 2457 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 20:19:20.192273 kubelet[2457]: I0113 20:19:20.192253 2457 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 20:19:20.192517 kubelet[2457]: W0113 20:19:20.192497 2457 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. 
Recreating. Jan 13 20:19:20.193670 kubelet[2457]: I0113 20:19:20.193651 2457 server.go:1264] "Started kubelet" Jan 13 20:19:20.193932 kubelet[2457]: W0113 20:19:20.193893 2457 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://138.199.153.203:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:20.194015 kubelet[2457]: E0113 20:19:20.194004 2457 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://138.199.153.203:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:20.202818 kubelet[2457]: I0113 20:19:20.202751 2457 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 20:19:20.205142 kubelet[2457]: I0113 20:19:20.204332 2457 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 20:19:20.205142 kubelet[2457]: I0113 20:19:20.204800 2457 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 20:19:20.205910 kubelet[2457]: E0113 20:19:20.205436 2457 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://138.199.153.203:6443/api/v1/namespaces/default/events\": dial tcp 138.199.153.203:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4186-1-0-7-a3f46aeb9c.181a59ff7f654e06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4186-1-0-7-a3f46aeb9c,UID:ci-4186-1-0-7-a3f46aeb9c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4186-1-0-7-a3f46aeb9c,},FirstTimestamp:2025-01-13 20:19:20.193625606 +0000 UTC m=+1.085904256,LastTimestamp:2025-01-13 20:19:20.193625606 +0000 UTC 
m=+1.085904256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4186-1-0-7-a3f46aeb9c,}" Jan 13 20:19:20.208430 kubelet[2457]: I0113 20:19:20.206281 2457 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 20:19:20.210653 kubelet[2457]: I0113 20:19:20.207973 2457 server.go:455] "Adding debug handlers to kubelet server" Jan 13 20:19:20.211765 kubelet[2457]: I0113 20:19:20.211743 2457 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 13 20:19:20.212180 kubelet[2457]: I0113 20:19:20.212165 2457 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Jan 13 20:19:20.216397 kubelet[2457]: E0113 20:19:20.216053 2457 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.153.203:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186-1-0-7-a3f46aeb9c?timeout=10s\": dial tcp 138.199.153.203:6443: connect: connection refused" interval="200ms" Jan 13 20:19:20.216397 kubelet[2457]: I0113 20:19:20.216152 2457 reconciler.go:26] "Reconciler: start to sync state" Jan 13 20:19:20.216397 kubelet[2457]: I0113 20:19:20.216259 2457 factory.go:221] Registration of the systemd container factory successfully Jan 13 20:19:20.216397 kubelet[2457]: I0113 20:19:20.216330 2457 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 20:19:20.218633 kubelet[2457]: I0113 20:19:20.218500 2457 factory.go:221] Registration of the containerd container factory successfully Jan 13 20:19:20.225176 kubelet[2457]: W0113 20:19:20.225115 2457 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://138.199.153.203:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection 
refused Jan 13 20:19:20.225631 kubelet[2457]: E0113 20:19:20.225305 2457 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://138.199.153.203:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:20.233229 kubelet[2457]: I0113 20:19:20.233116 2457 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 20:19:20.234675 kubelet[2457]: I0113 20:19:20.234548 2457 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 13 20:19:20.234938 kubelet[2457]: I0113 20:19:20.234827 2457 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 20:19:20.234938 kubelet[2457]: I0113 20:19:20.234865 2457 kubelet.go:2337] "Starting kubelet main sync loop" Jan 13 20:19:20.234938 kubelet[2457]: E0113 20:19:20.234920 2457 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 20:19:20.242622 kubelet[2457]: E0113 20:19:20.242214 2457 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 20:19:20.242622 kubelet[2457]: W0113 20:19:20.242417 2457 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://138.199.153.203:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:20.242622 kubelet[2457]: E0113 20:19:20.242482 2457 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://138.199.153.203:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:20.247816 kubelet[2457]: I0113 20:19:20.247790 2457 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 20:19:20.247816 kubelet[2457]: I0113 20:19:20.247817 2457 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 20:19:20.247962 kubelet[2457]: I0113 20:19:20.247848 2457 state_mem.go:36] "Initialized new in-memory state store" Jan 13 20:19:20.250914 kubelet[2457]: I0113 20:19:20.250864 2457 policy_none.go:49] "None policy: Start" Jan 13 20:19:20.251817 kubelet[2457]: I0113 20:19:20.251792 2457 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 20:19:20.251899 kubelet[2457]: I0113 20:19:20.251825 2457 state_mem.go:35] "Initializing new in-memory state store" Jan 13 20:19:20.260316 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 13 20:19:20.279673 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 13 20:19:20.285322 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 13 20:19:20.297095 kubelet[2457]: I0113 20:19:20.297017 2457 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 20:19:20.297485 kubelet[2457]: I0113 20:19:20.297391 2457 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 13 20:19:20.297770 kubelet[2457]: I0113 20:19:20.297631 2457 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 20:19:20.303204 kubelet[2457]: E0113 20:19:20.303120 2457 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4186-1-0-7-a3f46aeb9c\" not found" Jan 13 20:19:20.316110 kubelet[2457]: I0113 20:19:20.316044 2457 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.316808 kubelet[2457]: E0113 20:19:20.316734 2457 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://138.199.153.203:6443/api/v1/nodes\": dial tcp 138.199.153.203:6443: connect: connection refused" node="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.336143 kubelet[2457]: I0113 20:19:20.335987 2457 topology_manager.go:215] "Topology Admit Handler" podUID="a26519b8774deb1c202ffcb1283bc33b" podNamespace="kube-system" podName="kube-apiserver-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.338493 kubelet[2457]: I0113 20:19:20.338256 2457 topology_manager.go:215] "Topology Admit Handler" podUID="6aeff1fe9ee461abe884fe1ae723e26d" podNamespace="kube-system" podName="kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.340532 kubelet[2457]: I0113 20:19:20.340168 2457 topology_manager.go:215] "Topology Admit Handler" podUID="75de2cd48fa3286daf8b3eabf2bc48a9" podNamespace="kube-system" podName="kube-scheduler-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.347938 systemd[1]: Created slice kubepods-burstable-poda26519b8774deb1c202ffcb1283bc33b.slice - libcontainer container 
kubepods-burstable-poda26519b8774deb1c202ffcb1283bc33b.slice. Jan 13 20:19:20.366967 systemd[1]: Created slice kubepods-burstable-pod75de2cd48fa3286daf8b3eabf2bc48a9.slice - libcontainer container kubepods-burstable-pod75de2cd48fa3286daf8b3eabf2bc48a9.slice. Jan 13 20:19:20.374866 systemd[1]: Created slice kubepods-burstable-pod6aeff1fe9ee461abe884fe1ae723e26d.slice - libcontainer container kubepods-burstable-pod6aeff1fe9ee461abe884fe1ae723e26d.slice. Jan 13 20:19:20.418007 kubelet[2457]: I0113 20:19:20.416618 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-flexvolume-dir\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418007 kubelet[2457]: I0113 20:19:20.417505 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/75de2cd48fa3286daf8b3eabf2bc48a9-kubeconfig\") pod \"kube-scheduler-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"75de2cd48fa3286daf8b3eabf2bc48a9\") " pod="kube-system/kube-scheduler-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418007 kubelet[2457]: I0113 20:19:20.417547 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-kubeconfig\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418007 kubelet[2457]: I0113 20:19:20.417587 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418007 kubelet[2457]: I0113 20:19:20.417703 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a26519b8774deb1c202ffcb1283bc33b-ca-certs\") pod \"kube-apiserver-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"a26519b8774deb1c202ffcb1283bc33b\") " pod="kube-system/kube-apiserver-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418301 kubelet[2457]: I0113 20:19:20.417741 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a26519b8774deb1c202ffcb1283bc33b-k8s-certs\") pod \"kube-apiserver-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"a26519b8774deb1c202ffcb1283bc33b\") " pod="kube-system/kube-apiserver-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418301 kubelet[2457]: I0113 20:19:20.417765 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a26519b8774deb1c202ffcb1283bc33b-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"a26519b8774deb1c202ffcb1283bc33b\") " pod="kube-system/kube-apiserver-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418301 kubelet[2457]: I0113 20:19:20.417817 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-ca-certs\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418301 kubelet[2457]: I0113 
20:19:20.417841 2457 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-k8s-certs\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.418301 kubelet[2457]: E0113 20:19:20.416878 2457 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.153.203:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186-1-0-7-a3f46aeb9c?timeout=10s\": dial tcp 138.199.153.203:6443: connect: connection refused" interval="400ms" Jan 13 20:19:20.521114 kubelet[2457]: I0113 20:19:20.521013 2457 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.521870 kubelet[2457]: E0113 20:19:20.521830 2457 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://138.199.153.203:6443/api/v1/nodes\": dial tcp 138.199.153.203:6443: connect: connection refused" node="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.664065 containerd[1478]: time="2025-01-13T20:19:20.663541022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4186-1-0-7-a3f46aeb9c,Uid:a26519b8774deb1c202ffcb1283bc33b,Namespace:kube-system,Attempt:0,}" Jan 13 20:19:20.673767 containerd[1478]: time="2025-01-13T20:19:20.673421191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4186-1-0-7-a3f46aeb9c,Uid:75de2cd48fa3286daf8b3eabf2bc48a9,Namespace:kube-system,Attempt:0,}" Jan 13 20:19:20.679409 containerd[1478]: time="2025-01-13T20:19:20.678885041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c,Uid:6aeff1fe9ee461abe884fe1ae723e26d,Namespace:kube-system,Attempt:0,}" Jan 13 20:19:20.819058 kubelet[2457]: E0113 20:19:20.818980 2457 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.153.203:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186-1-0-7-a3f46aeb9c?timeout=10s\": dial tcp 138.199.153.203:6443: connect: connection refused" interval="800ms" Jan 13 20:19:20.925824 kubelet[2457]: I0113 20:19:20.925132 2457 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:20.926139 kubelet[2457]: E0113 20:19:20.925960 2457 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://138.199.153.203:6443/api/v1/nodes\": dial tcp 138.199.153.203:6443: connect: connection refused" node="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:19:21.124255 kubelet[2457]: W0113 20:19:21.124158 2457 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://138.199.153.203:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:21.124255 kubelet[2457]: E0113 20:19:21.124230 2457 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://138.199.153.203:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:21.208107 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4166335401.mount: Deactivated successfully. 
Jan 13 20:19:21.222731 containerd[1478]: time="2025-01-13T20:19:21.222589471Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:19:21.223694 containerd[1478]: time="2025-01-13T20:19:21.223643800Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193" Jan 13 20:19:21.225532 containerd[1478]: time="2025-01-13T20:19:21.225445936Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:19:21.230508 containerd[1478]: time="2025-01-13T20:19:21.230425860Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:19:21.234737 containerd[1478]: time="2025-01-13T20:19:21.234663578Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:19:21.236774 containerd[1478]: time="2025-01-13T20:19:21.236690235Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 562.609278ms" Jan 13 20:19:21.242640 containerd[1478]: time="2025-01-13T20:19:21.242394566Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:19:21.246264 containerd[1478]: 
time="2025-01-13T20:19:21.245987277Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 20:19:21.247118 containerd[1478]: time="2025-01-13T20:19:21.247030807Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 20:19:21.250574 containerd[1478]: time="2025-01-13T20:19:21.250464757Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 586.759854ms" Jan 13 20:19:21.280589 containerd[1478]: time="2025-01-13T20:19:21.279261851Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 600.280169ms" Jan 13 20:19:21.374315 containerd[1478]: time="2025-01-13T20:19:21.374167247Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:19:21.374315 containerd[1478]: time="2025-01-13T20:19:21.374259448Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:19:21.374315 containerd[1478]: time="2025-01-13T20:19:21.374275528Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:19:21.374676 containerd[1478]: time="2025-01-13T20:19:21.374622651Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:19:21.378174 containerd[1478]: time="2025-01-13T20:19:21.376529828Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:19:21.378174 containerd[1478]: time="2025-01-13T20:19:21.377977761Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:19:21.378174 containerd[1478]: time="2025-01-13T20:19:21.378011801Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:19:21.378174 containerd[1478]: time="2025-01-13T20:19:21.378122682Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:19:21.381658 containerd[1478]: time="2025-01-13T20:19:21.381317630Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:19:21.381658 containerd[1478]: time="2025-01-13T20:19:21.381394031Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:19:21.381658 containerd[1478]: time="2025-01-13T20:19:21.381410591Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:19:21.381658 containerd[1478]: time="2025-01-13T20:19:21.381523032Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:19:21.409859 systemd[1]: Started cri-containerd-acc4907c1c6f2120d7538898208329c6468dc642c044d068b9deb43580500efa.scope - libcontainer container acc4907c1c6f2120d7538898208329c6468dc642c044d068b9deb43580500efa. 
Jan 13 20:19:21.417340 systemd[1]: Started cri-containerd-69ac0d500ad38637117ba17350f7fca996956941613199042e2f35cfde8e9c77.scope - libcontainer container 69ac0d500ad38637117ba17350f7fca996956941613199042e2f35cfde8e9c77. Jan 13 20:19:21.426432 systemd[1]: Started cri-containerd-8d14acc280406613d063aa18dd33ab40af07eaef2250f28d026c52bc3dd230bc.scope - libcontainer container 8d14acc280406613d063aa18dd33ab40af07eaef2250f28d026c52bc3dd230bc. Jan 13 20:19:21.494878 containerd[1478]: time="2025-01-13T20:19:21.494188265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4186-1-0-7-a3f46aeb9c,Uid:75de2cd48fa3286daf8b3eabf2bc48a9,Namespace:kube-system,Attempt:0,} returns sandbox id \"8d14acc280406613d063aa18dd33ab40af07eaef2250f28d026c52bc3dd230bc\"" Jan 13 20:19:21.497385 containerd[1478]: time="2025-01-13T20:19:21.497339213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4186-1-0-7-a3f46aeb9c,Uid:a26519b8774deb1c202ffcb1283bc33b,Namespace:kube-system,Attempt:0,} returns sandbox id \"acc4907c1c6f2120d7538898208329c6468dc642c044d068b9deb43580500efa\"" Jan 13 20:19:21.501579 containerd[1478]: time="2025-01-13T20:19:21.501533770Z" level=info msg="CreateContainer within sandbox \"8d14acc280406613d063aa18dd33ab40af07eaef2250f28d026c52bc3dd230bc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 13 20:19:21.504970 containerd[1478]: time="2025-01-13T20:19:21.504931840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c,Uid:6aeff1fe9ee461abe884fe1ae723e26d,Namespace:kube-system,Attempt:0,} returns sandbox id \"69ac0d500ad38637117ba17350f7fca996956941613199042e2f35cfde8e9c77\"" Jan 13 20:19:21.506142 containerd[1478]: time="2025-01-13T20:19:21.506099530Z" level=info msg="CreateContainer within sandbox \"acc4907c1c6f2120d7538898208329c6468dc642c044d068b9deb43580500efa\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 13 
20:19:21.510992 containerd[1478]: time="2025-01-13T20:19:21.510927933Z" level=info msg="CreateContainer within sandbox \"69ac0d500ad38637117ba17350f7fca996956941613199042e2f35cfde8e9c77\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 13 20:19:21.529849 containerd[1478]: time="2025-01-13T20:19:21.529662938Z" level=info msg="CreateContainer within sandbox \"acc4907c1c6f2120d7538898208329c6468dc642c044d068b9deb43580500efa\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e098aa1d2184381d37b00125d41527236a29cd6d00d7bb799a251be696230663\"" Jan 13 20:19:21.531384 containerd[1478]: time="2025-01-13T20:19:21.530951629Z" level=info msg="StartContainer for \"e098aa1d2184381d37b00125d41527236a29cd6d00d7bb799a251be696230663\"" Jan 13 20:19:21.537297 containerd[1478]: time="2025-01-13T20:19:21.537224005Z" level=info msg="CreateContainer within sandbox \"69ac0d500ad38637117ba17350f7fca996956941613199042e2f35cfde8e9c77\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a\"" Jan 13 20:19:21.538111 containerd[1478]: time="2025-01-13T20:19:21.537930051Z" level=info msg="StartContainer for \"f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a\"" Jan 13 20:19:21.538656 containerd[1478]: time="2025-01-13T20:19:21.538582297Z" level=info msg="CreateContainer within sandbox \"8d14acc280406613d063aa18dd33ab40af07eaef2250f28d026c52bc3dd230bc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015\"" Jan 13 20:19:21.539635 containerd[1478]: time="2025-01-13T20:19:21.539227902Z" level=info msg="StartContainer for \"e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015\"" Jan 13 20:19:21.571847 systemd[1]: Started cri-containerd-e098aa1d2184381d37b00125d41527236a29cd6d00d7bb799a251be696230663.scope - libcontainer 
container e098aa1d2184381d37b00125d41527236a29cd6d00d7bb799a251be696230663. Jan 13 20:19:21.596759 kubelet[2457]: W0113 20:19:21.595947 2457 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://138.199.153.203:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186-1-0-7-a3f46aeb9c&limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:21.596759 kubelet[2457]: E0113 20:19:21.596025 2457 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://138.199.153.203:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186-1-0-7-a3f46aeb9c&limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused Jan 13 20:19:21.600840 systemd[1]: Started cri-containerd-e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015.scope - libcontainer container e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015. Jan 13 20:19:21.612868 systemd[1]: Started cri-containerd-f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a.scope - libcontainer container f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a. 
Jan 13 20:19:21.620672 kubelet[2457]: E0113 20:19:21.620320 2457 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.153.203:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186-1-0-7-a3f46aeb9c?timeout=10s\": dial tcp 138.199.153.203:6443: connect: connection refused" interval="1.6s"
Jan 13 20:19:21.638314 containerd[1478]: time="2025-01-13T20:19:21.638148334Z" level=info msg="StartContainer for \"e098aa1d2184381d37b00125d41527236a29cd6d00d7bb799a251be696230663\" returns successfully"
Jan 13 20:19:21.677535 containerd[1478]: time="2025-01-13T20:19:21.676743315Z" level=info msg="StartContainer for \"e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015\" returns successfully"
Jan 13 20:19:21.700100 containerd[1478]: time="2025-01-13T20:19:21.699566436Z" level=info msg="StartContainer for \"f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a\" returns successfully"
Jan 13 20:19:21.718625 kubelet[2457]: W0113 20:19:21.717914 2457 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://138.199.153.203:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused
Jan 13 20:19:21.718625 kubelet[2457]: E0113 20:19:21.717990 2457 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://138.199.153.203:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused
Jan 13 20:19:21.730020 kubelet[2457]: I0113 20:19:21.729630 2457 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:21.730020 kubelet[2457]: E0113 20:19:21.729964 2457 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://138.199.153.203:6443/api/v1/nodes\": dial tcp 138.199.153.203:6443: connect: connection refused" node="ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:21.764373 kubelet[2457]: W0113 20:19:21.764166 2457 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://138.199.153.203:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused
Jan 13 20:19:21.764373 kubelet[2457]: E0113 20:19:21.764247 2457 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://138.199.153.203:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 138.199.153.203:6443: connect: connection refused
Jan 13 20:19:23.332655 kubelet[2457]: I0113 20:19:23.331832 2457 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:23.937758 kubelet[2457]: E0113 20:19:23.937644 2457 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4186-1-0-7-a3f46aeb9c\" not found" node="ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:24.039871 kubelet[2457]: I0113 20:19:24.039660 2457 kubelet_node_status.go:76] "Successfully registered node" node="ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:24.104995 kubelet[2457]: E0113 20:19:24.104947 2457 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4186-1-0-7-a3f46aeb9c\" not found"
Jan 13 20:19:24.192809 kubelet[2457]: I0113 20:19:24.192426 2457 apiserver.go:52] "Watching apiserver"
Jan 13 20:19:24.212752 kubelet[2457]: I0113 20:19:24.212700 2457 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Jan 13 20:19:24.711760 kubelet[2457]: E0113 20:19:24.711354 2457 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4186-1-0-7-a3f46aeb9c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:26.543775 systemd[1]: Reloading requested from client PID 2732 ('systemctl') (unit session-7.scope)...
Jan 13 20:19:26.544121 systemd[1]: Reloading...
Jan 13 20:19:26.646678 zram_generator::config[2775]: No configuration found.
Jan 13 20:19:26.765543 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 20:19:26.864962 systemd[1]: Reloading finished in 320 ms.
Jan 13 20:19:26.907876 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:19:26.922343 systemd[1]: kubelet.service: Deactivated successfully.
Jan 13 20:19:26.922891 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:19:26.923079 systemd[1]: kubelet.service: Consumed 1.542s CPU time, 112.8M memory peak, 0B memory swap peak.
Jan 13 20:19:26.930976 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 13 20:19:27.071229 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 13 20:19:27.083037 (kubelet)[2817]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 13 20:19:27.142652 kubelet[2817]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 13 20:19:27.142652 kubelet[2817]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 13 20:19:27.142652 kubelet[2817]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 13 20:19:27.142981 kubelet[2817]: I0113 20:19:27.142673 2817 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 13 20:19:27.147676 kubelet[2817]: I0113 20:19:27.147634 2817 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Jan 13 20:19:27.147676 kubelet[2817]: I0113 20:19:27.147665 2817 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 13 20:19:27.147890 kubelet[2817]: I0113 20:19:27.147872 2817 server.go:927] "Client rotation is on, will bootstrap in background"
Jan 13 20:19:27.149740 kubelet[2817]: I0113 20:19:27.149577 2817 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 13 20:19:27.151323 kubelet[2817]: I0113 20:19:27.151272 2817 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 13 20:19:27.161588 kubelet[2817]: I0113 20:19:27.161526 2817 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jan 13 20:19:27.162311 kubelet[2817]: I0113 20:19:27.162192 2817 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 13 20:19:27.162539 kubelet[2817]: I0113 20:19:27.162232 2817 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4186-1-0-7-a3f46aeb9c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Jan 13 20:19:27.162539 kubelet[2817]: I0113 20:19:27.162452 2817 topology_manager.go:138] "Creating topology manager with none policy"
Jan 13 20:19:27.162539 kubelet[2817]: I0113 20:19:27.162462 2817 container_manager_linux.go:301] "Creating device plugin manager"
Jan 13 20:19:27.162539 kubelet[2817]: I0113 20:19:27.162514 2817 state_mem.go:36] "Initialized new in-memory state store"
Jan 13 20:19:27.162734 kubelet[2817]: I0113 20:19:27.162652 2817 kubelet.go:400] "Attempting to sync node with API server"
Jan 13 20:19:27.162734 kubelet[2817]: I0113 20:19:27.162666 2817 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 13 20:19:27.162734 kubelet[2817]: I0113 20:19:27.162693 2817 kubelet.go:312] "Adding apiserver pod source"
Jan 13 20:19:27.162734 kubelet[2817]: I0113 20:19:27.162709 2817 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 13 20:19:27.163657 kubelet[2817]: I0113 20:19:27.163461 2817 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Jan 13 20:19:27.163982 kubelet[2817]: I0113 20:19:27.163964 2817 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 13 20:19:27.165138 kubelet[2817]: I0113 20:19:27.164723 2817 server.go:1264] "Started kubelet"
Jan 13 20:19:27.168587 kubelet[2817]: I0113 20:19:27.168554 2817 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 13 20:19:27.172981 kubelet[2817]: I0113 20:19:27.172926 2817 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jan 13 20:19:27.175665 kubelet[2817]: I0113 20:19:27.175227 2817 server.go:455] "Adding debug handlers to kubelet server"
Jan 13 20:19:27.176649 kubelet[2817]: I0113 20:19:27.176254 2817 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 13 20:19:27.176649 kubelet[2817]: I0113 20:19:27.176504 2817 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 13 20:19:27.187234 kubelet[2817]: I0113 20:19:27.184968 2817 volume_manager.go:291] "Starting Kubelet Volume Manager"
Jan 13 20:19:27.194767 kubelet[2817]: I0113 20:19:27.194732 2817 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Jan 13 20:19:27.195082 kubelet[2817]: I0113 20:19:27.195068 2817 reconciler.go:26] "Reconciler: start to sync state"
Jan 13 20:19:27.201935 kubelet[2817]: I0113 20:19:27.201887 2817 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 13 20:19:27.204087 kubelet[2817]: I0113 20:19:27.204048 2817 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 13 20:19:27.204268 kubelet[2817]: I0113 20:19:27.204255 2817 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 13 20:19:27.204343 kubelet[2817]: I0113 20:19:27.204335 2817 kubelet.go:2337] "Starting kubelet main sync loop"
Jan 13 20:19:27.204635 kubelet[2817]: E0113 20:19:27.204438 2817 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 13 20:19:27.214740 kubelet[2817]: I0113 20:19:27.214695 2817 factory.go:221] Registration of the systemd container factory successfully
Jan 13 20:19:27.214871 kubelet[2817]: I0113 20:19:27.214800 2817 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 13 20:19:27.222636 kubelet[2817]: I0113 20:19:27.219378 2817 factory.go:221] Registration of the containerd container factory successfully
Jan 13 20:19:27.289176 kubelet[2817]: I0113 20:19:27.289141 2817 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.294705 kubelet[2817]: I0113 20:19:27.294668 2817 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jan 13 20:19:27.295114 kubelet[2817]: I0113 20:19:27.295089 2817 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jan 13 20:19:27.295377 kubelet[2817]: I0113 20:19:27.295281 2817 state_mem.go:36] "Initialized new in-memory state store"
Jan 13 20:19:27.296075 kubelet[2817]: I0113 20:19:27.296045 2817 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jan 13 20:19:27.296384 kubelet[2817]: I0113 20:19:27.296296 2817 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jan 13 20:19:27.296542 kubelet[2817]: I0113 20:19:27.296523 2817 policy_none.go:49] "None policy: Start"
Jan 13 20:19:27.300629 kubelet[2817]: I0113 20:19:27.298854 2817 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 13 20:19:27.300629 kubelet[2817]: I0113 20:19:27.298915 2817 state_mem.go:35] "Initializing new in-memory state store"
Jan 13 20:19:27.300629 kubelet[2817]: I0113 20:19:27.299187 2817 state_mem.go:75] "Updated machine memory state"
Jan 13 20:19:27.305349 kubelet[2817]: E0113 20:19:27.305296 2817 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Jan 13 20:19:27.306179 kubelet[2817]: I0113 20:19:27.306147 2817 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 13 20:19:27.310382 kubelet[2817]: I0113 20:19:27.310301 2817 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 13 20:19:27.312213 kubelet[2817]: I0113 20:19:27.311839 2817 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 13 20:19:27.320302 kubelet[2817]: I0113 20:19:27.317767 2817 kubelet_node_status.go:112] "Node was previously registered" node="ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.320302 kubelet[2817]: I0113 20:19:27.317856 2817 kubelet_node_status.go:76] "Successfully registered node" node="ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.506649 kubelet[2817]: I0113 20:19:27.506357 2817 topology_manager.go:215] "Topology Admit Handler" podUID="75de2cd48fa3286daf8b3eabf2bc48a9" podNamespace="kube-system" podName="kube-scheduler-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.506649 kubelet[2817]: I0113 20:19:27.506570 2817 topology_manager.go:215] "Topology Admit Handler" podUID="a26519b8774deb1c202ffcb1283bc33b" podNamespace="kube-system" podName="kube-apiserver-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.506840 kubelet[2817]: I0113 20:19:27.506671 2817 topology_manager.go:215] "Topology Admit Handler" podUID="6aeff1fe9ee461abe884fe1ae723e26d" podNamespace="kube-system" podName="kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.519673 kubelet[2817]: E0113 20:19:27.519516 2817 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" already exists" pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.522238 kubelet[2817]: E0113 20:19:27.520623 2817 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4186-1-0-7-a3f46aeb9c\" already exists" pod="kube-system/kube-apiserver-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599493 kubelet[2817]: I0113 20:19:27.598039 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-ca-certs\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599493 kubelet[2817]: I0113 20:19:27.599454 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-flexvolume-dir\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599714 kubelet[2817]: I0113 20:19:27.599519 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a26519b8774deb1c202ffcb1283bc33b-ca-certs\") pod \"kube-apiserver-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"a26519b8774deb1c202ffcb1283bc33b\") " pod="kube-system/kube-apiserver-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599714 kubelet[2817]: I0113 20:19:27.599541 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a26519b8774deb1c202ffcb1283bc33b-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"a26519b8774deb1c202ffcb1283bc33b\") " pod="kube-system/kube-apiserver-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599714 kubelet[2817]: I0113 20:19:27.599583 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-k8s-certs\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599714 kubelet[2817]: I0113 20:19:27.599626 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-kubeconfig\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599714 kubelet[2817]: I0113 20:19:27.599647 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6aeff1fe9ee461abe884fe1ae723e26d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"6aeff1fe9ee461abe884fe1ae723e26d\") " pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599833 kubelet[2817]: I0113 20:19:27.599667 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/75de2cd48fa3286daf8b3eabf2bc48a9-kubeconfig\") pod \"kube-scheduler-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"75de2cd48fa3286daf8b3eabf2bc48a9\") " pod="kube-system/kube-scheduler-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:27.599833 kubelet[2817]: I0113 20:19:27.599795 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a26519b8774deb1c202ffcb1283bc33b-k8s-certs\") pod \"kube-apiserver-ci-4186-1-0-7-a3f46aeb9c\" (UID: \"a26519b8774deb1c202ffcb1283bc33b\") " pod="kube-system/kube-apiserver-ci-4186-1-0-7-a3f46aeb9c"
Jan 13 20:19:28.169427 kubelet[2817]: I0113 20:19:28.169186 2817 apiserver.go:52] "Watching apiserver"
Jan 13 20:19:28.195539 kubelet[2817]: I0113 20:19:28.195265 2817 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Jan 13 20:19:28.376766 kubelet[2817]: I0113 20:19:28.376621 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4186-1-0-7-a3f46aeb9c" podStartSLOduration=3.376590496 podStartE2EDuration="3.376590496s" podCreationTimestamp="2025-01-13 20:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:19:28.354237892 +0000 UTC m=+1.266897744" watchObservedRunningTime="2025-01-13 20:19:28.376590496 +0000 UTC m=+1.289250348"
Jan 13 20:19:28.404333 kubelet[2817]: I0113 20:19:28.403966 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4186-1-0-7-a3f46aeb9c" podStartSLOduration=1.403950056 podStartE2EDuration="1.403950056s" podCreationTimestamp="2025-01-13 20:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:19:28.376919178 +0000 UTC m=+1.289579030" watchObservedRunningTime="2025-01-13 20:19:28.403950056 +0000 UTC m=+1.316609868"
Jan 13 20:19:28.441918 kubelet[2817]: I0113 20:19:28.441748 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4186-1-0-7-a3f46aeb9c" podStartSLOduration=2.441728893 podStartE2EDuration="2.441728893s" podCreationTimestamp="2025-01-13 20:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:19:28.404205538 +0000 UTC m=+1.316865390" watchObservedRunningTime="2025-01-13 20:19:28.441728893 +0000 UTC m=+1.354388745"
Jan 13 20:19:32.662446 sudo[1867]: pam_unix(sudo:session): session closed for user root
Jan 13 20:19:32.823629 sshd[1866]: Connection closed by 139.178.89.65 port 45776
Jan 13 20:19:32.822719 sshd-session[1864]: pam_unix(sshd:session): session closed for user core
Jan 13 20:19:32.829182 systemd[1]: sshd@6-138.199.153.203:22-139.178.89.65:45776.service: Deactivated successfully.
Jan 13 20:19:32.832257 systemd[1]: session-7.scope: Deactivated successfully.
Jan 13 20:19:32.832579 systemd[1]: session-7.scope: Consumed 8.103s CPU time, 189.6M memory peak, 0B memory swap peak.
Jan 13 20:19:32.833629 systemd-logind[1457]: Session 7 logged out. Waiting for processes to exit.
Jan 13 20:19:32.835489 systemd-logind[1457]: Removed session 7.
Jan 13 20:19:43.897051 kubelet[2817]: I0113 20:19:43.896940 2817 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jan 13 20:19:43.898243 kubelet[2817]: I0113 20:19:43.898146 2817 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Jan 13 20:19:43.898342 containerd[1478]: time="2025-01-13T20:19:43.897825485Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jan 13 20:19:44.862190 kubelet[2817]: I0113 20:19:44.862140 2817 topology_manager.go:215] "Topology Admit Handler" podUID="9594e33d-57a9-4ef5-9124-467a8f9104de" podNamespace="kube-system" podName="kube-proxy-gxgjv"
Jan 13 20:19:44.888933 systemd[1]: Created slice kubepods-besteffort-pod9594e33d_57a9_4ef5_9124_467a8f9104de.slice - libcontainer container kubepods-besteffort-pod9594e33d_57a9_4ef5_9124_467a8f9104de.slice.
Jan 13 20:19:44.924980 kubelet[2817]: I0113 20:19:44.924612 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/9594e33d-57a9-4ef5-9124-467a8f9104de-kube-proxy\") pod \"kube-proxy-gxgjv\" (UID: \"9594e33d-57a9-4ef5-9124-467a8f9104de\") " pod="kube-system/kube-proxy-gxgjv"
Jan 13 20:19:44.924980 kubelet[2817]: I0113 20:19:44.924665 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9594e33d-57a9-4ef5-9124-467a8f9104de-xtables-lock\") pod \"kube-proxy-gxgjv\" (UID: \"9594e33d-57a9-4ef5-9124-467a8f9104de\") " pod="kube-system/kube-proxy-gxgjv"
Jan 13 20:19:44.924980 kubelet[2817]: I0113 20:19:44.924691 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9594e33d-57a9-4ef5-9124-467a8f9104de-lib-modules\") pod \"kube-proxy-gxgjv\" (UID: \"9594e33d-57a9-4ef5-9124-467a8f9104de\") " pod="kube-system/kube-proxy-gxgjv"
Jan 13 20:19:44.924980 kubelet[2817]: I0113 20:19:44.924717 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj7sq\" (UniqueName: \"kubernetes.io/projected/9594e33d-57a9-4ef5-9124-467a8f9104de-kube-api-access-bj7sq\") pod \"kube-proxy-gxgjv\" (UID: \"9594e33d-57a9-4ef5-9124-467a8f9104de\") " pod="kube-system/kube-proxy-gxgjv"
Jan 13 20:19:44.942431 kubelet[2817]: I0113 20:19:44.941606 2817 topology_manager.go:215] "Topology Admit Handler" podUID="165dbc9f-0adf-4a2f-b0a7-d975c523a53f" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-8xkf8"
Jan 13 20:19:44.957794 systemd[1]: Created slice kubepods-besteffort-pod165dbc9f_0adf_4a2f_b0a7_d975c523a53f.slice - libcontainer container kubepods-besteffort-pod165dbc9f_0adf_4a2f_b0a7_d975c523a53f.slice.
Jan 13 20:19:45.026398 kubelet[2817]: I0113 20:19:45.025914 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thtl8\" (UniqueName: \"kubernetes.io/projected/165dbc9f-0adf-4a2f-b0a7-d975c523a53f-kube-api-access-thtl8\") pod \"tigera-operator-7bc55997bb-8xkf8\" (UID: \"165dbc9f-0adf-4a2f-b0a7-d975c523a53f\") " pod="tigera-operator/tigera-operator-7bc55997bb-8xkf8"
Jan 13 20:19:45.026398 kubelet[2817]: I0113 20:19:45.026008 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/165dbc9f-0adf-4a2f-b0a7-d975c523a53f-var-lib-calico\") pod \"tigera-operator-7bc55997bb-8xkf8\" (UID: \"165dbc9f-0adf-4a2f-b0a7-d975c523a53f\") " pod="tigera-operator/tigera-operator-7bc55997bb-8xkf8"
Jan 13 20:19:45.201492 containerd[1478]: time="2025-01-13T20:19:45.200950330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gxgjv,Uid:9594e33d-57a9-4ef5-9124-467a8f9104de,Namespace:kube-system,Attempt:0,}"
Jan 13 20:19:45.243003 containerd[1478]: time="2025-01-13T20:19:45.242767773Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:19:45.243003 containerd[1478]: time="2025-01-13T20:19:45.242827173Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:19:45.243003 containerd[1478]: time="2025-01-13T20:19:45.242837653Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:19:45.243003 containerd[1478]: time="2025-01-13T20:19:45.242908533Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:19:45.262784 containerd[1478]: time="2025-01-13T20:19:45.262068066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-8xkf8,Uid:165dbc9f-0adf-4a2f-b0a7-d975c523a53f,Namespace:tigera-operator,Attempt:0,}"
Jan 13 20:19:45.270830 systemd[1]: Started cri-containerd-d03d12512a870383dc5752e8a3e1a6bb2587458f40a0a40768251b73623ab45c.scope - libcontainer container d03d12512a870383dc5752e8a3e1a6bb2587458f40a0a40768251b73623ab45c.
Jan 13 20:19:45.299177 containerd[1478]: time="2025-01-13T20:19:45.298881444Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:19:45.300489 containerd[1478]: time="2025-01-13T20:19:45.299082205Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:19:45.300658 containerd[1478]: time="2025-01-13T20:19:45.300472812Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:19:45.301138 containerd[1478]: time="2025-01-13T20:19:45.301099855Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:19:45.305826 containerd[1478]: time="2025-01-13T20:19:45.305779318Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gxgjv,Uid:9594e33d-57a9-4ef5-9124-467a8f9104de,Namespace:kube-system,Attempt:0,} returns sandbox id \"d03d12512a870383dc5752e8a3e1a6bb2587458f40a0a40768251b73623ab45c\""
Jan 13 20:19:45.311681 containerd[1478]: time="2025-01-13T20:19:45.311631946Z" level=info msg="CreateContainer within sandbox \"d03d12512a870383dc5752e8a3e1a6bb2587458f40a0a40768251b73623ab45c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jan 13 20:19:45.325895 systemd[1]: Started cri-containerd-60889e28563ac19dd18f35de621660bd1465b36f720b72cec6c03b837e342729.scope - libcontainer container 60889e28563ac19dd18f35de621660bd1465b36f720b72cec6c03b837e342729.
Jan 13 20:19:45.337203 containerd[1478]: time="2025-01-13T20:19:45.337062749Z" level=info msg="CreateContainer within sandbox \"d03d12512a870383dc5752e8a3e1a6bb2587458f40a0a40768251b73623ab45c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ff63aba9bd12ae97b0d2c753f70642269f3feca93a188224c24ac28758ea46a8\""
Jan 13 20:19:45.340695 containerd[1478]: time="2025-01-13T20:19:45.339426041Z" level=info msg="StartContainer for \"ff63aba9bd12ae97b0d2c753f70642269f3feca93a188224c24ac28758ea46a8\""
Jan 13 20:19:45.374326 containerd[1478]: time="2025-01-13T20:19:45.374279250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-8xkf8,Uid:165dbc9f-0adf-4a2f-b0a7-d975c523a53f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"60889e28563ac19dd18f35de621660bd1465b36f720b72cec6c03b837e342729\""
Jan 13 20:19:45.381779 containerd[1478]: time="2025-01-13T20:19:45.381731606Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\""
Jan 13 20:19:45.385484 systemd[1]: Started cri-containerd-ff63aba9bd12ae97b0d2c753f70642269f3feca93a188224c24ac28758ea46a8.scope - libcontainer container ff63aba9bd12ae97b0d2c753f70642269f3feca93a188224c24ac28758ea46a8.
Jan 13 20:19:45.417746 containerd[1478]: time="2025-01-13T20:19:45.417685500Z" level=info msg="StartContainer for \"ff63aba9bd12ae97b0d2c753f70642269f3feca93a188224c24ac28758ea46a8\" returns successfully"
Jan 13 20:19:46.963121 containerd[1478]: time="2025-01-13T20:19:46.963056917Z" level=error msg="PullImage \"quay.io/tigera/operator:v1.36.2\" failed" error="failed to pull and unpack image \"quay.io/tigera/operator:v1.36.2\": failed to copy: httpReadSeeker: failed open: unexpected status code https://quay.io/v2/tigera/operator/blobs/sha256:50ef0a8b197b3139c841df991d460c2310a7f018c404fa06e52dfc39e4040982: 502 Bad Gateway"
Jan 13 20:19:46.963583 containerd[1478]: time="2025-01-13T20:19:46.963187718Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=6967"
Jan 13 20:19:46.963717 kubelet[2817]: E0113 20:19:46.963364 2817 remote_image.go:180] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"quay.io/tigera/operator:v1.36.2\": failed to copy: httpReadSeeker: failed open: unexpected status code https://quay.io/v2/tigera/operator/blobs/sha256:50ef0a8b197b3139c841df991d460c2310a7f018c404fa06e52dfc39e4040982: 502 Bad Gateway" image="quay.io/tigera/operator:v1.36.2"
Jan 13 20:19:46.963717 kubelet[2817]: E0113 20:19:46.963443 2817 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"quay.io/tigera/operator:v1.36.2\": failed to copy: httpReadSeeker: failed open: unexpected status code https://quay.io/v2/tigera/operator/blobs/sha256:50ef0a8b197b3139c841df991d460c2310a7f018c404fa06e52dfc39e4040982: 502 Bad Gateway" image="quay.io/tigera/operator:v1.36.2"
Jan 13 20:19:46.964865 kubelet[2817]: E0113 20:19:46.963683 2817 kuberuntime_manager.go:1256] container &Container{Name:tigera-operator,Image:quay.io/tigera/operator:v1.36.2,Command:[operator],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:tigera-operator,ValueFrom:nil,},EnvVar{Name:TIGERA_OPERATOR_INIT_IMAGE_VERSION,Value:v1.36.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-lib-calico,ReadOnly:true,MountPath:/var/lib/calico,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thtl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:kubernetes-services-endpoint,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tigera-operator-7bc55997bb-8xkf8_tigera-operator(165dbc9f-0adf-4a2f-b0a7-d975c523a53f): ErrImagePull: failed to pull and unpack image "quay.io/tigera/operator:v1.36.2": failed to copy: httpReadSeeker: failed open: unexpected status code https://quay.io/v2/tigera/operator/blobs/sha256:50ef0a8b197b3139c841df991d460c2310a7f018c404fa06e52dfc39e4040982: 502 Bad Gateway
Jan 13 20:19:46.964991 kubelet[2817]: E0113 20:19:46.963721 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with ErrImagePull: \"failed to pull and unpack image \\\"quay.io/tigera/operator:v1.36.2\\\": failed to copy: httpReadSeeker: failed open: unexpected status code https://quay.io/v2/tigera/operator/blobs/sha256:50ef0a8b197b3139c841df991d460c2310a7f018c404fa06e52dfc39e4040982: 502 Bad Gateway\"" pod="tigera-operator/tigera-operator-7bc55997bb-8xkf8" podUID="165dbc9f-0adf-4a2f-b0a7-d975c523a53f"
Jan 13 20:19:47.226819 kubelet[2817]: I0113 20:19:47.225907 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gxgjv" podStartSLOduration=3.225878378 podStartE2EDuration="3.225878378s" podCreationTimestamp="2025-01-13 20:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:19:46.338866243 +0000 UTC m=+19.251526135" watchObservedRunningTime="2025-01-13 20:19:47.225878378 +0000 UTC m=+20.138538310"
Jan 13 20:19:47.325950 kubelet[2817]: E0113 20:19:47.325688 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tigera/operator:v1.36.2\\\"\"" pod="tigera-operator/tigera-operator-7bc55997bb-8xkf8" podUID="165dbc9f-0adf-4a2f-b0a7-d975c523a53f"
Jan 13 20:20:02.208110 containerd[1478]: time="2025-01-13T20:20:02.207813866Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\""
Jan 13 20:20:03.037390 kernel: hrtimer: interrupt took 2622289 ns
Jan 13 20:20:06.192689 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1868289925.mount: Deactivated successfully.
Jan 13 20:20:06.556563 containerd[1478]: time="2025-01-13T20:20:06.556407105Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:06.558308 containerd[1478]: time="2025-01-13T20:20:06.558201951Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=19126008" Jan 13 20:20:06.559532 containerd[1478]: time="2025-01-13T20:20:06.559447955Z" level=info msg="ImageCreate event name:\"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:06.562500 containerd[1478]: time="2025-01-13T20:20:06.562428644Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:06.563966 containerd[1478]: time="2025-01-13T20:20:06.563278967Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"19120155\" in 4.355412661s" Jan 13 20:20:06.563966 containerd[1478]: time="2025-01-13T20:20:06.563324167Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\"" Jan 13 20:20:06.567217 containerd[1478]: time="2025-01-13T20:20:06.567165139Z" level=info msg="CreateContainer within sandbox \"60889e28563ac19dd18f35de621660bd1465b36f720b72cec6c03b837e342729\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 13 20:20:06.585619 containerd[1478]: time="2025-01-13T20:20:06.585547957Z" level=info msg="CreateContainer within sandbox 
\"60889e28563ac19dd18f35de621660bd1465b36f720b72cec6c03b837e342729\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84\"" Jan 13 20:20:06.586168 containerd[1478]: time="2025-01-13T20:20:06.586143639Z" level=info msg="StartContainer for \"d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84\"" Jan 13 20:20:06.621938 systemd[1]: Started cri-containerd-d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84.scope - libcontainer container d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84. Jan 13 20:20:06.654510 containerd[1478]: time="2025-01-13T20:20:06.654368694Z" level=info msg="StartContainer for \"d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84\" returns successfully" Jan 13 20:20:10.543515 kubelet[2817]: I0113 20:20:10.543316 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-8xkf8" podStartSLOduration=5.358204002 podStartE2EDuration="26.543297097s" podCreationTimestamp="2025-01-13 20:19:44 +0000 UTC" firstStartedPulling="2025-01-13 20:19:45.379881837 +0000 UTC m=+18.292541689" lastFinishedPulling="2025-01-13 20:20:06.564974932 +0000 UTC m=+39.477634784" observedRunningTime="2025-01-13 20:20:07.394891484 +0000 UTC m=+40.307551336" watchObservedRunningTime="2025-01-13 20:20:10.543297097 +0000 UTC m=+43.455956949" Jan 13 20:20:10.545816 kubelet[2817]: I0113 20:20:10.544532 2817 topology_manager.go:215] "Topology Admit Handler" podUID="aca6560c-1ca6-44a6-a2a8-7f3e123c3251" podNamespace="calico-system" podName="calico-typha-666d87df88-scng9" Jan 13 20:20:10.557668 systemd[1]: Created slice kubepods-besteffort-podaca6560c_1ca6_44a6_a2a8_7f3e123c3251.slice - libcontainer container kubepods-besteffort-podaca6560c_1ca6_44a6_a2a8_7f3e123c3251.slice. 
Jan 13 20:20:10.609507 kubelet[2817]: I0113 20:20:10.609429 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwx6\" (UniqueName: \"kubernetes.io/projected/aca6560c-1ca6-44a6-a2a8-7f3e123c3251-kube-api-access-qbwx6\") pod \"calico-typha-666d87df88-scng9\" (UID: \"aca6560c-1ca6-44a6-a2a8-7f3e123c3251\") " pod="calico-system/calico-typha-666d87df88-scng9" Jan 13 20:20:10.609892 kubelet[2817]: I0113 20:20:10.609543 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aca6560c-1ca6-44a6-a2a8-7f3e123c3251-tigera-ca-bundle\") pod \"calico-typha-666d87df88-scng9\" (UID: \"aca6560c-1ca6-44a6-a2a8-7f3e123c3251\") " pod="calico-system/calico-typha-666d87df88-scng9" Jan 13 20:20:10.609892 kubelet[2817]: I0113 20:20:10.609571 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/aca6560c-1ca6-44a6-a2a8-7f3e123c3251-typha-certs\") pod \"calico-typha-666d87df88-scng9\" (UID: \"aca6560c-1ca6-44a6-a2a8-7f3e123c3251\") " pod="calico-system/calico-typha-666d87df88-scng9" Jan 13 20:20:10.674660 kubelet[2817]: I0113 20:20:10.672851 2817 topology_manager.go:215] "Topology Admit Handler" podUID="6ace5514-3f20-4dc8-a659-9a446e1b2450" podNamespace="calico-system" podName="calico-node-lp5z6" Jan 13 20:20:10.691621 systemd[1]: Created slice kubepods-besteffort-pod6ace5514_3f20_4dc8_a659_9a446e1b2450.slice - libcontainer container kubepods-besteffort-pod6ace5514_3f20_4dc8_a659_9a446e1b2450.slice. 
Jan 13 20:20:10.710283 kubelet[2817]: I0113 20:20:10.710222 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-flexvol-driver-host\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.710678 kubelet[2817]: I0113 20:20:10.710656 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-policysync\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.710678 kubelet[2817]: I0113 20:20:10.710751 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-var-run-calico\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.710678 kubelet[2817]: I0113 20:20:10.710779 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-var-lib-calico\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.711150 kubelet[2817]: I0113 20:20:10.711019 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbj9f\" (UniqueName: \"kubernetes.io/projected/6ace5514-3f20-4dc8-a659-9a446e1b2450-kube-api-access-nbj9f\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.711150 kubelet[2817]: I0113 
20:20:10.711077 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6ace5514-3f20-4dc8-a659-9a446e1b2450-node-certs\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.711339 kubelet[2817]: I0113 20:20:10.711257 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-xtables-lock\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.711339 kubelet[2817]: I0113 20:20:10.711297 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-cni-net-dir\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.711626 kubelet[2817]: I0113 20:20:10.711527 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-cni-log-dir\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.711626 kubelet[2817]: I0113 20:20:10.711574 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-cni-bin-dir\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.713404 kubelet[2817]: I0113 20:20:10.711833 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ace5514-3f20-4dc8-a659-9a446e1b2450-lib-modules\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.713838 kubelet[2817]: I0113 20:20:10.711868 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ace5514-3f20-4dc8-a659-9a446e1b2450-tigera-ca-bundle\") pod \"calico-node-lp5z6\" (UID: \"6ace5514-3f20-4dc8-a659-9a446e1b2450\") " pod="calico-system/calico-node-lp5z6" Jan 13 20:20:10.795938 kubelet[2817]: I0113 20:20:10.795323 2817 topology_manager.go:215] "Topology Admit Handler" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" podNamespace="calico-system" podName="csi-node-driver-vhlrh" Jan 13 20:20:10.795938 kubelet[2817]: E0113 20:20:10.795647 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" Jan 13 20:20:10.815679 kubelet[2817]: E0113 20:20:10.815442 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.815679 kubelet[2817]: W0113 20:20:10.815533 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.815679 kubelet[2817]: E0113 20:20:10.815560 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.816900 kubelet[2817]: E0113 20:20:10.816868 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.817244 kubelet[2817]: W0113 20:20:10.817005 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.817244 kubelet[2817]: E0113 20:20:10.817039 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.817644 kubelet[2817]: E0113 20:20:10.817627 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.817892 kubelet[2817]: W0113 20:20:10.817802 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.818131 kubelet[2817]: E0113 20:20:10.817990 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.818431 kubelet[2817]: E0113 20:20:10.818400 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.818660 kubelet[2817]: W0113 20:20:10.818414 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.818660 kubelet[2817]: E0113 20:20:10.818622 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.819408 kubelet[2817]: E0113 20:20:10.819264 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.819408 kubelet[2817]: W0113 20:20:10.819279 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.819881 kubelet[2817]: E0113 20:20:10.819670 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.820363 kubelet[2817]: E0113 20:20:10.820350 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.820518 kubelet[2817]: W0113 20:20:10.820480 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.820797 kubelet[2817]: E0113 20:20:10.820615 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.821858 kubelet[2817]: E0113 20:20:10.821673 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.821858 kubelet[2817]: W0113 20:20:10.821689 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.822510 kubelet[2817]: E0113 20:20:10.822227 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.822510 kubelet[2817]: W0113 20:20:10.822243 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.823561 kubelet[2817]: E0113 20:20:10.823141 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.823561 kubelet[2817]: E0113 20:20:10.823173 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.823801 kubelet[2817]: E0113 20:20:10.823784 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.823962 kubelet[2817]: W0113 20:20:10.823859 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.824136 kubelet[2817]: E0113 20:20:10.824122 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.824368 kubelet[2817]: E0113 20:20:10.824296 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.824477 kubelet[2817]: W0113 20:20:10.824427 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.824766 kubelet[2817]: E0113 20:20:10.824748 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.825415 kubelet[2817]: E0113 20:20:10.825324 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.825415 kubelet[2817]: W0113 20:20:10.825340 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.825710 kubelet[2817]: E0113 20:20:10.825664 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.826696 kubelet[2817]: E0113 20:20:10.826527 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.826696 kubelet[2817]: W0113 20:20:10.826542 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.827025 kubelet[2817]: E0113 20:20:10.826801 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.827351 kubelet[2817]: E0113 20:20:10.827173 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.827351 kubelet[2817]: W0113 20:20:10.827187 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.827886 kubelet[2817]: E0113 20:20:10.827668 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.828938 kubelet[2817]: E0113 20:20:10.828913 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.829315 kubelet[2817]: W0113 20:20:10.829253 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.830847 kubelet[2817]: E0113 20:20:10.830649 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.831221 kubelet[2817]: E0113 20:20:10.831187 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.831221 kubelet[2817]: W0113 20:20:10.831203 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.831943 kubelet[2817]: E0113 20:20:10.831762 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.832165 kubelet[2817]: E0113 20:20:10.832150 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.834319 kubelet[2817]: W0113 20:20:10.832224 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.835227 kubelet[2817]: E0113 20:20:10.834856 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.835227 kubelet[2817]: E0113 20:20:10.835006 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.835227 kubelet[2817]: W0113 20:20:10.835019 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.835699 kubelet[2817]: E0113 20:20:10.835663 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.836193 kubelet[2817]: E0113 20:20:10.835995 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.836193 kubelet[2817]: W0113 20:20:10.836010 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.836563 kubelet[2817]: E0113 20:20:10.836387 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.836857 kubelet[2817]: E0113 20:20:10.836808 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.836975 kubelet[2817]: W0113 20:20:10.836914 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.837304 kubelet[2817]: E0113 20:20:10.837206 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.838942 kubelet[2817]: E0113 20:20:10.838920 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.839155 kubelet[2817]: W0113 20:20:10.839052 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.839284 kubelet[2817]: E0113 20:20:10.839250 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.839735 kubelet[2817]: E0113 20:20:10.839624 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.839735 kubelet[2817]: W0113 20:20:10.839644 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.840008 kubelet[2817]: E0113 20:20:10.839875 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.840218 kubelet[2817]: E0113 20:20:10.840137 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.840323 kubelet[2817]: W0113 20:20:10.840301 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.840480 kubelet[2817]: E0113 20:20:10.840436 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.840796 kubelet[2817]: E0113 20:20:10.840685 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.840796 kubelet[2817]: W0113 20:20:10.840698 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.840957 kubelet[2817]: E0113 20:20:10.840929 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.841235 kubelet[2817]: E0113 20:20:10.841136 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.841235 kubelet[2817]: W0113 20:20:10.841150 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.841331 kubelet[2817]: E0113 20:20:10.841318 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:10.841618 kubelet[2817]: E0113 20:20:10.841560 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.841618 kubelet[2817]: W0113 20:20:10.841574 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.843139 kubelet[2817]: E0113 20:20:10.841713 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:10.843697 kubelet[2817]: E0113 20:20:10.843565 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:10.843800 kubelet[2817]: W0113 20:20:10.843784 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:10.843937 kubelet[2817]: E0113 20:20:10.843923 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 13 20:20:10.844278 kubelet[2817]: E0113 20:20:10.844263 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:20:10.844385 kubelet[2817]: W0113 20:20:10.844371 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:20:10.844679 kubelet[2817]: E0113 20:20:10.844625 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:20:10.874888 containerd[1478]: time="2025-01-13T20:20:10.874846111Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-666d87df88-scng9,Uid:aca6560c-1ca6-44a6-a2a8-7f3e123c3251,Namespace:calico-system,Attempt:0,}"
Jan 13 20:20:10.924181 kubelet[2817]: I0113 20:20:10.923952 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d410432e-4da0-436d-8d3b-2586cacab46b-varrun\") pod \"csi-node-driver-vhlrh\" (UID: \"d410432e-4da0-436d-8d3b-2586cacab46b\") " pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:10.925653 kubelet[2817]: I0113 20:20:10.925317 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d410432e-4da0-436d-8d3b-2586cacab46b-kubelet-dir\") pod \"csi-node-driver-vhlrh\" (UID: \"d410432e-4da0-436d-8d3b-2586cacab46b\") " pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:10.928263 kubelet[2817]: I0113 20:20:10.928216 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l7wz\" (UniqueName: \"kubernetes.io/projected/d410432e-4da0-436d-8d3b-2586cacab46b-kube-api-access-6l7wz\") pod \"csi-node-driver-vhlrh\" (UID: \"d410432e-4da0-436d-8d3b-2586cacab46b\") " pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:10.934480 kubelet[2817]: I0113 20:20:10.934281 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d410432e-4da0-436d-8d3b-2586cacab46b-registration-dir\") pod \"csi-node-driver-vhlrh\" (UID: \"d410432e-4da0-436d-8d3b-2586cacab46b\") " pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:10.939107 kubelet[2817]: I0113 20:20:10.939073 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d410432e-4da0-436d-8d3b-2586cacab46b-socket-dir\") pod \"csi-node-driver-vhlrh\" (UID: \"d410432e-4da0-436d-8d3b-2586cacab46b\") " pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:10.948819 containerd[1478]: time="2025-01-13T20:20:10.948563567Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 20:20:10.948819 containerd[1478]: time="2025-01-13T20:20:10.948649568Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 20:20:10.948819 containerd[1478]: time="2025-01-13T20:20:10.948662608Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:20:10.948819 containerd[1478]: time="2025-01-13T20:20:10.948767328Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 20:20:10.987039 systemd[1]: Started cri-containerd-073c8585f64bac9e9aa5386d58e94a9944fd4b164a776be04bde15fe52493668.scope - libcontainer container 073c8585f64bac9e9aa5386d58e94a9944fd4b164a776be04bde15fe52493668.
Jan 13 20:20:10.999475 containerd[1478]: time="2025-01-13T20:20:10.999409517Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-lp5z6,Uid:6ace5514-3f20-4dc8-a659-9a446e1b2450,Namespace:calico-system,Attempt:0,}" Jan 13 20:20:11.041542 kubelet[2817]: E0113 20:20:11.041300 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.041897 kubelet[2817]: W0113 20:20:11.041777 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.041897 kubelet[2817]: E0113 20:20:11.041829 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.042928 kubelet[2817]: E0113 20:20:11.042774 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.042928 kubelet[2817]: W0113 20:20:11.042797 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.042928 kubelet[2817]: E0113 20:20:11.042821 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.044112 kubelet[2817]: E0113 20:20:11.044006 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.044112 kubelet[2817]: W0113 20:20:11.044040 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.044112 kubelet[2817]: E0113 20:20:11.044075 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.044698 kubelet[2817]: E0113 20:20:11.044677 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.044926 kubelet[2817]: W0113 20:20:11.044706 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.045428 kubelet[2817]: E0113 20:20:11.045388 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.046481 kubelet[2817]: E0113 20:20:11.046186 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.046481 kubelet[2817]: W0113 20:20:11.046208 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.049238 kubelet[2817]: E0113 20:20:11.049047 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.049238 kubelet[2817]: W0113 20:20:11.049079 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.050927 kubelet[2817]: E0113 20:20:11.050817 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.050927 kubelet[2817]: E0113 20:20:11.050875 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.052930 kubelet[2817]: E0113 20:20:11.052881 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.052930 kubelet[2817]: W0113 20:20:11.052916 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.052930 kubelet[2817]: E0113 20:20:11.052967 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.054720 kubelet[2817]: E0113 20:20:11.054673 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.054720 kubelet[2817]: W0113 20:20:11.054708 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.055064 kubelet[2817]: E0113 20:20:11.054975 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.056571 kubelet[2817]: E0113 20:20:11.056368 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.056571 kubelet[2817]: W0113 20:20:11.056392 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.057079 kubelet[2817]: E0113 20:20:11.056908 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.059117 kubelet[2817]: E0113 20:20:11.059022 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.059117 kubelet[2817]: W0113 20:20:11.059055 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.059774 kubelet[2817]: E0113 20:20:11.059496 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.062145 containerd[1478]: time="2025-01-13T20:20:11.061512696Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:20:11.062145 containerd[1478]: time="2025-01-13T20:20:11.061587976Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:20:11.062145 containerd[1478]: time="2025-01-13T20:20:11.061635456Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:11.062145 containerd[1478]: time="2025-01-13T20:20:11.061728577Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:11.063550 kubelet[2817]: E0113 20:20:11.063270 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.063550 kubelet[2817]: W0113 20:20:11.063300 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.065202 kubelet[2817]: E0113 20:20:11.064898 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.066158 kubelet[2817]: E0113 20:20:11.066079 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.066158 kubelet[2817]: W0113 20:20:11.066103 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.066791 kubelet[2817]: E0113 20:20:11.066650 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.067178 kubelet[2817]: E0113 20:20:11.067135 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.067178 kubelet[2817]: W0113 20:20:11.067175 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.067455 kubelet[2817]: E0113 20:20:11.067303 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.067620 kubelet[2817]: E0113 20:20:11.067546 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.067620 kubelet[2817]: W0113 20:20:11.067567 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.067784 kubelet[2817]: E0113 20:20:11.067640 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.067981 kubelet[2817]: E0113 20:20:11.067869 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.067981 kubelet[2817]: W0113 20:20:11.067883 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.068225 kubelet[2817]: E0113 20:20:11.068020 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.068447 kubelet[2817]: E0113 20:20:11.068429 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.068447 kubelet[2817]: W0113 20:20:11.068443 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.068779 kubelet[2817]: E0113 20:20:11.068686 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.069218 kubelet[2817]: E0113 20:20:11.069076 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.069218 kubelet[2817]: W0113 20:20:11.069092 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.069218 kubelet[2817]: E0113 20:20:11.069176 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.070107 kubelet[2817]: E0113 20:20:11.070082 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.070107 kubelet[2817]: W0113 20:20:11.070102 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.070665 kubelet[2817]: E0113 20:20:11.070415 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.070955 kubelet[2817]: E0113 20:20:11.070926 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.071831 kubelet[2817]: W0113 20:20:11.071697 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.072032 kubelet[2817]: E0113 20:20:11.071933 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.072724 containerd[1478]: time="2025-01-13T20:20:11.072682728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-666d87df88-scng9,Uid:aca6560c-1ca6-44a6-a2a8-7f3e123c3251,Namespace:calico-system,Attempt:0,} returns sandbox id \"073c8585f64bac9e9aa5386d58e94a9944fd4b164a776be04bde15fe52493668\"" Jan 13 20:20:11.073080 kubelet[2817]: E0113 20:20:11.072956 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.073080 kubelet[2817]: W0113 20:20:11.072992 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.073618 kubelet[2817]: E0113 20:20:11.073214 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.074037 kubelet[2817]: E0113 20:20:11.073835 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.074037 kubelet[2817]: W0113 20:20:11.073854 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.075317 kubelet[2817]: E0113 20:20:11.075172 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.076952 kubelet[2817]: E0113 20:20:11.076373 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.076952 kubelet[2817]: W0113 20:20:11.076399 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.077409 kubelet[2817]: E0113 20:20:11.077384 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.079295 kubelet[2817]: E0113 20:20:11.078826 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.079295 kubelet[2817]: W0113 20:20:11.078858 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.080521 kubelet[2817]: E0113 20:20:11.080110 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.080521 kubelet[2817]: W0113 20:20:11.080134 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.080521 kubelet[2817]: E0113 20:20:11.080234 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.080521 kubelet[2817]: E0113 20:20:11.080352 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.081038 containerd[1478]: time="2025-01-13T20:20:11.080965832Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Jan 13 20:20:11.081952 kubelet[2817]: E0113 20:20:11.081926 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.082197 kubelet[2817]: W0113 20:20:11.082042 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.082535 kubelet[2817]: E0113 20:20:11.082286 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:11.103538 systemd[1]: Started cri-containerd-abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995.scope - libcontainer container abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995. Jan 13 20:20:11.106524 kubelet[2817]: E0113 20:20:11.106272 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:11.106524 kubelet[2817]: W0113 20:20:11.106308 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:11.106524 kubelet[2817]: E0113 20:20:11.106334 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:11.143652 containerd[1478]: time="2025-01-13T20:20:11.143399292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-lp5z6,Uid:6ace5514-3f20-4dc8-a659-9a446e1b2450,Namespace:calico-system,Attempt:0,} returns sandbox id \"abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995\"" Jan 13 20:20:12.693246 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2714530462.mount: Deactivated successfully. Jan 13 20:20:13.205131 kubelet[2817]: E0113 20:20:13.205069 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" Jan 13 20:20:13.488142 containerd[1478]: time="2025-01-13T20:20:13.486985485Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:13.489913 containerd[1478]: time="2025-01-13T20:20:13.489680533Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=29231308" Jan 13 20:20:13.491986 containerd[1478]: time="2025-01-13T20:20:13.491205417Z" level=info msg="ImageCreate event name:\"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:13.496950 containerd[1478]: time="2025-01-13T20:20:13.496902433Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:13.507864 containerd[1478]: time="2025-01-13T20:20:13.507807983Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id 
\"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"29231162\" in 2.426784591s" Jan 13 20:20:13.508202 containerd[1478]: time="2025-01-13T20:20:13.508060064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\"" Jan 13 20:20:13.512785 containerd[1478]: time="2025-01-13T20:20:13.512115235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 13 20:20:13.532648 containerd[1478]: time="2025-01-13T20:20:13.532588213Z" level=info msg="CreateContainer within sandbox \"073c8585f64bac9e9aa5386d58e94a9944fd4b164a776be04bde15fe52493668\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 13 20:20:13.549232 containerd[1478]: time="2025-01-13T20:20:13.549182539Z" level=info msg="CreateContainer within sandbox \"073c8585f64bac9e9aa5386d58e94a9944fd4b164a776be04bde15fe52493668\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e8501f37deabf757a7367b6534174eebc2ce2af9a0743c6db70a74d8553b7e02\"" Jan 13 20:20:13.550099 containerd[1478]: time="2025-01-13T20:20:13.550068141Z" level=info msg="StartContainer for \"e8501f37deabf757a7367b6534174eebc2ce2af9a0743c6db70a74d8553b7e02\"" Jan 13 20:20:13.586838 systemd[1]: Started cri-containerd-e8501f37deabf757a7367b6534174eebc2ce2af9a0743c6db70a74d8553b7e02.scope - libcontainer container e8501f37deabf757a7367b6534174eebc2ce2af9a0743c6db70a74d8553b7e02. 
Jan 13 20:20:13.630797 containerd[1478]: time="2025-01-13T20:20:13.630350326Z" level=info msg="StartContainer for \"e8501f37deabf757a7367b6534174eebc2ce2af9a0743c6db70a74d8553b7e02\" returns successfully" Jan 13 20:20:14.437658 kubelet[2817]: I0113 20:20:14.437150 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-666d87df88-scng9" podStartSLOduration=2.004711513 podStartE2EDuration="4.43712824s" podCreationTimestamp="2025-01-13 20:20:10 +0000 UTC" firstStartedPulling="2025-01-13 20:20:11.078480585 +0000 UTC m=+43.991140397" lastFinishedPulling="2025-01-13 20:20:13.510897192 +0000 UTC m=+46.423557124" observedRunningTime="2025-01-13 20:20:14.416125742 +0000 UTC m=+47.328785634" watchObservedRunningTime="2025-01-13 20:20:14.43712824 +0000 UTC m=+47.349788092" Jan 13 20:20:14.449771 kubelet[2817]: E0113 20:20:14.449474 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:14.449771 kubelet[2817]: W0113 20:20:14.449503 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:14.449771 kubelet[2817]: E0113 20:20:14.449615 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:14.451161 kubelet[2817]: E0113 20:20:14.449875 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:14.451161 kubelet[2817]: W0113 20:20:14.449886 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:14.451161 kubelet[2817]: E0113 20:20:14.449897 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:14.452725 kubelet[2817]: E0113 20:20:14.451591 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:14.452725 kubelet[2817]: W0113 20:20:14.451721 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:14.452725 kubelet[2817]: E0113 20:20:14.451746 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:14.453954 kubelet[2817]: E0113 20:20:14.453423 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:14.453954 kubelet[2817]: W0113 20:20:14.453442 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:14.453954 kubelet[2817]: E0113 20:20:14.453856 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:14.454634 kubelet[2817]: E0113 20:20:14.454616 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:14.454808 kubelet[2817]: W0113 20:20:14.454681 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:14.454808 kubelet[2817]: E0113 20:20:14.454696 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:14.455702 kubelet[2817]: E0113 20:20:14.455422 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:14.455702 kubelet[2817]: W0113 20:20:14.455498 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:14.455911 kubelet[2817]: E0113 20:20:14.455827 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:14.456475 kubelet[2817]: E0113 20:20:14.456434 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:14.456629 kubelet[2817]: W0113 20:20:14.456464 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:14.456629 kubelet[2817]: E0113 20:20:14.456582 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 13 20:20:15.207082 kubelet[2817]: E0113 20:20:15.205639 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b"
Jan 13 20:20:15.468912 kubelet[2817]: E0113 20:20:15.468802 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:20:15.468912 kubelet[2817]: W0113 20:20:15.468831 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:20:15.469405 kubelet[2817]: E0113 20:20:15.468943 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 20:20:15.469405 kubelet[2817]: E0113 20:20:15.469361 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 20:20:15.469405 kubelet[2817]: W0113 20:20:15.469373 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 20:20:15.469405 kubelet[2817]: E0113 20:20:15.469396 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:15.500004 kubelet[2817]: E0113 20:20:15.499964 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:15.500004 kubelet[2817]: W0113 20:20:15.499978 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:15.500296 kubelet[2817]: E0113 20:20:15.500120 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:15.500571 kubelet[2817]: E0113 20:20:15.500557 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:15.500944 kubelet[2817]: W0113 20:20:15.500842 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:15.501340 kubelet[2817]: E0113 20:20:15.501014 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:15.501534 kubelet[2817]: E0113 20:20:15.501521 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:15.501700 kubelet[2817]: W0113 20:20:15.501656 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:15.501737 kubelet[2817]: E0113 20:20:15.501711 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 20:20:15.502532 kubelet[2817]: E0113 20:20:15.502501 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 20:20:15.502905 kubelet[2817]: W0113 20:20:15.502641 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 20:20:15.502905 kubelet[2817]: E0113 20:20:15.502667 2817 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 20:20:15.525133 containerd[1478]: time="2025-01-13T20:20:15.525079047Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:15.526112 containerd[1478]: time="2025-01-13T20:20:15.525679409Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5117811" Jan 13 20:20:15.526944 containerd[1478]: time="2025-01-13T20:20:15.526897692Z" level=info msg="ImageCreate event name:\"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:15.529974 containerd[1478]: time="2025-01-13T20:20:15.529924980Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:15.531128 containerd[1478]: time="2025-01-13T20:20:15.530887503Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6487425\" in 2.018696467s" Jan 13 20:20:15.531128 containerd[1478]: time="2025-01-13T20:20:15.530936503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\"" Jan 13 20:20:15.535009 containerd[1478]: time="2025-01-13T20:20:15.534966834Z" level=info msg="CreateContainer within sandbox \"abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 13 20:20:15.566703 containerd[1478]: time="2025-01-13T20:20:15.566409999Z" level=info msg="CreateContainer within sandbox \"abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e\"" Jan 13 20:20:15.568942 containerd[1478]: time="2025-01-13T20:20:15.567465722Z" level=info msg="StartContainer for \"eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e\"" Jan 13 20:20:15.604862 systemd[1]: Started cri-containerd-eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e.scope - libcontainer container eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e. Jan 13 20:20:15.656304 containerd[1478]: time="2025-01-13T20:20:15.656056482Z" level=info msg="StartContainer for \"eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e\" returns successfully" Jan 13 20:20:15.679986 systemd[1]: cri-containerd-eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e.scope: Deactivated successfully. Jan 13 20:20:15.709973 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e-rootfs.mount: Deactivated successfully. 
Jan 13 20:20:15.879262 containerd[1478]: time="2025-01-13T20:20:15.878922605Z" level=info msg="shim disconnected" id=eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e namespace=k8s.io Jan 13 20:20:15.879262 containerd[1478]: time="2025-01-13T20:20:15.878995525Z" level=warning msg="cleaning up after shim disconnected" id=eaed979140597faaa15c93d1846894d50be3c40f07494341d3dc5571a760df5e namespace=k8s.io Jan 13 20:20:15.879262 containerd[1478]: time="2025-01-13T20:20:15.879006885Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:20:15.898267 containerd[1478]: time="2025-01-13T20:20:15.898140577Z" level=warning msg="cleanup warnings time=\"2025-01-13T20:20:15Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 13 20:20:16.411655 containerd[1478]: time="2025-01-13T20:20:16.410563906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 13 20:20:17.208500 kubelet[2817]: E0113 20:20:17.206356 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" Jan 13 20:20:19.176662 containerd[1478]: time="2025-01-13T20:20:19.176579691Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:19.178776 containerd[1478]: time="2025-01-13T20:20:19.178726257Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=89703123" Jan 13 20:20:19.180033 containerd[1478]: time="2025-01-13T20:20:19.179986100Z" level=info msg="ImageCreate event name:\"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:19.184023 containerd[1478]: time="2025-01-13T20:20:19.183972710Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:19.185992 containerd[1478]: time="2025-01-13T20:20:19.185346914Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"91072777\" in 2.774731248s" Jan 13 20:20:19.185992 containerd[1478]: time="2025-01-13T20:20:19.185801195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\"" Jan 13 20:20:19.190555 containerd[1478]: time="2025-01-13T20:20:19.190404047Z" level=info msg="CreateContainer within sandbox \"abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 13 20:20:19.205699 kubelet[2817]: E0113 20:20:19.204976 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" Jan 13 20:20:19.218755 containerd[1478]: time="2025-01-13T20:20:19.218381798Z" level=info msg="CreateContainer within sandbox \"abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id 
\"4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0\"" Jan 13 20:20:19.219197 containerd[1478]: time="2025-01-13T20:20:19.219163720Z" level=info msg="StartContainer for \"4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0\"" Jan 13 20:20:19.264872 systemd[1]: Started cri-containerd-4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0.scope - libcontainer container 4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0. Jan 13 20:20:19.305541 containerd[1478]: time="2025-01-13T20:20:19.305136699Z" level=info msg="StartContainer for \"4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0\" returns successfully" Jan 13 20:20:19.842111 containerd[1478]: time="2025-01-13T20:20:19.842034025Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 13 20:20:19.848324 systemd[1]: cri-containerd-4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0.scope: Deactivated successfully. 
Jan 13 20:20:19.951333 kubelet[2817]: I0113 20:20:19.951295 2817 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Jan 13 20:20:19.986826 kubelet[2817]: I0113 20:20:19.985756 2817 topology_manager.go:215] "Topology Admit Handler" podUID="ff1a3779-f857-4921-b0c5-fdad56861f50" podNamespace="kube-system" podName="coredns-7db6d8ff4d-khp7b" Jan 13 20:20:19.988627 kubelet[2817]: I0113 20:20:19.988468 2817 topology_manager.go:215] "Topology Admit Handler" podUID="acbb6d2d-5611-4557-91bf-b12ca46c13f5" podNamespace="kube-system" podName="coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:19.990467 kubelet[2817]: I0113 20:20:19.989294 2817 topology_manager.go:215] "Topology Admit Handler" podUID="2fb62ec7-4c06-48e1-aa87-7b62ac4da84a" podNamespace="calico-system" podName="calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:19.992224 kubelet[2817]: I0113 20:20:19.992109 2817 topology_manager.go:215] "Topology Admit Handler" podUID="3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4" podNamespace="calico-apiserver" podName="calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:19.994349 kubelet[2817]: I0113 20:20:19.993548 2817 topology_manager.go:215] "Topology Admit Handler" podUID="189c7963-e6bf-46b5-b6d5-9d268e857385" podNamespace="calico-apiserver" podName="calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:20.002895 systemd[1]: Created slice kubepods-burstable-podff1a3779_f857_4921_b0c5_fdad56861f50.slice - libcontainer container kubepods-burstable-podff1a3779_f857_4921_b0c5_fdad56861f50.slice. Jan 13 20:20:20.017689 systemd[1]: Created slice kubepods-burstable-podacbb6d2d_5611_4557_91bf_b12ca46c13f5.slice - libcontainer container kubepods-burstable-podacbb6d2d_5611_4557_91bf_b12ca46c13f5.slice. 
Jan 13 20:20:20.029473 kubelet[2817]: I0113 20:20:20.028872 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/189c7963-e6bf-46b5-b6d5-9d268e857385-calico-apiserver-certs\") pod \"calico-apiserver-7776878d6f-zsmfk\" (UID: \"189c7963-e6bf-46b5-b6d5-9d268e857385\") " pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:20.029473 kubelet[2817]: I0113 20:20:20.028923 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff1a3779-f857-4921-b0c5-fdad56861f50-config-volume\") pod \"coredns-7db6d8ff4d-khp7b\" (UID: \"ff1a3779-f857-4921-b0c5-fdad56861f50\") " pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:20.029473 kubelet[2817]: I0113 20:20:20.028949 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh7pg\" (UniqueName: \"kubernetes.io/projected/3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4-kube-api-access-bh7pg\") pod \"calico-apiserver-7776878d6f-kzg6k\" (UID: \"3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4\") " pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:20.029473 kubelet[2817]: I0113 20:20:20.028968 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4-calico-apiserver-certs\") pod \"calico-apiserver-7776878d6f-kzg6k\" (UID: \"3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4\") " pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:20.029473 kubelet[2817]: I0113 20:20:20.028990 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fb62ec7-4c06-48e1-aa87-7b62ac4da84a-tigera-ca-bundle\") pod 
\"calico-kube-controllers-d9b896c9c-x5cqp\" (UID: \"2fb62ec7-4c06-48e1-aa87-7b62ac4da84a\") " pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:20.029730 kubelet[2817]: I0113 20:20:20.029008 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9nc\" (UniqueName: \"kubernetes.io/projected/acbb6d2d-5611-4557-91bf-b12ca46c13f5-kube-api-access-xt9nc\") pod \"coredns-7db6d8ff4d-8n6tx\" (UID: \"acbb6d2d-5611-4557-91bf-b12ca46c13f5\") " pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:20.029730 kubelet[2817]: I0113 20:20:20.029026 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j62c\" (UniqueName: \"kubernetes.io/projected/ff1a3779-f857-4921-b0c5-fdad56861f50-kube-api-access-9j62c\") pod \"coredns-7db6d8ff4d-khp7b\" (UID: \"ff1a3779-f857-4921-b0c5-fdad56861f50\") " pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:20.029730 kubelet[2817]: I0113 20:20:20.029042 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbb6d2d-5611-4557-91bf-b12ca46c13f5-config-volume\") pod \"coredns-7db6d8ff4d-8n6tx\" (UID: \"acbb6d2d-5611-4557-91bf-b12ca46c13f5\") " pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:20.029730 kubelet[2817]: I0113 20:20:20.029063 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5djqh\" (UniqueName: \"kubernetes.io/projected/2fb62ec7-4c06-48e1-aa87-7b62ac4da84a-kube-api-access-5djqh\") pod \"calico-kube-controllers-d9b896c9c-x5cqp\" (UID: \"2fb62ec7-4c06-48e1-aa87-7b62ac4da84a\") " pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:20.029730 kubelet[2817]: I0113 20:20:20.029082 2817 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m6zbc\" (UniqueName: \"kubernetes.io/projected/189c7963-e6bf-46b5-b6d5-9d268e857385-kube-api-access-m6zbc\") pod \"calico-apiserver-7776878d6f-zsmfk\" (UID: \"189c7963-e6bf-46b5-b6d5-9d268e857385\") " pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:20.033698 systemd[1]: Created slice kubepods-besteffort-pod2fb62ec7_4c06_48e1_aa87_7b62ac4da84a.slice - libcontainer container kubepods-besteffort-pod2fb62ec7_4c06_48e1_aa87_7b62ac4da84a.slice. Jan 13 20:20:20.043785 systemd[1]: Created slice kubepods-besteffort-pod3b18b5d2_f018_4c67_a9be_1b6f49d4b5e4.slice - libcontainer container kubepods-besteffort-pod3b18b5d2_f018_4c67_a9be_1b6f49d4b5e4.slice. Jan 13 20:20:20.048966 containerd[1478]: time="2025-01-13T20:20:20.048893310Z" level=info msg="shim disconnected" id=4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0 namespace=k8s.io Jan 13 20:20:20.048966 containerd[1478]: time="2025-01-13T20:20:20.048959790Z" level=warning msg="cleaning up after shim disconnected" id=4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0 namespace=k8s.io Jan 13 20:20:20.048966 containerd[1478]: time="2025-01-13T20:20:20.048974510Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 20:20:20.060170 systemd[1]: Created slice kubepods-besteffort-pod189c7963_e6bf_46b5_b6d5_9d268e857385.slice - libcontainer container kubepods-besteffort-pod189c7963_e6bf_46b5_b6d5_9d268e857385.slice. Jan 13 20:20:20.214088 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4f939010cde373b7feaaebc62c0ea1a46a8a5dc1f126a27b228abb644f6cd4d0-rootfs.mount: Deactivated successfully. 
Jan 13 20:20:20.313559 containerd[1478]: time="2025-01-13T20:20:20.312966092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:0,}" Jan 13 20:20:20.330223 containerd[1478]: time="2025-01-13T20:20:20.329808014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:0,}" Jan 13 20:20:20.346807 containerd[1478]: time="2025-01-13T20:20:20.346683017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:0,}" Jan 13 20:20:20.356271 containerd[1478]: time="2025-01-13T20:20:20.356218561Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:0,}" Jan 13 20:20:20.365492 containerd[1478]: time="2025-01-13T20:20:20.365354304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:0,}" Jan 13 20:20:20.435894 containerd[1478]: time="2025-01-13T20:20:20.435385279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 13 20:20:20.538011 containerd[1478]: time="2025-01-13T20:20:20.537823496Z" level=error msg="Failed to destroy network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:20.541843 containerd[1478]: time="2025-01-13T20:20:20.538157977Z" level=error msg="encountered an error cleaning up failed sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\", marking 
sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:20.541843 containerd[1478]: time="2025-01-13T20:20:20.538245177Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:20.542129 kubelet[2817]: E0113 20:20:20.538487 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:20.542129 kubelet[2817]: E0113 20:20:20.538553 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:20.542129 kubelet[2817]: E0113 20:20:20.538575 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:20.544170 kubelet[2817]: E0113 20:20:20.538860 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-khp7b" podUID="ff1a3779-f857-4921-b0c5-fdad56861f50" Jan 13 20:20:20.550623 containerd[1478]: time="2025-01-13T20:20:20.549709166Z" level=error msg="Failed to destroy network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:20.550623 containerd[1478]: time="2025-01-13T20:20:20.550388248Z" level=error msg="encountered an error cleaning up failed sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:20.551729 containerd[1478]: time="2025-01-13T20:20:20.551405610Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:20.553320 kubelet[2817]: E0113 20:20:20.553058 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:20.553320 kubelet[2817]: E0113 20:20:20.553138 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:20.553320 kubelet[2817]: E0113 20:20:20.553160 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:20.553968 kubelet[2817]: E0113 20:20:20.553255 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8n6tx" podUID="acbb6d2d-5611-4557-91bf-b12ca46c13f5"
Jan 13 20:20:20.588501 containerd[1478]: time="2025-01-13T20:20:20.586337418Z" level=error msg="Failed to destroy network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.588860 containerd[1478]: time="2025-01-13T20:20:20.588825904Z" level=error msg="encountered an error cleaning up failed sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.589000 containerd[1478]: time="2025-01-13T20:20:20.588979385Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.589836 kubelet[2817]: E0113 20:20:20.589288 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.589836 kubelet[2817]: E0113 20:20:20.589353 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp"
Jan 13 20:20:20.589836 kubelet[2817]: E0113 20:20:20.589383 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp"
Jan 13 20:20:20.590003 kubelet[2817]: E0113 20:20:20.589419 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" podUID="2fb62ec7-4c06-48e1-aa87-7b62ac4da84a"
Jan 13 20:20:20.612891 containerd[1478]: time="2025-01-13T20:20:20.612810964Z" level=error msg="Failed to destroy network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.613216 containerd[1478]: time="2025-01-13T20:20:20.613185605Z" level=error msg="encountered an error cleaning up failed sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.613291 containerd[1478]: time="2025-01-13T20:20:20.613257445Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.613622 kubelet[2817]: E0113 20:20:20.613547 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.613798 kubelet[2817]: E0113 20:20:20.613739 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk"
Jan 13 20:20:20.613798 kubelet[2817]: E0113 20:20:20.613770 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk"
Jan 13 20:20:20.613927 kubelet[2817]: E0113 20:20:20.613901 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" podUID="189c7963-e6bf-46b5-b6d5-9d268e857385"
Jan 13 20:20:20.617798 containerd[1478]: time="2025-01-13T20:20:20.617667977Z" level=error msg="Failed to destroy network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.618227 containerd[1478]: time="2025-01-13T20:20:20.618118658Z" level=error msg="encountered an error cleaning up failed sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.618227 containerd[1478]: time="2025-01-13T20:20:20.618185098Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.618847 kubelet[2817]: E0113 20:20:20.618577 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:20.618847 kubelet[2817]: E0113 20:20:20.618694 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k"
Jan 13 20:20:20.618847 kubelet[2817]: E0113 20:20:20.618744 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k"
Jan 13 20:20:20.618999 kubelet[2817]: E0113 20:20:20.618804 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" podUID="3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4"
Jan 13 20:20:21.208037 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d-shm.mount: Deactivated successfully.
Jan 13 20:20:21.208503 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca-shm.mount: Deactivated successfully.
Jan 13 20:20:21.218917 systemd[1]: Created slice kubepods-besteffort-podd410432e_4da0_436d_8d3b_2586cacab46b.slice - libcontainer container kubepods-besteffort-podd410432e_4da0_436d_8d3b_2586cacab46b.slice.
Jan 13 20:20:21.221800 containerd[1478]: time="2025-01-13T20:20:21.221763044Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:0,}"
Jan 13 20:20:21.298884 containerd[1478]: time="2025-01-13T20:20:21.298667314Z" level=error msg="Failed to destroy network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.301119 containerd[1478]: time="2025-01-13T20:20:21.300982400Z" level=error msg="encountered an error cleaning up failed sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.301119 containerd[1478]: time="2025-01-13T20:20:21.301082760Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.302347 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0-shm.mount: Deactivated successfully.
Jan 13 20:20:21.303473 kubelet[2817]: E0113 20:20:21.302523 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.303473 kubelet[2817]: E0113 20:20:21.302581 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:21.303473 kubelet[2817]: E0113 20:20:21.302613 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:21.303578 kubelet[2817]: E0113 20:20:21.302662 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b"
Jan 13 20:20:21.436757 kubelet[2817]: I0113 20:20:21.436167 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb"
Jan 13 20:20:21.438166 containerd[1478]: time="2025-01-13T20:20:21.437922298Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\""
Jan 13 20:20:21.438637 kubelet[2817]: I0113 20:20:21.438226 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715"
Jan 13 20:20:21.439623 containerd[1478]: time="2025-01-13T20:20:21.439159461Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\""
Jan 13 20:20:21.439623 containerd[1478]: time="2025-01-13T20:20:21.439451742Z" level=info msg="Ensure that sandbox 8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715 in task-service has been cleanup successfully"
Jan 13 20:20:21.439945 containerd[1478]: time="2025-01-13T20:20:21.439827743Z" level=info msg="TearDown network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" successfully"
Jan 13 20:20:21.439945 containerd[1478]: time="2025-01-13T20:20:21.439849983Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" returns successfully"
Jan 13 20:20:21.440312 containerd[1478]: time="2025-01-13T20:20:21.440275504Z" level=info msg="Ensure that sandbox b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb in task-service has been cleanup successfully"
Jan 13 20:20:21.441212 containerd[1478]: time="2025-01-13T20:20:21.441173226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:1,}"
Jan 13 20:20:21.442764 containerd[1478]: time="2025-01-13T20:20:21.442710670Z" level=info msg="TearDown network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" successfully"
Jan 13 20:20:21.442764 containerd[1478]: time="2025-01-13T20:20:21.442752990Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" returns successfully"
Jan 13 20:20:21.443169 systemd[1]: run-netns-cni\x2d067b04c7\x2d7b0d\x2dc394\x2d2465\x2ddf34b4fb3481.mount: Deactivated successfully.
Jan 13 20:20:21.443287 systemd[1]: run-netns-cni\x2d487fdb53\x2d9a98\x2dfd95\x2d7e80\x2da5fde011e9fa.mount: Deactivated successfully.
Jan 13 20:20:21.447993 containerd[1478]: time="2025-01-13T20:20:21.447944723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:1,}"
Jan 13 20:20:21.449342 kubelet[2817]: I0113 20:20:21.448895 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0"
Jan 13 20:20:21.450210 containerd[1478]: time="2025-01-13T20:20:21.449823808Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\""
Jan 13 20:20:21.450210 containerd[1478]: time="2025-01-13T20:20:21.450036488Z" level=info msg="Ensure that sandbox faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0 in task-service has been cleanup successfully"
Jan 13 20:20:21.452838 containerd[1478]: time="2025-01-13T20:20:21.452695615Z" level=info msg="TearDown network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" successfully"
Jan 13 20:20:21.453763 containerd[1478]: time="2025-01-13T20:20:21.453583897Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" returns successfully"
Jan 13 20:20:21.454969 kubelet[2817]: I0113 20:20:21.454823 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a"
Jan 13 20:20:21.457371 containerd[1478]: time="2025-01-13T20:20:21.457035946Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\""
Jan 13 20:20:21.457371 containerd[1478]: time="2025-01-13T20:20:21.457228666Z" level=info msg="Ensure that sandbox 2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a in task-service has been cleanup successfully"
Jan 13 20:20:21.458034 containerd[1478]: time="2025-01-13T20:20:21.457870548Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:1,}"
Jan 13 20:20:21.461791 containerd[1478]: time="2025-01-13T20:20:21.460743195Z" level=info msg="TearDown network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" successfully"
Jan 13 20:20:21.461791 containerd[1478]: time="2025-01-13T20:20:21.461348156Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" returns successfully"
Jan 13 20:20:21.465189 kubelet[2817]: I0113 20:20:21.465036 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d"
Jan 13 20:20:21.466778 containerd[1478]: time="2025-01-13T20:20:21.466311009Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\""
Jan 13 20:20:21.466778 containerd[1478]: time="2025-01-13T20:20:21.466537209Z" level=info msg="Ensure that sandbox 8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d in task-service has been cleanup successfully"
Jan 13 20:20:21.469644 containerd[1478]: time="2025-01-13T20:20:21.468839095Z" level=info msg="TearDown network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" successfully"
Jan 13 20:20:21.469644 containerd[1478]: time="2025-01-13T20:20:21.469199976Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" returns successfully"
Jan 13 20:20:21.469644 containerd[1478]: time="2025-01-13T20:20:21.469065895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:1,}"
Jan 13 20:20:21.469824 kubelet[2817]: I0113 20:20:21.469362 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca"
Jan 13 20:20:21.471747 containerd[1478]: time="2025-01-13T20:20:21.471555902Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\""
Jan 13 20:20:21.471836 containerd[1478]: time="2025-01-13T20:20:21.471759142Z" level=info msg="Ensure that sandbox 03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca in task-service has been cleanup successfully"
Jan 13 20:20:21.473555 containerd[1478]: time="2025-01-13T20:20:21.473307986Z" level=info msg="TearDown network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" successfully"
Jan 13 20:20:21.473555 containerd[1478]: time="2025-01-13T20:20:21.473334506Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" returns successfully"
Jan 13 20:20:21.474834 containerd[1478]: time="2025-01-13T20:20:21.474798790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:1,}"
Jan 13 20:20:21.477680 containerd[1478]: time="2025-01-13T20:20:21.477197315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:1,}"
Jan 13 20:20:21.693778 containerd[1478]: time="2025-01-13T20:20:21.693728051Z" level=error msg="Failed to destroy network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.697500 containerd[1478]: time="2025-01-13T20:20:21.697327860Z" level=error msg="encountered an error cleaning up failed sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.697946 containerd[1478]: time="2025-01-13T20:20:21.697917301Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.698564 kubelet[2817]: E0113 20:20:21.698507 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.699042 kubelet[2817]: E0113 20:20:21.698573 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k"
Jan 13 20:20:21.699042 kubelet[2817]: E0113 20:20:21.698634 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k"
Jan 13 20:20:21.699042 kubelet[2817]: E0113 20:20:21.698680 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" podUID="3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4"
Jan 13 20:20:21.729422 containerd[1478]: time="2025-01-13T20:20:21.729223019Z" level=error msg="Failed to destroy network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.731061 containerd[1478]: time="2025-01-13T20:20:21.730999983Z" level=error msg="encountered an error cleaning up failed sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.731223 containerd[1478]: time="2025-01-13T20:20:21.731095103Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.731648 kubelet[2817]: E0113 20:20:21.731320 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.731648 kubelet[2817]: E0113 20:20:21.731396 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx"
Jan 13 20:20:21.731648 kubelet[2817]: E0113 20:20:21.731419 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx"
Jan 13 20:20:21.731824 containerd[1478]: time="2025-01-13T20:20:21.731510304Z" level=error msg="Failed to destroy network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.731854 kubelet[2817]: E0113 20:20:21.731520 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8n6tx" podUID="acbb6d2d-5611-4557-91bf-b12ca46c13f5"
Jan 13 20:20:21.736905 containerd[1478]: time="2025-01-13T20:20:21.736854878Z" level=error msg="encountered an error cleaning up failed sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.737386 containerd[1478]: time="2025-01-13T20:20:21.737161118Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.737570 kubelet[2817]: E0113 20:20:21.737509 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.738254 kubelet[2817]: E0113 20:20:21.737740 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:21.738254 kubelet[2817]: E0113 20:20:21.737776 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh"
Jan 13 20:20:21.738254 kubelet[2817]: E0113 20:20:21.738037 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b"
Jan 13 20:20:21.744954 containerd[1478]: time="2025-01-13T20:20:21.744904057Z" level=error msg="Failed to destroy network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.747252 containerd[1478]: time="2025-01-13T20:20:21.747126463Z" level=error msg="encountered an error cleaning up failed sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.747559 containerd[1478]: time="2025-01-13T20:20:21.747508904Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.748160 kubelet[2817]: E0113 20:20:21.748037 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.748160 kubelet[2817]: E0113 20:20:21.748107 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk"
Jan 13 20:20:21.748160 kubelet[2817]: E0113 20:20:21.748125 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk"
Jan 13 20:20:21.748386 kubelet[2817]: E0113 20:20:21.748169 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" podUID="189c7963-e6bf-46b5-b6d5-9d268e857385"
Jan 13 20:20:21.773380 containerd[1478]: time="2025-01-13T20:20:21.773330208Z" level=error msg="Failed to destroy network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.774446 containerd[1478]: time="2025-01-13T20:20:21.774124410Z" level=error msg="encountered an error cleaning up failed sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.774446 containerd[1478]: time="2025-01-13T20:20:21.774259130Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.774712 kubelet[2817]: E0113 20:20:21.774514 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 13 20:20:21.774712 kubelet[2817]: E0113 20:20:21.774575 2817
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:21.774712 kubelet[2817]: E0113 20:20:21.774622 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:21.774861 kubelet[2817]: E0113 20:20:21.774670 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-khp7b" podUID="ff1a3779-f857-4921-b0c5-fdad56861f50" Jan 13 20:20:21.782619 containerd[1478]: time="2025-01-13T20:20:21.782285950Z" level=error msg="Failed to destroy network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:21.782752 containerd[1478]: time="2025-01-13T20:20:21.782664671Z" level=error msg="encountered an error cleaning up failed sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:21.782752 containerd[1478]: time="2025-01-13T20:20:21.782733431Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:21.784578 kubelet[2817]: E0113 20:20:21.782969 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:21.784578 kubelet[2817]: E0113 20:20:21.783033 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:21.784578 kubelet[2817]: E0113 20:20:21.783237 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:21.784809 kubelet[2817]: E0113 20:20:21.783556 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" podUID="2fb62ec7-4c06-48e1-aa87-7b62ac4da84a" Jan 13 20:20:22.215392 systemd[1]: run-netns-cni\x2dfe93ae31\x2dae31\x2d2f62\x2d2470\x2d3ec0b75d7cfd.mount: Deactivated successfully. Jan 13 20:20:22.216060 systemd[1]: run-netns-cni\x2ddf2d902d\x2ddcb1\x2dcf5f\x2da6ac\x2d713df74a2efe.mount: Deactivated successfully. Jan 13 20:20:22.216241 systemd[1]: run-netns-cni\x2d11b8b344\x2d3455\x2de12e\x2deaff\x2d10b702d464bd.mount: Deactivated successfully. Jan 13 20:20:22.216568 systemd[1]: run-netns-cni\x2dbb204a65\x2d75e4\x2d0735\x2d5be4\x2da75e1cc18c67.mount: Deactivated successfully. 
Jan 13 20:20:22.474449 kubelet[2817]: I0113 20:20:22.473951 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2" Jan 13 20:20:22.477402 containerd[1478]: time="2025-01-13T20:20:22.476189809Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" Jan 13 20:20:22.477402 containerd[1478]: time="2025-01-13T20:20:22.476580370Z" level=info msg="Ensure that sandbox 3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2 in task-service has been cleanup successfully" Jan 13 20:20:22.478170 kubelet[2817]: I0113 20:20:22.477862 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db" Jan 13 20:20:22.483005 containerd[1478]: time="2025-01-13T20:20:22.482252984Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" Jan 13 20:20:22.483005 containerd[1478]: time="2025-01-13T20:20:22.482523545Z" level=info msg="Ensure that sandbox 25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db in task-service has been cleanup successfully" Jan 13 20:20:22.483024 systemd[1]: run-netns-cni\x2d63080cbe\x2dd898\x2db9af\x2ddc67\x2da87bcb1b36ce.mount: Deactivated successfully. 
Jan 13 20:20:22.483739 kubelet[2817]: I0113 20:20:22.483456 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125" Jan 13 20:20:22.485735 containerd[1478]: time="2025-01-13T20:20:22.483537547Z" level=info msg="TearDown network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" successfully" Jan 13 20:20:22.485735 containerd[1478]: time="2025-01-13T20:20:22.483565147Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" returns successfully" Jan 13 20:20:22.487549 systemd[1]: run-netns-cni\x2dfcd1032b\x2d53d9\x2d96e0\x2dd392\x2d300b0c13b376.mount: Deactivated successfully. Jan 13 20:20:22.489632 containerd[1478]: time="2025-01-13T20:20:22.488443319Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" Jan 13 20:20:22.489632 containerd[1478]: time="2025-01-13T20:20:22.488668200Z" level=info msg="TearDown network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" successfully" Jan 13 20:20:22.489632 containerd[1478]: time="2025-01-13T20:20:22.488684080Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" returns successfully" Jan 13 20:20:22.490312 containerd[1478]: time="2025-01-13T20:20:22.490014123Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" Jan 13 20:20:22.490312 containerd[1478]: time="2025-01-13T20:20:22.490047403Z" level=info msg="TearDown network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" successfully" Jan 13 20:20:22.490312 containerd[1478]: time="2025-01-13T20:20:22.490068683Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" returns successfully" Jan 13 20:20:22.490312 containerd[1478]: 
time="2025-01-13T20:20:22.490207443Z" level=info msg="Ensure that sandbox 43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125 in task-service has been cleanup successfully" Jan 13 20:20:22.490680 containerd[1478]: time="2025-01-13T20:20:22.490653364Z" level=info msg="TearDown network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" successfully" Jan 13 20:20:22.490758 containerd[1478]: time="2025-01-13T20:20:22.490744525Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" returns successfully" Jan 13 20:20:22.494040 containerd[1478]: time="2025-01-13T20:20:22.493582372Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" Jan 13 20:20:22.495118 systemd[1]: run-netns-cni\x2d4e943ad1\x2d1ff8\x2de93b\x2d1de8\x2daa8f0cbcf6a2.mount: Deactivated successfully. Jan 13 20:20:22.497616 containerd[1478]: time="2025-01-13T20:20:22.494252453Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" Jan 13 20:20:22.497616 containerd[1478]: time="2025-01-13T20:20:22.496731739Z" level=info msg="TearDown network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" successfully" Jan 13 20:20:22.497616 containerd[1478]: time="2025-01-13T20:20:22.496742579Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" returns successfully" Jan 13 20:20:22.497616 containerd[1478]: time="2025-01-13T20:20:22.494321973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:2,}" Jan 13 20:20:22.499054 containerd[1478]: time="2025-01-13T20:20:22.498998305Z" level=info msg="TearDown network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" successfully" Jan 13 20:20:22.499054 
containerd[1478]: time="2025-01-13T20:20:22.499033905Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" returns successfully" Jan 13 20:20:22.501683 containerd[1478]: time="2025-01-13T20:20:22.501626831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:2,}" Jan 13 20:20:22.503091 containerd[1478]: time="2025-01-13T20:20:22.502751474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:2,}" Jan 13 20:20:22.503553 kubelet[2817]: I0113 20:20:22.503521 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0" Jan 13 20:20:22.507376 containerd[1478]: time="2025-01-13T20:20:22.507023684Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" Jan 13 20:20:22.508971 containerd[1478]: time="2025-01-13T20:20:22.508918729Z" level=info msg="Ensure that sandbox e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0 in task-service has been cleanup successfully" Jan 13 20:20:22.510069 kubelet[2817]: I0113 20:20:22.510043 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1" Jan 13 20:20:22.511353 containerd[1478]: time="2025-01-13T20:20:22.511093934Z" level=info msg="TearDown network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" successfully" Jan 13 20:20:22.511353 containerd[1478]: time="2025-01-13T20:20:22.511120214Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" returns successfully" Jan 13 20:20:22.512443 containerd[1478]: time="2025-01-13T20:20:22.512384537Z" 
level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" Jan 13 20:20:22.512772 containerd[1478]: time="2025-01-13T20:20:22.512741978Z" level=info msg="TearDown network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" successfully" Jan 13 20:20:22.512772 containerd[1478]: time="2025-01-13T20:20:22.512767938Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" returns successfully" Jan 13 20:20:22.512862 containerd[1478]: time="2025-01-13T20:20:22.512393537Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" Jan 13 20:20:22.513222 containerd[1478]: time="2025-01-13T20:20:22.513191219Z" level=info msg="Ensure that sandbox c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1 in task-service has been cleanup successfully" Jan 13 20:20:22.515085 containerd[1478]: time="2025-01-13T20:20:22.515040424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:2,}" Jan 13 20:20:22.516297 containerd[1478]: time="2025-01-13T20:20:22.516238787Z" level=info msg="TearDown network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" successfully" Jan 13 20:20:22.516297 containerd[1478]: time="2025-01-13T20:20:22.516268107Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" returns successfully" Jan 13 20:20:22.517821 containerd[1478]: time="2025-01-13T20:20:22.517780031Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" Jan 13 20:20:22.518363 containerd[1478]: time="2025-01-13T20:20:22.517914631Z" level=info msg="TearDown network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" successfully" Jan 13 20:20:22.518363 
containerd[1478]: time="2025-01-13T20:20:22.517929751Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" returns successfully" Jan 13 20:20:22.519311 containerd[1478]: time="2025-01-13T20:20:22.519158554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:20:22.521261 kubelet[2817]: I0113 20:20:22.521188 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587" Jan 13 20:20:22.525385 containerd[1478]: time="2025-01-13T20:20:22.525182889Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" Jan 13 20:20:22.525752 containerd[1478]: time="2025-01-13T20:20:22.525537449Z" level=info msg="Ensure that sandbox bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587 in task-service has been cleanup successfully" Jan 13 20:20:22.526579 containerd[1478]: time="2025-01-13T20:20:22.526087331Z" level=info msg="TearDown network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" successfully" Jan 13 20:20:22.526579 containerd[1478]: time="2025-01-13T20:20:22.526108211Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" returns successfully" Jan 13 20:20:22.528856 containerd[1478]: time="2025-01-13T20:20:22.528424896Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" Jan 13 20:20:22.529257 containerd[1478]: time="2025-01-13T20:20:22.529160458Z" level=info msg="TearDown network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" successfully" Jan 13 20:20:22.529257 containerd[1478]: time="2025-01-13T20:20:22.529190098Z" level=info msg="StopPodSandbox for 
\"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" returns successfully" Jan 13 20:20:22.531898 containerd[1478]: time="2025-01-13T20:20:22.531756545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:2,}" Jan 13 20:20:22.818221 containerd[1478]: time="2025-01-13T20:20:22.816304158Z" level=error msg="Failed to destroy network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.821002 containerd[1478]: time="2025-01-13T20:20:22.820956450Z" level=error msg="encountered an error cleaning up failed sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.821531 containerd[1478]: time="2025-01-13T20:20:22.821457171Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.822519 kubelet[2817]: E0113 20:20:22.822176 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.822519 kubelet[2817]: E0113 20:20:22.822241 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:22.822519 kubelet[2817]: E0113 20:20:22.822262 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:22.822964 kubelet[2817]: E0113 20:20:22.822306 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" podUID="2fb62ec7-4c06-48e1-aa87-7b62ac4da84a" Jan 13 20:20:22.842108 containerd[1478]: time="2025-01-13T20:20:22.841872381Z" level=error msg="Failed to destroy network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.844356 containerd[1478]: time="2025-01-13T20:20:22.844292427Z" level=error msg="encountered an error cleaning up failed sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.844518 containerd[1478]: time="2025-01-13T20:20:22.844386307Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.845057 kubelet[2817]: E0113 20:20:22.844927 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.845057 
kubelet[2817]: E0113 20:20:22.844994 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:22.845057 kubelet[2817]: E0113 20:20:22.845017 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:22.845186 kubelet[2817]: E0113 20:20:22.845061 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" podUID="3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4" Jan 13 20:20:22.853399 containerd[1478]: time="2025-01-13T20:20:22.852571487Z" level=error msg="Failed to destroy network for sandbox 
\"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.853399 containerd[1478]: time="2025-01-13T20:20:22.852852247Z" level=error msg="Failed to destroy network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.853399 containerd[1478]: time="2025-01-13T20:20:22.853017768Z" level=error msg="encountered an error cleaning up failed sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.853399 containerd[1478]: time="2025-01-13T20:20:22.853087608Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.853399 containerd[1478]: time="2025-01-13T20:20:22.853281768Z" level=error msg="encountered an error cleaning up failed sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.853399 containerd[1478]: time="2025-01-13T20:20:22.853321089Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.854017 kubelet[2817]: E0113 20:20:22.853950 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.854146 kubelet[2817]: E0113 20:20:22.854026 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh" Jan 13 20:20:22.854146 kubelet[2817]: E0113 20:20:22.854046 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh" Jan 13 20:20:22.854146 kubelet[2817]: E0113 20:20:22.854100 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" Jan 13 20:20:22.854277 kubelet[2817]: E0113 20:20:22.853949 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.854277 kubelet[2817]: E0113 20:20:22.854162 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:22.854277 kubelet[2817]: E0113 20:20:22.854175 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:22.854374 kubelet[2817]: E0113 20:20:22.854194 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8n6tx" podUID="acbb6d2d-5611-4557-91bf-b12ca46c13f5" Jan 13 20:20:22.856533 containerd[1478]: time="2025-01-13T20:20:22.855545134Z" level=error msg="Failed to destroy network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.856533 containerd[1478]: time="2025-01-13T20:20:22.855935055Z" level=error msg="encountered an error cleaning up failed sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.856533 containerd[1478]: time="2025-01-13T20:20:22.856029175Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.857007 kubelet[2817]: E0113 20:20:22.856272 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.857007 kubelet[2817]: E0113 20:20:22.856329 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:22.857007 kubelet[2817]: E0113 20:20:22.856349 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:22.857106 kubelet[2817]: E0113 20:20:22.856391 2817 pod_workers.go:1298] "Error 
syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-khp7b" podUID="ff1a3779-f857-4921-b0c5-fdad56861f50" Jan 13 20:20:22.864523 containerd[1478]: time="2025-01-13T20:20:22.864400076Z" level=error msg="Failed to destroy network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.865061 containerd[1478]: time="2025-01-13T20:20:22.865017877Z" level=error msg="encountered an error cleaning up failed sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.865132 containerd[1478]: time="2025-01-13T20:20:22.865088437Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.865703 kubelet[2817]: E0113 20:20:22.865328 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:22.865703 kubelet[2817]: E0113 20:20:22.865386 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:22.865703 kubelet[2817]: E0113 20:20:22.865407 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:22.865828 kubelet[2817]: E0113 20:20:22.865473 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\\\": rpc error: code = Unknown 
desc = failed to setup network for sandbox \\\"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" podUID="189c7963-e6bf-46b5-b6d5-9d268e857385" Jan 13 20:20:23.212362 systemd[1]: run-netns-cni\x2d2524d258\x2d7812\x2dc8fd\x2d64c9\x2d643a4bf6c4ad.mount: Deactivated successfully. Jan 13 20:20:23.212505 systemd[1]: run-netns-cni\x2d0be89bfe\x2dc619\x2d1bc8\x2d6f5c\x2d707410f98cf0.mount: Deactivated successfully. Jan 13 20:20:23.212557 systemd[1]: run-netns-cni\x2dab0f536b\x2de551\x2d3cc5\x2d929c\x2d2646336eea5a.mount: Deactivated successfully. Jan 13 20:20:23.531477 kubelet[2817]: I0113 20:20:23.531317 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4" Jan 13 20:20:23.537638 containerd[1478]: time="2025-01-13T20:20:23.534247411Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\"" Jan 13 20:20:23.537638 containerd[1478]: time="2025-01-13T20:20:23.534474131Z" level=info msg="Ensure that sandbox d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4 in task-service has been cleanup successfully" Jan 13 20:20:23.537638 containerd[1478]: time="2025-01-13T20:20:23.535087413Z" level=info msg="TearDown network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" successfully" Jan 13 20:20:23.537638 containerd[1478]: time="2025-01-13T20:20:23.535108293Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" returns successfully" Jan 13 20:20:23.539531 containerd[1478]: time="2025-01-13T20:20:23.538618021Z" level=info msg="StopPodSandbox for 
\"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" Jan 13 20:20:23.539531 containerd[1478]: time="2025-01-13T20:20:23.538764702Z" level=info msg="TearDown network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" successfully" Jan 13 20:20:23.539531 containerd[1478]: time="2025-01-13T20:20:23.538777542Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" returns successfully" Jan 13 20:20:23.539182 systemd[1]: run-netns-cni\x2dddaf9cb2\x2d4f54\x2d0a80\x2d90fa\x2da50249f0d578.mount: Deactivated successfully. Jan 13 20:20:23.541744 containerd[1478]: time="2025-01-13T20:20:23.541009987Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" Jan 13 20:20:23.541744 containerd[1478]: time="2025-01-13T20:20:23.541173227Z" level=info msg="TearDown network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" successfully" Jan 13 20:20:23.541744 containerd[1478]: time="2025-01-13T20:20:23.541187227Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" returns successfully" Jan 13 20:20:23.543161 containerd[1478]: time="2025-01-13T20:20:23.543116232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:20:23.568354 kubelet[2817]: I0113 20:20:23.566307 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c" Jan 13 20:20:23.578335 containerd[1478]: time="2025-01-13T20:20:23.576962953Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\"" Jan 13 20:20:23.578335 containerd[1478]: time="2025-01-13T20:20:23.577170394Z" level=info msg="Ensure that sandbox 
4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c in task-service has been cleanup successfully" Jan 13 20:20:23.582096 systemd[1]: run-netns-cni\x2d8f9d1896\x2dcac8\x2d3626\x2dbd34\x2d62cc12023f86.mount: Deactivated successfully. Jan 13 20:20:23.585177 containerd[1478]: time="2025-01-13T20:20:23.585133213Z" level=info msg="TearDown network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" successfully" Jan 13 20:20:23.585721 containerd[1478]: time="2025-01-13T20:20:23.585361574Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" returns successfully" Jan 13 20:20:23.586561 containerd[1478]: time="2025-01-13T20:20:23.586516976Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" Jan 13 20:20:23.588241 containerd[1478]: time="2025-01-13T20:20:23.588208220Z" level=info msg="TearDown network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" successfully" Jan 13 20:20:23.588678 containerd[1478]: time="2025-01-13T20:20:23.588571701Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" returns successfully" Jan 13 20:20:23.592041 containerd[1478]: time="2025-01-13T20:20:23.591735029Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" Jan 13 20:20:23.592041 containerd[1478]: time="2025-01-13T20:20:23.591844829Z" level=info msg="TearDown network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" successfully" Jan 13 20:20:23.592041 containerd[1478]: time="2025-01-13T20:20:23.591855429Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" returns successfully" Jan 13 20:20:23.592253 kubelet[2817]: I0113 20:20:23.592191 2817 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e" Jan 13 20:20:23.595489 containerd[1478]: time="2025-01-13T20:20:23.595438878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:3,}" Jan 13 20:20:23.596546 containerd[1478]: time="2025-01-13T20:20:23.595985399Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\"" Jan 13 20:20:23.596546 containerd[1478]: time="2025-01-13T20:20:23.596301880Z" level=info msg="Ensure that sandbox f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e in task-service has been cleanup successfully" Jan 13 20:20:23.599625 containerd[1478]: time="2025-01-13T20:20:23.598963206Z" level=info msg="TearDown network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" successfully" Jan 13 20:20:23.599824 containerd[1478]: time="2025-01-13T20:20:23.599793048Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" returns successfully" Jan 13 20:20:23.603515 containerd[1478]: time="2025-01-13T20:20:23.603447497Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" Jan 13 20:20:23.606695 kubelet[2817]: I0113 20:20:23.606519 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f" Jan 13 20:20:23.606974 containerd[1478]: time="2025-01-13T20:20:23.606389744Z" level=info msg="TearDown network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" successfully" Jan 13 20:20:23.606974 containerd[1478]: time="2025-01-13T20:20:23.606896305Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" returns successfully" Jan 13 20:20:23.612859 
containerd[1478]: time="2025-01-13T20:20:23.612739439Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" Jan 13 20:20:23.613546 containerd[1478]: time="2025-01-13T20:20:23.613240841Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\"" Jan 13 20:20:23.614506 containerd[1478]: time="2025-01-13T20:20:23.614475844Z" level=info msg="Ensure that sandbox ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f in task-service has been cleanup successfully" Jan 13 20:20:23.615555 containerd[1478]: time="2025-01-13T20:20:23.613318881Z" level=info msg="TearDown network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" successfully" Jan 13 20:20:23.615873 containerd[1478]: time="2025-01-13T20:20:23.615819007Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" returns successfully" Jan 13 20:20:23.617414 containerd[1478]: time="2025-01-13T20:20:23.617352770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:3,}" Jan 13 20:20:23.619564 containerd[1478]: time="2025-01-13T20:20:23.619482536Z" level=info msg="TearDown network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" successfully" Jan 13 20:20:23.620474 containerd[1478]: time="2025-01-13T20:20:23.620373538Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" returns successfully" Jan 13 20:20:23.621571 kubelet[2817]: I0113 20:20:23.621494 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24" Jan 13 20:20:23.623563 containerd[1478]: time="2025-01-13T20:20:23.623520665Z" level=info msg="StopPodSandbox for 
\"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" Jan 13 20:20:23.626621 containerd[1478]: time="2025-01-13T20:20:23.624476308Z" level=info msg="TearDown network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" successfully" Jan 13 20:20:23.626621 containerd[1478]: time="2025-01-13T20:20:23.624501588Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" returns successfully" Jan 13 20:20:23.626621 containerd[1478]: time="2025-01-13T20:20:23.624957349Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\"" Jan 13 20:20:23.626621 containerd[1478]: time="2025-01-13T20:20:23.625207629Z" level=info msg="Ensure that sandbox e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24 in task-service has been cleanup successfully" Jan 13 20:20:23.627336 containerd[1478]: time="2025-01-13T20:20:23.627305114Z" level=info msg="TearDown network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" successfully" Jan 13 20:20:23.628462 containerd[1478]: time="2025-01-13T20:20:23.628356197Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" returns successfully" Jan 13 20:20:23.628915 containerd[1478]: time="2025-01-13T20:20:23.627909916Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" Jan 13 20:20:23.629736 containerd[1478]: time="2025-01-13T20:20:23.629582800Z" level=info msg="TearDown network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" successfully" Jan 13 20:20:23.629881 containerd[1478]: time="2025-01-13T20:20:23.629861361Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" returns successfully" Jan 13 20:20:23.631406 containerd[1478]: time="2025-01-13T20:20:23.631366844Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:3,}" Jan 13 20:20:23.633355 containerd[1478]: time="2025-01-13T20:20:23.633124008Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" Jan 13 20:20:23.633355 containerd[1478]: time="2025-01-13T20:20:23.633230689Z" level=info msg="TearDown network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" successfully" Jan 13 20:20:23.633355 containerd[1478]: time="2025-01-13T20:20:23.633240569Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" returns successfully" Jan 13 20:20:23.635002 containerd[1478]: time="2025-01-13T20:20:23.634719852Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" Jan 13 20:20:23.635002 containerd[1478]: time="2025-01-13T20:20:23.634908093Z" level=info msg="TearDown network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" successfully" Jan 13 20:20:23.635002 containerd[1478]: time="2025-01-13T20:20:23.634923613Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" returns successfully" Jan 13 20:20:23.636026 kubelet[2817]: I0113 20:20:23.635929 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47" Jan 13 20:20:23.637677 containerd[1478]: time="2025-01-13T20:20:23.637261498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:3,}" Jan 13 20:20:23.639390 containerd[1478]: time="2025-01-13T20:20:23.639019463Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\"" Jan 13 
20:20:23.639390 containerd[1478]: time="2025-01-13T20:20:23.639236343Z" level=info msg="Ensure that sandbox 1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47 in task-service has been cleanup successfully" Jan 13 20:20:23.644194 containerd[1478]: time="2025-01-13T20:20:23.644126635Z" level=info msg="TearDown network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" successfully" Jan 13 20:20:23.644194 containerd[1478]: time="2025-01-13T20:20:23.644170635Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" returns successfully" Jan 13 20:20:23.646151 containerd[1478]: time="2025-01-13T20:20:23.645968439Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" Jan 13 20:20:23.646151 containerd[1478]: time="2025-01-13T20:20:23.646087880Z" level=info msg="TearDown network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" successfully" Jan 13 20:20:23.646151 containerd[1478]: time="2025-01-13T20:20:23.646100400Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" returns successfully" Jan 13 20:20:23.649257 containerd[1478]: time="2025-01-13T20:20:23.647589723Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" Jan 13 20:20:23.649257 containerd[1478]: time="2025-01-13T20:20:23.648716606Z" level=info msg="TearDown network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" successfully" Jan 13 20:20:23.649257 containerd[1478]: time="2025-01-13T20:20:23.648731806Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" returns successfully" Jan 13 20:20:23.651092 containerd[1478]: time="2025-01-13T20:20:23.651041491Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:3,}" Jan 13 20:20:23.691377 containerd[1478]: time="2025-01-13T20:20:23.691308588Z" level=error msg="Failed to destroy network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.695094 containerd[1478]: time="2025-01-13T20:20:23.695037797Z" level=error msg="encountered an error cleaning up failed sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.695395 containerd[1478]: time="2025-01-13T20:20:23.695363398Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.696963 kubelet[2817]: E0113 20:20:23.696900 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 
20:20:23.697135 kubelet[2817]: E0113 20:20:23.697103 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:23.697261 kubelet[2817]: E0113 20:20:23.697135 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:23.697340 kubelet[2817]: E0113 20:20:23.697295 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" podUID="3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4" Jan 13 20:20:23.915953 containerd[1478]: time="2025-01-13T20:20:23.915812568Z" level=error msg="Failed to destroy network for sandbox 
\"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.917977 containerd[1478]: time="2025-01-13T20:20:23.915862688Z" level=error msg="Failed to destroy network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.919205 containerd[1478]: time="2025-01-13T20:20:23.919075896Z" level=error msg="encountered an error cleaning up failed sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.919287 containerd[1478]: time="2025-01-13T20:20:23.919256296Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.921270 kubelet[2817]: E0113 20:20:23.921062 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.921270 kubelet[2817]: E0113 20:20:23.921245 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:23.921679 kubelet[2817]: E0113 20:20:23.921270 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:23.921679 kubelet[2817]: E0113 20:20:23.921370 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-khp7b" podUID="ff1a3779-f857-4921-b0c5-fdad56861f50" Jan 13 20:20:23.925985 containerd[1478]: time="2025-01-13T20:20:23.925648592Z" level=error msg="encountered an error cleaning 
up failed sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.925985 containerd[1478]: time="2025-01-13T20:20:23.925748792Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.926349 kubelet[2817]: E0113 20:20:23.926128 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.926349 kubelet[2817]: E0113 20:20:23.926192 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:23.926349 kubelet[2817]: E0113 20:20:23.926211 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:23.926494 kubelet[2817]: E0113 20:20:23.926251 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8n6tx" podUID="acbb6d2d-5611-4557-91bf-b12ca46c13f5" Jan 13 20:20:23.929874 containerd[1478]: time="2025-01-13T20:20:23.929082360Z" level=error msg="Failed to destroy network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.929874 containerd[1478]: time="2025-01-13T20:20:23.929815922Z" level=error msg="encountered an error cleaning up failed sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.932773 containerd[1478]: 
time="2025-01-13T20:20:23.930883124Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.932929 kubelet[2817]: E0113 20:20:23.931153 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.932929 kubelet[2817]: E0113 20:20:23.931205 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:23.932929 kubelet[2817]: E0113 20:20:23.931224 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 
20:20:23.933051 kubelet[2817]: E0113 20:20:23.931310 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" podUID="2fb62ec7-4c06-48e1-aa87-7b62ac4da84a" Jan 13 20:20:23.953164 containerd[1478]: time="2025-01-13T20:20:23.953103138Z" level=error msg="Failed to destroy network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.954345 containerd[1478]: time="2025-01-13T20:20:23.954296061Z" level=error msg="encountered an error cleaning up failed sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.954494 containerd[1478]: time="2025-01-13T20:20:23.954382221Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox 
\"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.954778 kubelet[2817]: E0113 20:20:23.954736 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.954849 kubelet[2817]: E0113 20:20:23.954798 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh" Jan 13 20:20:23.954849 kubelet[2817]: E0113 20:20:23.954822 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh" Jan 13 20:20:23.954894 kubelet[2817]: E0113 20:20:23.954863 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" Jan 13 20:20:23.956561 containerd[1478]: time="2025-01-13T20:20:23.956522226Z" level=error msg="Failed to destroy network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.957865 containerd[1478]: time="2025-01-13T20:20:23.957385428Z" level=error msg="encountered an error cleaning up failed sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.958295 containerd[1478]: time="2025-01-13T20:20:23.958255190Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.959130 kubelet[2817]: E0113 20:20:23.959078 2817 remote_runtime.go:193] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:23.959200 kubelet[2817]: E0113 20:20:23.959166 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:23.959200 kubelet[2817]: E0113 20:20:23.959189 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:23.959543 kubelet[2817]: E0113 20:20:23.959290 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" podUID="189c7963-e6bf-46b5-b6d5-9d268e857385" Jan 13 20:20:24.213760 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30-shm.mount: Deactivated successfully. Jan 13 20:20:24.213870 systemd[1]: run-netns-cni\x2d611c9af2\x2dd4fa\x2d4ba2\x2ddefb\x2dc2c987065d04.mount: Deactivated successfully. Jan 13 20:20:24.213918 systemd[1]: run-netns-cni\x2dc0c5bf42\x2d7ad7\x2d0524\x2d005e\x2d6248992d80a4.mount: Deactivated successfully. Jan 13 20:20:24.213964 systemd[1]: run-netns-cni\x2d5b5e484b\x2dd10e\x2db968\x2de386\x2dbeca4c30771b.mount: Deactivated successfully. Jan 13 20:20:24.214010 systemd[1]: run-netns-cni\x2dec1aed28\x2d1e3b\x2dc5a2\x2de78e\x2dad6d8f71831c.mount: Deactivated successfully. Jan 13 20:20:24.656008 kubelet[2817]: I0113 20:20:24.655794 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9" Jan 13 20:20:24.658300 containerd[1478]: time="2025-01-13T20:20:24.658249652Z" level=info msg="StopPodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\"" Jan 13 20:20:24.661614 containerd[1478]: time="2025-01-13T20:20:24.661545260Z" level=info msg="Ensure that sandbox 0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9 in task-service has been cleanup successfully" Jan 13 20:20:24.666966 containerd[1478]: time="2025-01-13T20:20:24.666896752Z" level=info msg="TearDown network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" successfully" Jan 13 20:20:24.669402 containerd[1478]: time="2025-01-13T20:20:24.667491314Z" level=info msg="StopPodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" returns successfully" Jan 13 20:20:24.671841 
containerd[1478]: time="2025-01-13T20:20:24.670945882Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\"" Jan 13 20:20:24.672700 systemd[1]: run-netns-cni\x2de40e1f9d\x2dec19\x2d6d9e\x2df490\x2d724b2a97ae54.mount: Deactivated successfully. Jan 13 20:20:24.675495 containerd[1478]: time="2025-01-13T20:20:24.673735128Z" level=info msg="TearDown network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" successfully" Jan 13 20:20:24.675495 containerd[1478]: time="2025-01-13T20:20:24.673786969Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" returns successfully" Jan 13 20:20:24.677188 containerd[1478]: time="2025-01-13T20:20:24.676856336Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" Jan 13 20:20:24.677188 containerd[1478]: time="2025-01-13T20:20:24.677057256Z" level=info msg="TearDown network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" successfully" Jan 13 20:20:24.677188 containerd[1478]: time="2025-01-13T20:20:24.677072056Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" returns successfully" Jan 13 20:20:24.678127 containerd[1478]: time="2025-01-13T20:20:24.677887178Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" Jan 13 20:20:24.678127 containerd[1478]: time="2025-01-13T20:20:24.678055179Z" level=info msg="TearDown network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" successfully" Jan 13 20:20:24.678127 containerd[1478]: time="2025-01-13T20:20:24.678066899Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" returns successfully" Jan 13 20:20:24.679290 containerd[1478]: time="2025-01-13T20:20:24.679248582Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:4,}" Jan 13 20:20:24.684292 kubelet[2817]: I0113 20:20:24.684255 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64" Jan 13 20:20:24.687483 containerd[1478]: time="2025-01-13T20:20:24.687048920Z" level=info msg="StopPodSandbox for \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\"" Jan 13 20:20:24.687483 containerd[1478]: time="2025-01-13T20:20:24.687248561Z" level=info msg="Ensure that sandbox 5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64 in task-service has been cleanup successfully" Jan 13 20:20:24.687818 containerd[1478]: time="2025-01-13T20:20:24.687794122Z" level=info msg="TearDown network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" successfully" Jan 13 20:20:24.687938 containerd[1478]: time="2025-01-13T20:20:24.687890562Z" level=info msg="StopPodSandbox for \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" returns successfully" Jan 13 20:20:24.692064 containerd[1478]: time="2025-01-13T20:20:24.692022012Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\"" Jan 13 20:20:24.692113 systemd[1]: run-netns-cni\x2dd5eb63ce\x2dbbf5\x2db477\x2ded26\x2d83d351ea7528.mount: Deactivated successfully. 
Jan 13 20:20:24.694161 containerd[1478]: time="2025-01-13T20:20:24.692847094Z" level=info msg="TearDown network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" successfully" Jan 13 20:20:24.694161 containerd[1478]: time="2025-01-13T20:20:24.692875374Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" returns successfully" Jan 13 20:20:24.694817 containerd[1478]: time="2025-01-13T20:20:24.694505498Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" Jan 13 20:20:24.694817 containerd[1478]: time="2025-01-13T20:20:24.694653098Z" level=info msg="TearDown network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" successfully" Jan 13 20:20:24.694817 containerd[1478]: time="2025-01-13T20:20:24.694666178Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" returns successfully" Jan 13 20:20:24.695997 containerd[1478]: time="2025-01-13T20:20:24.695965981Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" Jan 13 20:20:24.696830 containerd[1478]: time="2025-01-13T20:20:24.696334782Z" level=info msg="TearDown network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" successfully" Jan 13 20:20:24.696830 containerd[1478]: time="2025-01-13T20:20:24.696362662Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" returns successfully" Jan 13 20:20:24.698207 containerd[1478]: time="2025-01-13T20:20:24.697650105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:4,}" Jan 13 20:20:24.699201 kubelet[2817]: I0113 20:20:24.699163 2817 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c" Jan 13 20:20:24.702733 containerd[1478]: time="2025-01-13T20:20:24.702674117Z" level=info msg="StopPodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\"" Jan 13 20:20:24.704210 containerd[1478]: time="2025-01-13T20:20:24.704159161Z" level=info msg="Ensure that sandbox 5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c in task-service has been cleanup successfully" Jan 13 20:20:24.707625 containerd[1478]: time="2025-01-13T20:20:24.705188843Z" level=info msg="TearDown network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" successfully" Jan 13 20:20:24.707625 containerd[1478]: time="2025-01-13T20:20:24.705241643Z" level=info msg="StopPodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" returns successfully" Jan 13 20:20:24.709055 containerd[1478]: time="2025-01-13T20:20:24.708986412Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\"" Jan 13 20:20:24.709518 containerd[1478]: time="2025-01-13T20:20:24.709386573Z" level=info msg="TearDown network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" successfully" Jan 13 20:20:24.709696 containerd[1478]: time="2025-01-13T20:20:24.709676854Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" returns successfully" Jan 13 20:20:24.710776 systemd[1]: run-netns-cni\x2dbc590a5b\x2d1621\x2dd88a\x2d58bf\x2d6687fffbda07.mount: Deactivated successfully. 
Jan 13 20:20:24.711985 containerd[1478]: time="2025-01-13T20:20:24.711688699Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" Jan 13 20:20:24.711985 containerd[1478]: time="2025-01-13T20:20:24.711795499Z" level=info msg="TearDown network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" successfully" Jan 13 20:20:24.711985 containerd[1478]: time="2025-01-13T20:20:24.711805339Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" returns successfully" Jan 13 20:20:24.714897 containerd[1478]: time="2025-01-13T20:20:24.714859546Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" Jan 13 20:20:24.715943 kubelet[2817]: I0113 20:20:24.715849 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30" Jan 13 20:20:24.716734 containerd[1478]: time="2025-01-13T20:20:24.716332990Z" level=info msg="TearDown network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" successfully" Jan 13 20:20:24.716734 containerd[1478]: time="2025-01-13T20:20:24.716417710Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" returns successfully" Jan 13 20:20:24.717487 containerd[1478]: time="2025-01-13T20:20:24.717400272Z" level=info msg="StopPodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\"" Jan 13 20:20:24.718269 containerd[1478]: time="2025-01-13T20:20:24.718077594Z" level=info msg="Ensure that sandbox 316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30 in task-service has been cleanup successfully" Jan 13 20:20:24.723457 systemd[1]: run-netns-cni\x2d1ef3f15a\x2d1218\x2d8a6b\x2da966\x2da983b2d4f803.mount: Deactivated successfully. 
Jan 13 20:20:24.724208 containerd[1478]: time="2025-01-13T20:20:24.723806807Z" level=info msg="TearDown network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" successfully" Jan 13 20:20:24.724208 containerd[1478]: time="2025-01-13T20:20:24.723845927Z" level=info msg="StopPodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" returns successfully" Jan 13 20:20:24.724208 containerd[1478]: time="2025-01-13T20:20:24.723994088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:20:24.726845 containerd[1478]: time="2025-01-13T20:20:24.726680574Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\"" Jan 13 20:20:24.726845 containerd[1478]: time="2025-01-13T20:20:24.726780094Z" level=info msg="TearDown network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" successfully" Jan 13 20:20:24.726845 containerd[1478]: time="2025-01-13T20:20:24.726789494Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" returns successfully" Jan 13 20:20:24.727911 containerd[1478]: time="2025-01-13T20:20:24.727873697Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" Jan 13 20:20:24.729242 containerd[1478]: time="2025-01-13T20:20:24.728946979Z" level=info msg="TearDown network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" successfully" Jan 13 20:20:24.729242 containerd[1478]: time="2025-01-13T20:20:24.728975860Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" returns successfully" Jan 13 20:20:24.730171 kubelet[2817]: I0113 20:20:24.730016 2817 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8" Jan 13 20:20:24.733231 containerd[1478]: time="2025-01-13T20:20:24.732284027Z" level=info msg="StopPodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\"" Jan 13 20:20:24.734417 containerd[1478]: time="2025-01-13T20:20:24.733753591Z" level=info msg="Ensure that sandbox 3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8 in task-service has been cleanup successfully" Jan 13 20:20:24.734934 containerd[1478]: time="2025-01-13T20:20:24.734827633Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" Jan 13 20:20:24.735307 containerd[1478]: time="2025-01-13T20:20:24.735026634Z" level=info msg="TearDown network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" successfully" Jan 13 20:20:24.735307 containerd[1478]: time="2025-01-13T20:20:24.735044634Z" level=info msg="StopPodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" returns successfully" Jan 13 20:20:24.735797 containerd[1478]: time="2025-01-13T20:20:24.735742356Z" level=info msg="TearDown network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" successfully" Jan 13 20:20:24.736127 containerd[1478]: time="2025-01-13T20:20:24.736084836Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" returns successfully" Jan 13 20:20:24.737129 containerd[1478]: time="2025-01-13T20:20:24.737009439Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\"" Jan 13 20:20:24.737623 containerd[1478]: time="2025-01-13T20:20:24.737418160Z" level=info msg="TearDown network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" successfully" Jan 13 20:20:24.737788 containerd[1478]: time="2025-01-13T20:20:24.737584760Z" level=info msg="StopPodSandbox 
for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" returns successfully" Jan 13 20:20:24.738554 containerd[1478]: time="2025-01-13T20:20:24.738163321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:4,}" Jan 13 20:20:24.739970 containerd[1478]: time="2025-01-13T20:20:24.739935726Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" Jan 13 20:20:24.740070 containerd[1478]: time="2025-01-13T20:20:24.740054446Z" level=info msg="TearDown network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" successfully" Jan 13 20:20:24.740070 containerd[1478]: time="2025-01-13T20:20:24.740065886Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" returns successfully" Jan 13 20:20:24.742920 containerd[1478]: time="2025-01-13T20:20:24.742730852Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" Jan 13 20:20:24.742920 containerd[1478]: time="2025-01-13T20:20:24.742853092Z" level=info msg="TearDown network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" successfully" Jan 13 20:20:24.742920 containerd[1478]: time="2025-01-13T20:20:24.742863172Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" returns successfully" Jan 13 20:20:24.744642 containerd[1478]: time="2025-01-13T20:20:24.744280816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:4,}" Jan 13 20:20:24.747410 kubelet[2817]: I0113 20:20:24.745532 2817 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321" Jan 13 20:20:24.749068 containerd[1478]: time="2025-01-13T20:20:24.748553186Z" level=info msg="StopPodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\"" Jan 13 20:20:24.749068 containerd[1478]: time="2025-01-13T20:20:24.748757706Z" level=info msg="Ensure that sandbox b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321 in task-service has been cleanup successfully" Jan 13 20:20:24.749651 containerd[1478]: time="2025-01-13T20:20:24.749605308Z" level=info msg="TearDown network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" successfully" Jan 13 20:20:24.749773 containerd[1478]: time="2025-01-13T20:20:24.749756509Z" level=info msg="StopPodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" returns successfully" Jan 13 20:20:24.751507 containerd[1478]: time="2025-01-13T20:20:24.751449393Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\"" Jan 13 20:20:24.751672 containerd[1478]: time="2025-01-13T20:20:24.751636793Z" level=info msg="TearDown network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" successfully" Jan 13 20:20:24.751672 containerd[1478]: time="2025-01-13T20:20:24.751668233Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" returns successfully" Jan 13 20:20:24.753521 containerd[1478]: time="2025-01-13T20:20:24.753485078Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" Jan 13 20:20:24.753903 containerd[1478]: time="2025-01-13T20:20:24.753882079Z" level=info msg="TearDown network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" successfully" Jan 13 20:20:24.754101 containerd[1478]: time="2025-01-13T20:20:24.754086119Z" level=info msg="StopPodSandbox 
for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" returns successfully" Jan 13 20:20:24.755236 containerd[1478]: time="2025-01-13T20:20:24.755178962Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" Jan 13 20:20:24.755874 containerd[1478]: time="2025-01-13T20:20:24.755320282Z" level=info msg="TearDown network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" successfully" Jan 13 20:20:24.755874 containerd[1478]: time="2025-01-13T20:20:24.755332762Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" returns successfully" Jan 13 20:20:24.760246 containerd[1478]: time="2025-01-13T20:20:24.759933293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:4,}" Jan 13 20:20:24.924083 containerd[1478]: time="2025-01-13T20:20:24.923942682Z" level=error msg="Failed to destroy network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:24.926626 containerd[1478]: time="2025-01-13T20:20:24.925822566Z" level=error msg="encountered an error cleaning up failed sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:24.928896 containerd[1478]: time="2025-01-13T20:20:24.928451493Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:24.933016 kubelet[2817]: E0113 20:20:24.931878 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:24.933016 kubelet[2817]: E0113 20:20:24.931936 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:24.933016 kubelet[2817]: E0113 20:20:24.931958 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:24.933484 kubelet[2817]: E0113 20:20:24.932013 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-khp7b" podUID="ff1a3779-f857-4921-b0c5-fdad56861f50" Jan 13 20:20:24.995184 containerd[1478]: time="2025-01-13T20:20:24.994671210Z" level=error msg="Failed to destroy network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:24.999996 containerd[1478]: time="2025-01-13T20:20:24.999095500Z" level=error msg="encountered an error cleaning up failed sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:24.999996 containerd[1478]: time="2025-01-13T20:20:24.999209300Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" Jan 13 20:20:25.000142 kubelet[2817]: E0113 20:20:24.999529 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.000142 kubelet[2817]: E0113 20:20:24.999656 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh" Jan 13 20:20:25.000142 kubelet[2817]: E0113 20:20:24.999681 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh" Jan 13 20:20:25.000251 kubelet[2817]: E0113 20:20:24.999732 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" Jan 13 20:20:25.078160 containerd[1478]: time="2025-01-13T20:20:25.078092165Z" level=error msg="Failed to destroy network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.078384 containerd[1478]: time="2025-01-13T20:20:25.078247205Z" level=error msg="Failed to destroy network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.079558 containerd[1478]: time="2025-01-13T20:20:25.079271288Z" level=error msg="encountered an error cleaning up failed sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.080564 containerd[1478]: time="2025-01-13T20:20:25.079921329Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.081412 kubelet[2817]: E0113 20:20:25.080922 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.081412 kubelet[2817]: E0113 20:20:25.081004 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:25.081412 kubelet[2817]: E0113 20:20:25.081025 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:25.081622 kubelet[2817]: E0113 20:20:25.081067 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" podUID="189c7963-e6bf-46b5-b6d5-9d268e857385" Jan 13 20:20:25.082254 containerd[1478]: time="2025-01-13T20:20:25.082194495Z" level=error msg="encountered an error cleaning up failed sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.082322 containerd[1478]: time="2025-01-13T20:20:25.082279335Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.082572 kubelet[2817]: E0113 20:20:25.082522 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.082837 
kubelet[2817]: E0113 20:20:25.082812 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:25.082911 kubelet[2817]: E0113 20:20:25.082895 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:25.083023 kubelet[2817]: E0113 20:20:25.082992 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8n6tx" podUID="acbb6d2d-5611-4557-91bf-b12ca46c13f5" Jan 13 20:20:25.084038 containerd[1478]: time="2025-01-13T20:20:25.083983499Z" level=error msg="Failed to destroy network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.085477 containerd[1478]: time="2025-01-13T20:20:25.085383622Z" level=error msg="encountered an error cleaning up failed sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.085573 containerd[1478]: time="2025-01-13T20:20:25.085511262Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.086808 kubelet[2817]: E0113 20:20:25.085929 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.086808 kubelet[2817]: E0113 20:20:25.085989 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:25.086808 kubelet[2817]: E0113 20:20:25.086012 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:25.086925 kubelet[2817]: E0113 20:20:25.086053 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" podUID="3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4" Jan 13 20:20:25.099772 containerd[1478]: time="2025-01-13T20:20:25.099591255Z" level=error msg="Failed to destroy network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.100307 containerd[1478]: time="2025-01-13T20:20:25.100257537Z" level=error msg="encountered an error cleaning up failed sandbox 
\"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.100362 containerd[1478]: time="2025-01-13T20:20:25.100330057Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.101005 kubelet[2817]: E0113 20:20:25.100751 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:25.101005 kubelet[2817]: E0113 20:20:25.100837 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:25.101005 kubelet[2817]: E0113 20:20:25.100867 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:25.101301 kubelet[2817]: E0113 20:20:25.100933 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" podUID="2fb62ec7-4c06-48e1-aa87-7b62ac4da84a" Jan 13 20:20:25.215754 systemd[1]: run-netns-cni\x2de1819dee\x2d5382\x2da457\x2d8583\x2d29e62d98720c.mount: Deactivated successfully. Jan 13 20:20:25.216368 systemd[1]: run-netns-cni\x2dafc002ae\x2d9fad\x2d00e8\x2da6a1\x2d5f3dcee313de.mount: Deactivated successfully. 
Jan 13 20:20:25.754018 kubelet[2817]: I0113 20:20:25.753555 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4" Jan 13 20:20:25.755022 containerd[1478]: time="2025-01-13T20:20:25.754964109Z" level=info msg="StopPodSandbox for \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\"" Jan 13 20:20:25.755501 containerd[1478]: time="2025-01-13T20:20:25.755206630Z" level=info msg="Ensure that sandbox 6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4 in task-service has been cleanup successfully" Jan 13 20:20:25.759096 systemd[1]: run-netns-cni\x2d23f42416\x2d8129\x2db74f\x2d32dd\x2dc4c8a195d1e2.mount: Deactivated successfully. Jan 13 20:20:25.761090 containerd[1478]: time="2025-01-13T20:20:25.760902203Z" level=info msg="TearDown network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\" successfully" Jan 13 20:20:25.761090 containerd[1478]: time="2025-01-13T20:20:25.760943083Z" level=info msg="StopPodSandbox for \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\" returns successfully" Jan 13 20:20:25.763989 containerd[1478]: time="2025-01-13T20:20:25.763914970Z" level=info msg="StopPodSandbox for \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\"" Jan 13 20:20:25.764786 containerd[1478]: time="2025-01-13T20:20:25.764744252Z" level=info msg="TearDown network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" successfully" Jan 13 20:20:25.764786 containerd[1478]: time="2025-01-13T20:20:25.764783612Z" level=info msg="StopPodSandbox for \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" returns successfully" Jan 13 20:20:25.766123 containerd[1478]: time="2025-01-13T20:20:25.766050335Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\"" Jan 13 20:20:25.766274 containerd[1478]: 
time="2025-01-13T20:20:25.766182135Z" level=info msg="TearDown network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" successfully" Jan 13 20:20:25.766274 containerd[1478]: time="2025-01-13T20:20:25.766193735Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" returns successfully" Jan 13 20:20:25.767952 containerd[1478]: time="2025-01-13T20:20:25.767904499Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" Jan 13 20:20:25.768802 containerd[1478]: time="2025-01-13T20:20:25.768271140Z" level=info msg="TearDown network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" successfully" Jan 13 20:20:25.768802 containerd[1478]: time="2025-01-13T20:20:25.768295940Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" returns successfully" Jan 13 20:20:25.770095 containerd[1478]: time="2025-01-13T20:20:25.769291063Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" Jan 13 20:20:25.770298 containerd[1478]: time="2025-01-13T20:20:25.770266945Z" level=info msg="TearDown network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" successfully" Jan 13 20:20:25.770367 containerd[1478]: time="2025-01-13T20:20:25.770352625Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" returns successfully" Jan 13 20:20:25.771489 containerd[1478]: time="2025-01-13T20:20:25.771381948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:5,}" Jan 13 20:20:25.790349 kubelet[2817]: I0113 20:20:25.789833 2817 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b" Jan 13 20:20:25.797113 containerd[1478]: time="2025-01-13T20:20:25.797055008Z" level=info msg="StopPodSandbox for \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\"" Jan 13 20:20:25.797317 containerd[1478]: time="2025-01-13T20:20:25.797287368Z" level=info msg="Ensure that sandbox 45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b in task-service has been cleanup successfully" Jan 13 20:20:25.805846 containerd[1478]: time="2025-01-13T20:20:25.801473378Z" level=info msg="TearDown network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\" successfully" Jan 13 20:20:25.805846 containerd[1478]: time="2025-01-13T20:20:25.801516018Z" level=info msg="StopPodSandbox for \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\" returns successfully" Jan 13 20:20:25.805846 containerd[1478]: time="2025-01-13T20:20:25.804797946Z" level=info msg="StopPodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\"" Jan 13 20:20:25.805846 containerd[1478]: time="2025-01-13T20:20:25.804899026Z" level=info msg="TearDown network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" successfully" Jan 13 20:20:25.805846 containerd[1478]: time="2025-01-13T20:20:25.804909586Z" level=info msg="StopPodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" returns successfully" Jan 13 20:20:25.805846 containerd[1478]: time="2025-01-13T20:20:25.805446947Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\"" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.806681270Z" level=info msg="TearDown network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" successfully" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.806720390Z" level=info msg="StopPodSandbox 
for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" returns successfully" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.807250232Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.807324712Z" level=info msg="TearDown network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" successfully" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.807387752Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" returns successfully" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.807930153Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.807999393Z" level=info msg="TearDown network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" successfully" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.808010713Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" returns successfully" Jan 13 20:20:25.809058 containerd[1478]: time="2025-01-13T20:20:25.808890315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:20:25.808573 systemd[1]: run-netns-cni\x2d26437418\x2de532\x2df0be\x2dedb3\x2d8fad85bb4192.mount: Deactivated successfully. 
Jan 13 20:20:25.816363 kubelet[2817]: I0113 20:20:25.814539 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb" Jan 13 20:20:25.816555 containerd[1478]: time="2025-01-13T20:20:25.815850852Z" level=info msg="StopPodSandbox for \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\"" Jan 13 20:20:25.816555 containerd[1478]: time="2025-01-13T20:20:25.816094252Z" level=info msg="Ensure that sandbox 388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb in task-service has been cleanup successfully" Jan 13 20:20:25.819826 containerd[1478]: time="2025-01-13T20:20:25.819772101Z" level=info msg="TearDown network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\" successfully" Jan 13 20:20:25.820201 containerd[1478]: time="2025-01-13T20:20:25.820179182Z" level=info msg="StopPodSandbox for \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\" returns successfully" Jan 13 20:20:25.821914 containerd[1478]: time="2025-01-13T20:20:25.821573305Z" level=info msg="StopPodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\"" Jan 13 20:20:25.822372 containerd[1478]: time="2025-01-13T20:20:25.822056626Z" level=info msg="TearDown network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" successfully" Jan 13 20:20:25.822742 containerd[1478]: time="2025-01-13T20:20:25.822665068Z" level=info msg="StopPodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" returns successfully" Jan 13 20:20:25.823219 containerd[1478]: time="2025-01-13T20:20:25.823180949Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\"" Jan 13 20:20:25.823493 containerd[1478]: time="2025-01-13T20:20:25.823398909Z" level=info msg="TearDown network for sandbox 
\"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" successfully" Jan 13 20:20:25.824544 containerd[1478]: time="2025-01-13T20:20:25.823815390Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" returns successfully" Jan 13 20:20:25.825765 containerd[1478]: time="2025-01-13T20:20:25.825721315Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" Jan 13 20:20:25.825905 containerd[1478]: time="2025-01-13T20:20:25.825863515Z" level=info msg="TearDown network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" successfully" Jan 13 20:20:25.825905 containerd[1478]: time="2025-01-13T20:20:25.825900515Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" returns successfully" Jan 13 20:20:25.828226 containerd[1478]: time="2025-01-13T20:20:25.828173921Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" Jan 13 20:20:25.828545 containerd[1478]: time="2025-01-13T20:20:25.828508401Z" level=info msg="TearDown network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" successfully" Jan 13 20:20:25.830293 containerd[1478]: time="2025-01-13T20:20:25.830222845Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" returns successfully" Jan 13 20:20:25.833148 containerd[1478]: time="2025-01-13T20:20:25.833074892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:5,}" Jan 13 20:20:25.833890 kubelet[2817]: I0113 20:20:25.833543 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d" Jan 13 20:20:25.837429 containerd[1478]: 
time="2025-01-13T20:20:25.837344542Z" level=info msg="StopPodSandbox for \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\"" Jan 13 20:20:25.838256 containerd[1478]: time="2025-01-13T20:20:25.837736143Z" level=info msg="Ensure that sandbox 7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d in task-service has been cleanup successfully" Jan 13 20:20:25.838256 containerd[1478]: time="2025-01-13T20:20:25.838035624Z" level=info msg="TearDown network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\" successfully" Jan 13 20:20:25.838256 containerd[1478]: time="2025-01-13T20:20:25.838140904Z" level=info msg="StopPodSandbox for \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\" returns successfully" Jan 13 20:20:25.848185 containerd[1478]: time="2025-01-13T20:20:25.847911047Z" level=info msg="StopPodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\"" Jan 13 20:20:25.848185 containerd[1478]: time="2025-01-13T20:20:25.848074127Z" level=info msg="TearDown network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" successfully" Jan 13 20:20:25.848185 containerd[1478]: time="2025-01-13T20:20:25.848087607Z" level=info msg="StopPodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" returns successfully" Jan 13 20:20:25.849364 containerd[1478]: time="2025-01-13T20:20:25.849232850Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\"" Jan 13 20:20:25.849701 containerd[1478]: time="2025-01-13T20:20:25.849654411Z" level=info msg="TearDown network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" successfully" Jan 13 20:20:25.849893 containerd[1478]: time="2025-01-13T20:20:25.849781731Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" returns successfully" Jan 13 20:20:25.853898 
containerd[1478]: time="2025-01-13T20:20:25.853800101Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" Jan 13 20:20:25.854223 containerd[1478]: time="2025-01-13T20:20:25.853915101Z" level=info msg="TearDown network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" successfully" Jan 13 20:20:25.854223 containerd[1478]: time="2025-01-13T20:20:25.853979821Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" returns successfully" Jan 13 20:20:25.854966 containerd[1478]: time="2025-01-13T20:20:25.854891983Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" Jan 13 20:20:25.855242 containerd[1478]: time="2025-01-13T20:20:25.855192824Z" level=info msg="TearDown network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" successfully" Jan 13 20:20:25.855242 containerd[1478]: time="2025-01-13T20:20:25.855213504Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" returns successfully" Jan 13 20:20:25.859125 containerd[1478]: time="2025-01-13T20:20:25.859060673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:5,}" Jan 13 20:20:25.862209 kubelet[2817]: I0113 20:20:25.862094 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b" Jan 13 20:20:25.868263 containerd[1478]: time="2025-01-13T20:20:25.867795333Z" level=info msg="StopPodSandbox for \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\"" Jan 13 20:20:25.868263 containerd[1478]: time="2025-01-13T20:20:25.868043374Z" level=info msg="Ensure that sandbox 
2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b in task-service has been cleanup successfully" Jan 13 20:20:25.870846 containerd[1478]: time="2025-01-13T20:20:25.870657780Z" level=info msg="TearDown network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\" successfully" Jan 13 20:20:25.870846 containerd[1478]: time="2025-01-13T20:20:25.870839620Z" level=info msg="StopPodSandbox for \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\" returns successfully" Jan 13 20:20:25.876873 containerd[1478]: time="2025-01-13T20:20:25.876804554Z" level=info msg="StopPodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\"" Jan 13 20:20:25.877022 containerd[1478]: time="2025-01-13T20:20:25.876939715Z" level=info msg="TearDown network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" successfully" Jan 13 20:20:25.877022 containerd[1478]: time="2025-01-13T20:20:25.876951835Z" level=info msg="StopPodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" returns successfully" Jan 13 20:20:25.883608 containerd[1478]: time="2025-01-13T20:20:25.883550810Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\"" Jan 13 20:20:25.884205 containerd[1478]: time="2025-01-13T20:20:25.884162412Z" level=info msg="TearDown network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" successfully" Jan 13 20:20:25.884205 containerd[1478]: time="2025-01-13T20:20:25.884198692Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" returns successfully" Jan 13 20:20:25.890095 containerd[1478]: time="2025-01-13T20:20:25.890047665Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" Jan 13 20:20:25.891687 containerd[1478]: time="2025-01-13T20:20:25.890954147Z" level=info 
msg="TearDown network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" successfully" Jan 13 20:20:25.891687 containerd[1478]: time="2025-01-13T20:20:25.890986428Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" returns successfully" Jan 13 20:20:25.901211 containerd[1478]: time="2025-01-13T20:20:25.900882451Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" Jan 13 20:20:25.903807 containerd[1478]: time="2025-01-13T20:20:25.903756097Z" level=info msg="TearDown network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" successfully" Jan 13 20:20:25.903922 containerd[1478]: time="2025-01-13T20:20:25.903907338Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" returns successfully" Jan 13 20:20:25.909089 containerd[1478]: time="2025-01-13T20:20:25.907923307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:5,}" Jan 13 20:20:25.915004 kubelet[2817]: I0113 20:20:25.914971 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7" Jan 13 20:20:25.921571 containerd[1478]: time="2025-01-13T20:20:25.920814697Z" level=info msg="StopPodSandbox for \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\"" Jan 13 20:20:25.925163 containerd[1478]: time="2025-01-13T20:20:25.925116707Z" level=info msg="Ensure that sandbox a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7 in task-service has been cleanup successfully" Jan 13 20:20:25.933157 containerd[1478]: time="2025-01-13T20:20:25.933042006Z" level=info msg="TearDown network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\" successfully" Jan 13 
20:20:25.933157 containerd[1478]: time="2025-01-13T20:20:25.933109726Z" level=info msg="StopPodSandbox for \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\" returns successfully" Jan 13 20:20:25.933914 containerd[1478]: time="2025-01-13T20:20:25.933876368Z" level=info msg="StopPodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\"" Jan 13 20:20:25.934435 containerd[1478]: time="2025-01-13T20:20:25.934310729Z" level=info msg="TearDown network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" successfully" Jan 13 20:20:25.934499 containerd[1478]: time="2025-01-13T20:20:25.934473729Z" level=info msg="StopPodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" returns successfully" Jan 13 20:20:25.938248 containerd[1478]: time="2025-01-13T20:20:25.938193738Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\"" Jan 13 20:20:25.939125 containerd[1478]: time="2025-01-13T20:20:25.939075260Z" level=info msg="TearDown network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" successfully" Jan 13 20:20:25.939125 containerd[1478]: time="2025-01-13T20:20:25.939113820Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" returns successfully" Jan 13 20:20:25.942174 containerd[1478]: time="2025-01-13T20:20:25.942021547Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" Jan 13 20:20:25.942174 containerd[1478]: time="2025-01-13T20:20:25.942143827Z" level=info msg="TearDown network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" successfully" Jan 13 20:20:25.942174 containerd[1478]: time="2025-01-13T20:20:25.942157387Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" returns successfully" Jan 13 
20:20:25.944378 containerd[1478]: time="2025-01-13T20:20:25.944323792Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" Jan 13 20:20:25.947805 containerd[1478]: time="2025-01-13T20:20:25.947740360Z" level=info msg="TearDown network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" successfully" Jan 13 20:20:25.948058 containerd[1478]: time="2025-01-13T20:20:25.947795760Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" returns successfully" Jan 13 20:20:25.958719 containerd[1478]: time="2025-01-13T20:20:25.958667506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:5,}" Jan 13 20:20:26.035228 containerd[1478]: time="2025-01-13T20:20:26.035029164Z" level=error msg="Failed to destroy network for sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.037907 containerd[1478]: time="2025-01-13T20:20:26.036159446Z" level=error msg="encountered an error cleaning up failed sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.037907 containerd[1478]: time="2025-01-13T20:20:26.037070328Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox 
\"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.040627 kubelet[2817]: E0113 20:20:26.038497 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.040627 kubelet[2817]: E0113 20:20:26.038566 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh" Jan 13 20:20:26.040627 kubelet[2817]: E0113 20:20:26.038589 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vhlrh" Jan 13 20:20:26.044821 kubelet[2817]: E0113 20:20:26.038638 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-vhlrh_calico-system(d410432e-4da0-436d-8d3b-2586cacab46b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vhlrh" podUID="d410432e-4da0-436d-8d3b-2586cacab46b" Jan 13 20:20:26.172505 containerd[1478]: time="2025-01-13T20:20:26.172397961Z" level=error msg="Failed to destroy network for sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.175149 containerd[1478]: time="2025-01-13T20:20:26.175086287Z" level=error msg="encountered an error cleaning up failed sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.175395 containerd[1478]: time="2025-01-13T20:20:26.175370888Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.176155 kubelet[2817]: E0113 20:20:26.176107 2817 remote_runtime.go:193] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.176455 kubelet[2817]: E0113 20:20:26.176403 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:26.176503 kubelet[2817]: E0113 20:20:26.176460 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" Jan 13 20:20:26.176584 kubelet[2817]: E0113 20:20:26.176526 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7776878d6f-zsmfk_calico-apiserver(189c7963-e6bf-46b5-b6d5-9d268e857385)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" podUID="189c7963-e6bf-46b5-b6d5-9d268e857385" Jan 13 20:20:26.220144 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23-shm.mount: Deactivated successfully. Jan 13 20:20:26.220991 systemd[1]: run-netns-cni\x2dee1c2a67\x2dda73\x2d4c7d\x2de38f\x2d3b08346a72fe.mount: Deactivated successfully. Jan 13 20:20:26.221188 systemd[1]: run-netns-cni\x2d62ca91f9\x2d13e8\x2d8881\x2df8e9\x2dcd2eab6e660e.mount: Deactivated successfully. Jan 13 20:20:26.221240 systemd[1]: run-netns-cni\x2da2b5ac7e\x2d6fe2\x2d20fb\x2d1d8f\x2dbe7f1c081c3e.mount: Deactivated successfully. Jan 13 20:20:26.221289 systemd[1]: run-netns-cni\x2d360fbb9d\x2d6067\x2de0a4\x2d8f10\x2ddcb212cb32d2.mount: Deactivated successfully. Jan 13 20:20:26.239117 containerd[1478]: time="2025-01-13T20:20:26.239065715Z" level=error msg="Failed to destroy network for sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.240395 containerd[1478]: time="2025-01-13T20:20:26.239618516Z" level=error msg="encountered an error cleaning up failed sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.240395 containerd[1478]: time="2025-01-13T20:20:26.239696196Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:5,} 
failed, error" error="failed to setup network for sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.241667 kubelet[2817]: E0113 20:20:26.239930 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.241667 kubelet[2817]: E0113 20:20:26.239990 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:26.241667 kubelet[2817]: E0113 20:20:26.240012 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8n6tx" Jan 13 20:20:26.241811 kubelet[2817]: E0113 20:20:26.240054 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8n6tx_kube-system(acbb6d2d-5611-4557-91bf-b12ca46c13f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8n6tx" podUID="acbb6d2d-5611-4557-91bf-b12ca46c13f5" Jan 13 20:20:26.243965 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33-shm.mount: Deactivated successfully. Jan 13 20:20:26.277622 containerd[1478]: time="2025-01-13T20:20:26.275393439Z" level=error msg="Failed to destroy network for sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.278428 containerd[1478]: time="2025-01-13T20:20:26.278119765Z" level=error msg="encountered an error cleaning up failed sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.278428 containerd[1478]: time="2025-01-13T20:20:26.278295205Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.278797 kubelet[2817]: E0113 20:20:26.278747 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.278878 kubelet[2817]: E0113 20:20:26.278816 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:26.278878 kubelet[2817]: E0113 20:20:26.278839 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" Jan 13 20:20:26.278982 kubelet[2817]: E0113 20:20:26.278884 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-7776878d6f-kzg6k_calico-apiserver(3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" podUID="3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4" Jan 13 20:20:26.281158 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276-shm.mount: Deactivated successfully. Jan 13 20:20:26.288376 containerd[1478]: time="2025-01-13T20:20:26.288132148Z" level=error msg="Failed to destroy network for sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.294063 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06-shm.mount: Deactivated successfully. 
Jan 13 20:20:26.295311 containerd[1478]: time="2025-01-13T20:20:26.294751524Z" level=error msg="encountered an error cleaning up failed sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.296238 containerd[1478]: time="2025-01-13T20:20:26.296181887Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.296748 kubelet[2817]: E0113 20:20:26.296699 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.296833 kubelet[2817]: E0113 20:20:26.296768 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:26.296833 
kubelet[2817]: E0113 20:20:26.296793 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" Jan 13 20:20:26.296954 kubelet[2817]: E0113 20:20:26.296842 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d9b896c9c-x5cqp_calico-system(2fb62ec7-4c06-48e1-aa87-7b62ac4da84a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" podUID="2fb62ec7-4c06-48e1-aa87-7b62ac4da84a" Jan 13 20:20:26.310970 containerd[1478]: time="2025-01-13T20:20:26.310876881Z" level=error msg="Failed to destroy network for sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.313126 containerd[1478]: time="2025-01-13T20:20:26.312848685Z" level=error msg="encountered an error cleaning up failed sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.313126 containerd[1478]: time="2025-01-13T20:20:26.312959046Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.315092 kubelet[2817]: E0113 20:20:26.314967 2817 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 20:20:26.315092 kubelet[2817]: E0113 20:20:26.315054 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:26.315092 kubelet[2817]: E0113 20:20:26.315079 2817 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-khp7b" Jan 13 20:20:26.316219 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287-shm.mount: Deactivated successfully. Jan 13 20:20:26.317731 kubelet[2817]: E0113 20:20:26.315674 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-khp7b_kube-system(ff1a3779-f857-4921-b0c5-fdad56861f50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-khp7b" podUID="ff1a3779-f857-4921-b0c5-fdad56861f50" Jan 13 20:20:26.396831 containerd[1478]: time="2025-01-13T20:20:26.395991157Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:26.398056 containerd[1478]: time="2025-01-13T20:20:26.397857882Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=137671762" Jan 13 20:20:26.399521 containerd[1478]: time="2025-01-13T20:20:26.399109045Z" level=info msg="ImageCreate event name:\"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:26.402294 containerd[1478]: time="2025-01-13T20:20:26.402225132Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 20:20:26.403442 containerd[1478]: time="2025-01-13T20:20:26.403382094Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"137671624\" in 5.967947975s" Jan 13 20:20:26.403442 containerd[1478]: time="2025-01-13T20:20:26.403436935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\"" Jan 13 20:20:26.415309 containerd[1478]: time="2025-01-13T20:20:26.415251002Z" level=info msg="CreateContainer within sandbox \"abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 20:20:26.439627 containerd[1478]: time="2025-01-13T20:20:26.439412818Z" level=info msg="CreateContainer within sandbox \"abbce71297145ea6dc3ab8b9590a51a491c5f20170a5f3c1b2b52c7d4ea2b995\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388\"" Jan 13 20:20:26.441814 containerd[1478]: time="2025-01-13T20:20:26.440497180Z" level=info msg="StartContainer for \"96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388\"" Jan 13 20:20:26.471172 systemd[1]: Started cri-containerd-96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388.scope - libcontainer container 96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388. 
Jan 13 20:20:26.515477 containerd[1478]: time="2025-01-13T20:20:26.515278153Z" level=info msg="StartContainer for \"96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388\" returns successfully" Jan 13 20:20:26.628147 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 13 20:20:26.628562 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 13 20:20:26.923226 kubelet[2817]: I0113 20:20:26.923190 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276" Jan 13 20:20:26.924588 containerd[1478]: time="2025-01-13T20:20:26.924189857Z" level=info msg="StopPodSandbox for \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\"" Jan 13 20:20:26.924588 containerd[1478]: time="2025-01-13T20:20:26.924381858Z" level=info msg="Ensure that sandbox 80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276 in task-service has been cleanup successfully" Jan 13 20:20:26.925407 containerd[1478]: time="2025-01-13T20:20:26.925172060Z" level=info msg="TearDown network for sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\" successfully" Jan 13 20:20:26.925407 containerd[1478]: time="2025-01-13T20:20:26.925196700Z" level=info msg="StopPodSandbox for \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\" returns successfully" Jan 13 20:20:26.927372 containerd[1478]: time="2025-01-13T20:20:26.926743143Z" level=info msg="StopPodSandbox for \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\"" Jan 13 20:20:26.927372 containerd[1478]: time="2025-01-13T20:20:26.926893104Z" level=info msg="TearDown network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\" successfully" Jan 13 20:20:26.927372 containerd[1478]: time="2025-01-13T20:20:26.926907504Z" level=info msg="StopPodSandbox for 
\"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\" returns successfully" Jan 13 20:20:26.929012 containerd[1478]: time="2025-01-13T20:20:26.928790348Z" level=info msg="StopPodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\"" Jan 13 20:20:26.929012 containerd[1478]: time="2025-01-13T20:20:26.928904228Z" level=info msg="TearDown network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" successfully" Jan 13 20:20:26.929012 containerd[1478]: time="2025-01-13T20:20:26.928914508Z" level=info msg="StopPodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" returns successfully" Jan 13 20:20:26.930242 containerd[1478]: time="2025-01-13T20:20:26.929533990Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\"" Jan 13 20:20:26.930242 containerd[1478]: time="2025-01-13T20:20:26.929662150Z" level=info msg="TearDown network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" successfully" Jan 13 20:20:26.930242 containerd[1478]: time="2025-01-13T20:20:26.929686070Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" returns successfully" Jan 13 20:20:26.931149 containerd[1478]: time="2025-01-13T20:20:26.931034313Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" Jan 13 20:20:26.931463 containerd[1478]: time="2025-01-13T20:20:26.931281274Z" level=info msg="TearDown network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" successfully" Jan 13 20:20:26.931463 containerd[1478]: time="2025-01-13T20:20:26.931305874Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" returns successfully" Jan 13 20:20:26.932411 containerd[1478]: time="2025-01-13T20:20:26.932374356Z" level=info msg="StopPodSandbox for 
\"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" Jan 13 20:20:26.933384 kubelet[2817]: I0113 20:20:26.932666 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06" Jan 13 20:20:26.934842 containerd[1478]: time="2025-01-13T20:20:26.934769402Z" level=info msg="TearDown network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" successfully" Jan 13 20:20:26.934842 containerd[1478]: time="2025-01-13T20:20:26.934812962Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" returns successfully" Jan 13 20:20:26.935530 containerd[1478]: time="2025-01-13T20:20:26.935478243Z" level=info msg="StopPodSandbox for \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\"" Jan 13 20:20:26.936820 containerd[1478]: time="2025-01-13T20:20:26.935947965Z" level=info msg="Ensure that sandbox 177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06 in task-service has been cleanup successfully" Jan 13 20:20:26.936820 containerd[1478]: time="2025-01-13T20:20:26.936192445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:20:26.936820 containerd[1478]: time="2025-01-13T20:20:26.936199965Z" level=info msg="TearDown network for sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\" successfully" Jan 13 20:20:26.936820 containerd[1478]: time="2025-01-13T20:20:26.936284445Z" level=info msg="StopPodSandbox for \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\" returns successfully" Jan 13 20:20:26.937295 containerd[1478]: time="2025-01-13T20:20:26.937252768Z" level=info msg="StopPodSandbox for \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\"" Jan 13 20:20:26.937671 containerd[1478]: 
time="2025-01-13T20:20:26.937575808Z" level=info msg="TearDown network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\" successfully" Jan 13 20:20:26.937757 containerd[1478]: time="2025-01-13T20:20:26.937701769Z" level=info msg="StopPodSandbox for \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\" returns successfully" Jan 13 20:20:26.938313 containerd[1478]: time="2025-01-13T20:20:26.938281290Z" level=info msg="StopPodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\"" Jan 13 20:20:26.940184 containerd[1478]: time="2025-01-13T20:20:26.939726893Z" level=info msg="TearDown network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" successfully" Jan 13 20:20:26.940184 containerd[1478]: time="2025-01-13T20:20:26.939766893Z" level=info msg="StopPodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" returns successfully" Jan 13 20:20:26.941568 containerd[1478]: time="2025-01-13T20:20:26.940479095Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\"" Jan 13 20:20:26.941568 containerd[1478]: time="2025-01-13T20:20:26.940698615Z" level=info msg="TearDown network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" successfully" Jan 13 20:20:26.941568 containerd[1478]: time="2025-01-13T20:20:26.940738056Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" returns successfully" Jan 13 20:20:26.942283 containerd[1478]: time="2025-01-13T20:20:26.942254379Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" Jan 13 20:20:26.943220 containerd[1478]: time="2025-01-13T20:20:26.943177861Z" level=info msg="TearDown network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" successfully" Jan 13 20:20:26.943517 containerd[1478]: 
time="2025-01-13T20:20:26.943490062Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" returns successfully" Jan 13 20:20:26.944133 containerd[1478]: time="2025-01-13T20:20:26.944079943Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" Jan 13 20:20:26.944266 kubelet[2817]: I0113 20:20:26.944239 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33" Jan 13 20:20:26.944720 containerd[1478]: time="2025-01-13T20:20:26.944434904Z" level=info msg="TearDown network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" successfully" Jan 13 20:20:26.944720 containerd[1478]: time="2025-01-13T20:20:26.944456544Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" returns successfully" Jan 13 20:20:26.945959 containerd[1478]: time="2025-01-13T20:20:26.945925108Z" level=info msg="StopPodSandbox for \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\"" Jan 13 20:20:26.946121 containerd[1478]: time="2025-01-13T20:20:26.946101908Z" level=info msg="Ensure that sandbox 8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33 in task-service has been cleanup successfully" Jan 13 20:20:26.947472 containerd[1478]: time="2025-01-13T20:20:26.947057590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:6,}" Jan 13 20:20:26.947472 containerd[1478]: time="2025-01-13T20:20:26.947061350Z" level=info msg="TearDown network for sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\" successfully" Jan 13 20:20:26.947472 containerd[1478]: time="2025-01-13T20:20:26.947311391Z" level=info msg="StopPodSandbox for 
\"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\" returns successfully" Jan 13 20:20:26.948787 containerd[1478]: time="2025-01-13T20:20:26.948756354Z" level=info msg="StopPodSandbox for \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\"" Jan 13 20:20:26.949234 containerd[1478]: time="2025-01-13T20:20:26.949001395Z" level=info msg="TearDown network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\" successfully" Jan 13 20:20:26.949234 containerd[1478]: time="2025-01-13T20:20:26.949018835Z" level=info msg="StopPodSandbox for \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\" returns successfully" Jan 13 20:20:26.949690 containerd[1478]: time="2025-01-13T20:20:26.949646276Z" level=info msg="StopPodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\"" Jan 13 20:20:26.949832 containerd[1478]: time="2025-01-13T20:20:26.949751996Z" level=info msg="TearDown network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" successfully" Jan 13 20:20:26.949832 containerd[1478]: time="2025-01-13T20:20:26.949766396Z" level=info msg="StopPodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" returns successfully" Jan 13 20:20:26.951711 containerd[1478]: time="2025-01-13T20:20:26.950764319Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\"" Jan 13 20:20:26.951711 containerd[1478]: time="2025-01-13T20:20:26.950886039Z" level=info msg="TearDown network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" successfully" Jan 13 20:20:26.951711 containerd[1478]: time="2025-01-13T20:20:26.950899559Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" returns successfully" Jan 13 20:20:26.951711 containerd[1478]: time="2025-01-13T20:20:26.951230720Z" level=info msg="StopPodSandbox for 
\"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" Jan 13 20:20:26.951711 containerd[1478]: time="2025-01-13T20:20:26.951330200Z" level=info msg="TearDown network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" successfully" Jan 13 20:20:26.951711 containerd[1478]: time="2025-01-13T20:20:26.951341280Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" returns successfully" Jan 13 20:20:26.952260 containerd[1478]: time="2025-01-13T20:20:26.952146402Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" Jan 13 20:20:26.952483 containerd[1478]: time="2025-01-13T20:20:26.952409763Z" level=info msg="TearDown network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" successfully" Jan 13 20:20:26.952678 containerd[1478]: time="2025-01-13T20:20:26.952660203Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" returns successfully" Jan 13 20:20:26.953017 kubelet[2817]: I0113 20:20:26.952979 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287" Jan 13 20:20:26.956160 containerd[1478]: time="2025-01-13T20:20:26.956111371Z" level=info msg="StopPodSandbox for \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\"" Jan 13 20:20:26.957472 containerd[1478]: time="2025-01-13T20:20:26.956957813Z" level=info msg="Ensure that sandbox 857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287 in task-service has been cleanup successfully" Jan 13 20:20:26.960076 containerd[1478]: time="2025-01-13T20:20:26.959583139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:6,}" Jan 13 20:20:27.004053 containerd[1478]: 
time="2025-01-13T20:20:27.003840121Z" level=info msg="TearDown network for sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\" successfully" Jan 13 20:20:27.004053 containerd[1478]: time="2025-01-13T20:20:27.003896801Z" level=info msg="StopPodSandbox for \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\" returns successfully" Jan 13 20:20:27.015889 kubelet[2817]: I0113 20:20:27.015773 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-lp5z6" podStartSLOduration=1.758858516 podStartE2EDuration="17.015752508s" podCreationTimestamp="2025-01-13 20:20:10 +0000 UTC" firstStartedPulling="2025-01-13 20:20:11.147759265 +0000 UTC m=+44.060419117" lastFinishedPulling="2025-01-13 20:20:26.404653257 +0000 UTC m=+59.317313109" observedRunningTime="2025-01-13 20:20:27.008365292 +0000 UTC m=+59.921025144" watchObservedRunningTime="2025-01-13 20:20:27.015752508 +0000 UTC m=+59.928412320" Jan 13 20:20:27.038109 containerd[1478]: time="2025-01-13T20:20:27.037920639Z" level=info msg="StopPodSandbox for \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\"" Jan 13 20:20:27.038789 containerd[1478]: time="2025-01-13T20:20:27.038556520Z" level=info msg="TearDown network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\" successfully" Jan 13 20:20:27.038789 containerd[1478]: time="2025-01-13T20:20:27.038588320Z" level=info msg="StopPodSandbox for \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\" returns successfully" Jan 13 20:20:27.042132 containerd[1478]: time="2025-01-13T20:20:27.041898288Z" level=info msg="StopPodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\"" Jan 13 20:20:27.042132 containerd[1478]: time="2025-01-13T20:20:27.042026168Z" level=info msg="TearDown network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" successfully" Jan 13 20:20:27.042132 
containerd[1478]: time="2025-01-13T20:20:27.042037048Z" level=info msg="StopPodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" returns successfully" Jan 13 20:20:27.048693 containerd[1478]: time="2025-01-13T20:20:27.046988860Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\"" Jan 13 20:20:27.048693 containerd[1478]: time="2025-01-13T20:20:27.047096860Z" level=info msg="TearDown network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" successfully" Jan 13 20:20:27.048693 containerd[1478]: time="2025-01-13T20:20:27.047107180Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" returns successfully" Jan 13 20:20:27.050654 containerd[1478]: time="2025-01-13T20:20:27.050536668Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" Jan 13 20:20:27.054376 containerd[1478]: time="2025-01-13T20:20:27.054316796Z" level=info msg="TearDown network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" successfully" Jan 13 20:20:27.055634 kubelet[2817]: I0113 20:20:27.055584 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23" Jan 13 20:20:27.057721 containerd[1478]: time="2025-01-13T20:20:27.057664924Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" returns successfully" Jan 13 20:20:27.060072 containerd[1478]: time="2025-01-13T20:20:27.057480844Z" level=info msg="StopPodSandbox for \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\"" Jan 13 20:20:27.060072 containerd[1478]: time="2025-01-13T20:20:27.059747049Z" level=info msg="Ensure that sandbox ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23 in task-service has been cleanup successfully" 
Jan 13 20:20:27.061020 containerd[1478]: time="2025-01-13T20:20:27.060884051Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" Jan 13 20:20:27.064066 containerd[1478]: time="2025-01-13T20:20:27.063205257Z" level=info msg="TearDown network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" successfully" Jan 13 20:20:27.064066 containerd[1478]: time="2025-01-13T20:20:27.063235777Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" returns successfully" Jan 13 20:20:27.064066 containerd[1478]: time="2025-01-13T20:20:27.063291577Z" level=info msg="TearDown network for sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\" successfully" Jan 13 20:20:27.064066 containerd[1478]: time="2025-01-13T20:20:27.063300617Z" level=info msg="StopPodSandbox for \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\" returns successfully" Jan 13 20:20:27.065404 containerd[1478]: time="2025-01-13T20:20:27.064381579Z" level=info msg="StopPodSandbox for \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\"" Jan 13 20:20:27.065404 containerd[1478]: time="2025-01-13T20:20:27.064501740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:6,}" Jan 13 20:20:27.065404 containerd[1478]: time="2025-01-13T20:20:27.065104541Z" level=info msg="TearDown network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\" successfully" Jan 13 20:20:27.065404 containerd[1478]: time="2025-01-13T20:20:27.065124261Z" level=info msg="StopPodSandbox for \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\" returns successfully" Jan 13 20:20:27.066303 containerd[1478]: time="2025-01-13T20:20:27.066264664Z" level=info msg="StopPodSandbox for 
\"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\"" Jan 13 20:20:27.066495 containerd[1478]: time="2025-01-13T20:20:27.066406664Z" level=info msg="TearDown network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" successfully" Jan 13 20:20:27.066495 containerd[1478]: time="2025-01-13T20:20:27.066483464Z" level=info msg="StopPodSandbox for \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" returns successfully" Jan 13 20:20:27.067632 containerd[1478]: time="2025-01-13T20:20:27.067537507Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\"" Jan 13 20:20:27.067859 containerd[1478]: time="2025-01-13T20:20:27.067783747Z" level=info msg="TearDown network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" successfully" Jan 13 20:20:27.067859 containerd[1478]: time="2025-01-13T20:20:27.067802187Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" returns successfully" Jan 13 20:20:27.069014 containerd[1478]: time="2025-01-13T20:20:27.068967710Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" Jan 13 20:20:27.069119 containerd[1478]: time="2025-01-13T20:20:27.069098630Z" level=info msg="TearDown network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" successfully" Jan 13 20:20:27.069119 containerd[1478]: time="2025-01-13T20:20:27.069114550Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" returns successfully" Jan 13 20:20:27.071744 containerd[1478]: time="2025-01-13T20:20:27.071699436Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" Jan 13 20:20:27.071851 containerd[1478]: time="2025-01-13T20:20:27.071807236Z" level=info msg="TearDown network for sandbox 
\"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" successfully" Jan 13 20:20:27.071851 containerd[1478]: time="2025-01-13T20:20:27.071819476Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" returns successfully" Jan 13 20:20:27.072148 kubelet[2817]: I0113 20:20:27.072109 2817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8" Jan 13 20:20:27.073156 containerd[1478]: time="2025-01-13T20:20:27.073102879Z" level=info msg="StopPodSandbox for \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\"" Jan 13 20:20:27.073526 containerd[1478]: time="2025-01-13T20:20:27.073278840Z" level=info msg="Ensure that sandbox 8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8 in task-service has been cleanup successfully" Jan 13 20:20:27.075855 containerd[1478]: time="2025-01-13T20:20:27.075162364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:6,}" Jan 13 20:20:27.076961 containerd[1478]: time="2025-01-13T20:20:27.076922968Z" level=info msg="TearDown network for sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\" successfully" Jan 13 20:20:27.077503 containerd[1478]: time="2025-01-13T20:20:27.077468129Z" level=info msg="StopPodSandbox for \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\" returns successfully" Jan 13 20:20:27.083725 containerd[1478]: time="2025-01-13T20:20:27.082582821Z" level=info msg="StopPodSandbox for \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\"" Jan 13 20:20:27.090548 containerd[1478]: time="2025-01-13T20:20:27.090310878Z" level=info msg="TearDown network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\" successfully" Jan 13 20:20:27.090548 containerd[1478]: 
time="2025-01-13T20:20:27.090535559Z" level=info msg="StopPodSandbox for \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\" returns successfully" Jan 13 20:20:27.091861 containerd[1478]: time="2025-01-13T20:20:27.091692362Z" level=info msg="StopPodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\"" Jan 13 20:20:27.092141 containerd[1478]: time="2025-01-13T20:20:27.092059722Z" level=info msg="TearDown network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" successfully" Jan 13 20:20:27.092141 containerd[1478]: time="2025-01-13T20:20:27.092080242Z" level=info msg="StopPodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" returns successfully" Jan 13 20:20:27.094049 containerd[1478]: time="2025-01-13T20:20:27.093909367Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\"" Jan 13 20:20:27.094049 containerd[1478]: time="2025-01-13T20:20:27.094020167Z" level=info msg="TearDown network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" successfully" Jan 13 20:20:27.094049 containerd[1478]: time="2025-01-13T20:20:27.094030927Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" returns successfully" Jan 13 20:20:27.095914 containerd[1478]: time="2025-01-13T20:20:27.095385610Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" Jan 13 20:20:27.096881 containerd[1478]: time="2025-01-13T20:20:27.096671053Z" level=info msg="TearDown network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" successfully" Jan 13 20:20:27.096881 containerd[1478]: time="2025-01-13T20:20:27.096834093Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" returns successfully" Jan 13 20:20:27.098117 containerd[1478]: 
time="2025-01-13T20:20:27.097737415Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" Jan 13 20:20:27.098117 containerd[1478]: time="2025-01-13T20:20:27.098066416Z" level=info msg="TearDown network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" successfully" Jan 13 20:20:27.098117 containerd[1478]: time="2025-01-13T20:20:27.098077896Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" returns successfully" Jan 13 20:20:27.099382 containerd[1478]: time="2025-01-13T20:20:27.099194979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:6,}" Jan 13 20:20:27.234067 systemd[1]: run-netns-cni\x2d88dd6088\x2da8d5\x2d9a99\x2de6cb\x2def11a4940e00.mount: Deactivated successfully. Jan 13 20:20:27.234175 systemd[1]: run-netns-cni\x2db7202215\x2dff08\x2dca76\x2df7ec\x2db97e6f229893.mount: Deactivated successfully. Jan 13 20:20:27.234223 systemd[1]: run-netns-cni\x2dd4c8ee80\x2d0704\x2dc36b\x2d7278\x2d11c1f4b81109.mount: Deactivated successfully. Jan 13 20:20:27.234269 systemd[1]: run-netns-cni\x2d1655c171\x2d983d\x2de5b2\x2d1df1\x2d2a4389cd1086.mount: Deactivated successfully. Jan 13 20:20:27.234315 systemd[1]: run-netns-cni\x2d75ff7810\x2d10b6\x2d42b3\x2d7864\x2dee5a0171cd11.mount: Deactivated successfully. Jan 13 20:20:27.234366 systemd[1]: run-netns-cni\x2d2961b850\x2de0dc\x2d0596\x2d43b7\x2d7a8955a1d212.mount: Deactivated successfully. Jan 13 20:20:27.234410 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3421747824.mount: Deactivated successfully. 
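The `run-netns-cni\x2d…` mount-unit names in the entries above use systemd's unit-name escaping, where bytes outside the safe set are written as `\xNN` hex sequences (so `\x2d` is a literal `-`). A minimal decoder sketch for reading those names back, assuming only the standard `\xNN` escape form shown in this log:

```python
import re

def unescape_systemd(name: str) -> str:
    """Decode systemd unit-name escaping (\\xNN hex sequences) back to plain text."""
    return re.sub(r"\\x([0-9a-fA-F]{2})",
                  lambda m: chr(int(m.group(1), 16)),
                  name)
```

Applied to the first unit above, `run-netns-cni\x2d88dd6088\x2da8d5\x2d9a99\x2de6cb\x2def11a4940e00.mount` decodes to the CNI network namespace name with plain hyphens.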
Jan 13 20:20:27.255571 containerd[1478]: time="2025-01-13T20:20:27.255254695Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" Jan 13 20:20:27.257195 containerd[1478]: time="2025-01-13T20:20:27.257146099Z" level=info msg="TearDown network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" successfully" Jan 13 20:20:27.257801 containerd[1478]: time="2025-01-13T20:20:27.257676540Z" level=info msg="StopPodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" returns successfully" Jan 13 20:20:27.263774 containerd[1478]: time="2025-01-13T20:20:27.263717914Z" level=info msg="RemovePodSandbox for \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" Jan 13 20:20:27.263774 containerd[1478]: time="2025-01-13T20:20:27.263770394Z" level=info msg="Forcibly stopping sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\"" Jan 13 20:20:27.263905 containerd[1478]: time="2025-01-13T20:20:27.263862754Z" level=info msg="TearDown network for sandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" successfully" Jan 13 20:20:27.293581 containerd[1478]: time="2025-01-13T20:20:27.293526262Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.296234 containerd[1478]: time="2025-01-13T20:20:27.295729307Z" level=info msg="RemovePodSandbox \"b078eee97a04fa8cf9de115ab0f58c65b5b0efea3d31acb03067c84a6e9724fb\" returns successfully" Jan 13 20:20:27.297019 containerd[1478]: time="2025-01-13T20:20:27.296988110Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" Jan 13 20:20:27.298240 containerd[1478]: time="2025-01-13T20:20:27.297657911Z" level=info msg="TearDown network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" successfully" Jan 13 20:20:27.298240 containerd[1478]: time="2025-01-13T20:20:27.297865672Z" level=info msg="StopPodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" returns successfully" Jan 13 20:20:27.299381 containerd[1478]: time="2025-01-13T20:20:27.299353875Z" level=info msg="RemovePodSandbox for \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" Jan 13 20:20:27.299567 containerd[1478]: time="2025-01-13T20:20:27.299551516Z" level=info msg="Forcibly stopping sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\"" Jan 13 20:20:27.299736 containerd[1478]: time="2025-01-13T20:20:27.299720436Z" level=info msg="TearDown network for sandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" successfully" Jan 13 20:20:27.355832 containerd[1478]: time="2025-01-13T20:20:27.355785364Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.357045 containerd[1478]: time="2025-01-13T20:20:27.356104244Z" level=info msg="RemovePodSandbox \"c10d47a7a9b56cb347eed3b7e0fa75d2f34cdf60b893c17466da408470f853f1\" returns successfully" Jan 13 20:20:27.360735 containerd[1478]: time="2025-01-13T20:20:27.360510015Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\"" Jan 13 20:20:27.360735 containerd[1478]: time="2025-01-13T20:20:27.360652935Z" level=info msg="TearDown network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" successfully" Jan 13 20:20:27.360735 containerd[1478]: time="2025-01-13T20:20:27.360664295Z" level=info msg="StopPodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" returns successfully" Jan 13 20:20:27.361529 containerd[1478]: time="2025-01-13T20:20:27.361310256Z" level=info msg="RemovePodSandbox for \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\"" Jan 13 20:20:27.364490 containerd[1478]: time="2025-01-13T20:20:27.361342976Z" level=info msg="Forcibly stopping sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\"" Jan 13 20:20:27.364867 containerd[1478]: time="2025-01-13T20:20:27.364832584Z" level=info msg="TearDown network for sandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" successfully" Jan 13 20:20:27.404500 containerd[1478]: time="2025-01-13T20:20:27.404449235Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.407285 containerd[1478]: time="2025-01-13T20:20:27.405379597Z" level=info msg="RemovePodSandbox \"1a4424cd434cfee5b3b1b714f014a4fc2111212353a095cbd00b0a7eed0dbb47\" returns successfully" Jan 13 20:20:27.407285 containerd[1478]: time="2025-01-13T20:20:27.407008041Z" level=info msg="StopPodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\"" Jan 13 20:20:27.407285 containerd[1478]: time="2025-01-13T20:20:27.407130321Z" level=info msg="TearDown network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" successfully" Jan 13 20:20:27.407285 containerd[1478]: time="2025-01-13T20:20:27.407140041Z" level=info msg="StopPodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" returns successfully" Jan 13 20:20:27.412111 containerd[1478]: time="2025-01-13T20:20:27.412053332Z" level=info msg="RemovePodSandbox for \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\"" Jan 13 20:20:27.412515 containerd[1478]: time="2025-01-13T20:20:27.412375053Z" level=info msg="Forcibly stopping sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\"" Jan 13 20:20:27.422092 containerd[1478]: time="2025-01-13T20:20:27.419463469Z" level=info msg="TearDown network for sandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" successfully" Jan 13 20:20:27.560033 containerd[1478]: time="2025-01-13T20:20:27.559314388Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.561662 containerd[1478]: time="2025-01-13T20:20:27.561619913Z" level=info msg="RemovePodSandbox \"5dc062dc9efcb69c1af7972c9a8b051351225f710392960fe1d1efec5271d57c\" returns successfully" Jan 13 20:20:27.564000 containerd[1478]: time="2025-01-13T20:20:27.563954318Z" level=info msg="StopPodSandbox for \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\"" Jan 13 20:20:27.565193 containerd[1478]: time="2025-01-13T20:20:27.565152481Z" level=info msg="TearDown network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\" successfully" Jan 13 20:20:27.565193 containerd[1478]: time="2025-01-13T20:20:27.565186481Z" level=info msg="StopPodSandbox for \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\" returns successfully" Jan 13 20:20:27.574870 containerd[1478]: time="2025-01-13T20:20:27.574820063Z" level=info msg="RemovePodSandbox for \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\"" Jan 13 20:20:27.575867 containerd[1478]: time="2025-01-13T20:20:27.575667825Z" level=info msg="Forcibly stopping sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\"" Jan 13 20:20:27.579203 containerd[1478]: time="2025-01-13T20:20:27.578370951Z" level=info msg="TearDown network for sandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\" successfully" Jan 13 20:20:27.606842 containerd[1478]: time="2025-01-13T20:20:27.605819374Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.606842 containerd[1478]: time="2025-01-13T20:20:27.606773096Z" level=info msg="RemovePodSandbox \"45072d75e3bafce819dfec73c2b64e923955481886e5abcdcdd2782ebce39b6b\" returns successfully" Jan 13 20:20:27.608271 containerd[1478]: time="2025-01-13T20:20:27.608075819Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" Jan 13 20:20:27.608271 containerd[1478]: time="2025-01-13T20:20:27.608187819Z" level=info msg="TearDown network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" successfully" Jan 13 20:20:27.608271 containerd[1478]: time="2025-01-13T20:20:27.608199459Z" level=info msg="StopPodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" returns successfully" Jan 13 20:20:27.609749 containerd[1478]: time="2025-01-13T20:20:27.609714223Z" level=info msg="RemovePodSandbox for \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" Jan 13 20:20:27.610739 containerd[1478]: time="2025-01-13T20:20:27.610332624Z" level=info msg="Forcibly stopping sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\"" Jan 13 20:20:27.610739 containerd[1478]: time="2025-01-13T20:20:27.610449864Z" level=info msg="TearDown network for sandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" successfully" Jan 13 20:20:27.614585 containerd[1478]: time="2025-01-13T20:20:27.614534354Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.614801 containerd[1478]: time="2025-01-13T20:20:27.614780354Z" level=info msg="RemovePodSandbox \"8845f9f85be2339437eac7c128c10d54f3c107cfae2e12aa7d029042fd83696d\" returns successfully" Jan 13 20:20:27.616477 containerd[1478]: time="2025-01-13T20:20:27.616436398Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" Jan 13 20:20:27.617157 containerd[1478]: time="2025-01-13T20:20:27.617132840Z" level=info msg="TearDown network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" successfully" Jan 13 20:20:27.617255 containerd[1478]: time="2025-01-13T20:20:27.617240680Z" level=info msg="StopPodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" returns successfully" Jan 13 20:20:27.620862 containerd[1478]: time="2025-01-13T20:20:27.620813848Z" level=info msg="RemovePodSandbox for \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" Jan 13 20:20:27.620862 containerd[1478]: time="2025-01-13T20:20:27.620865728Z" level=info msg="Forcibly stopping sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\"" Jan 13 20:20:27.621059 containerd[1478]: time="2025-01-13T20:20:27.620957528Z" level=info msg="TearDown network for sandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" successfully" Jan 13 20:20:27.657909 containerd[1478]: time="2025-01-13T20:20:27.657700772Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.657909 containerd[1478]: time="2025-01-13T20:20:27.657774012Z" level=info msg="RemovePodSandbox \"25bccc67afe2edb552af6bfdc68840313da3a3af37ad8b281e9cddcafa8e96db\" returns successfully" Jan 13 20:20:27.659812 containerd[1478]: time="2025-01-13T20:20:27.659217416Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\"" Jan 13 20:20:27.659812 containerd[1478]: time="2025-01-13T20:20:27.659496656Z" level=info msg="TearDown network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" successfully" Jan 13 20:20:27.659812 containerd[1478]: time="2025-01-13T20:20:27.659512736Z" level=info msg="StopPodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" returns successfully" Jan 13 20:20:27.660732 containerd[1478]: time="2025-01-13T20:20:27.660156498Z" level=info msg="RemovePodSandbox for \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\"" Jan 13 20:20:27.660732 containerd[1478]: time="2025-01-13T20:20:27.660189898Z" level=info msg="Forcibly stopping sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\"" Jan 13 20:20:27.660732 containerd[1478]: time="2025-01-13T20:20:27.660261778Z" level=info msg="TearDown network for sandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" successfully" Jan 13 20:20:27.679778 systemd-networkd[1377]: caliad7e863cac6: Link UP Jan 13 20:20:27.681555 systemd-networkd[1377]: caliad7e863cac6: Gained carrier Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.066 [INFO][4803] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.132 [INFO][4803] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0 calico-apiserver-7776878d6f- calico-apiserver 
3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4 763 0 2025-01-13 20:20:10 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7776878d6f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4186-1-0-7-a3f46aeb9c calico-apiserver-7776878d6f-kzg6k eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliad7e863cac6 [] []}} ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-kzg6k" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.132 [INFO][4803] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-kzg6k" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.504 [INFO][4879] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" HandleID="k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.553 [INFO][4879] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" HandleID="k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000464ce0), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4186-1-0-7-a3f46aeb9c", "pod":"calico-apiserver-7776878d6f-kzg6k", "timestamp":"2025-01-13 20:20:27.50310902 +0000 UTC"}, Hostname:"ci-4186-1-0-7-a3f46aeb9c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.553 [INFO][4879] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.553 [INFO][4879] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.553 [INFO][4879] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-1-0-7-a3f46aeb9c' Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.562 [INFO][4879] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.576 [INFO][4879] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.587 [INFO][4879] ipam/ipam.go 489: Trying affinity for 192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.598 [INFO][4879] ipam/ipam.go 155: Attempting to load block cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.607 [INFO][4879] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.607 [INFO][4879] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.83.192/26 
handle="k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.616 [INFO][4879] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225 Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.635 [INFO][4879] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.83.192/26 handle="k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.654 [INFO][4879] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.83.193/26] block=192.168.83.192/26 handle="k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.655 [INFO][4879] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.83.193/26] handle="k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.655 [INFO][4879] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 13 20:20:27.714949 containerd[1478]: 2025-01-13 20:20:27.655 [INFO][4879] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.193/26] IPv6=[] ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" HandleID="k8s-pod-network.85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" Jan 13 20:20:27.715535 containerd[1478]: 2025-01-13 20:20:27.665 [INFO][4803] cni-plugin/k8s.go 386: Populated endpoint ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-kzg6k" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0", GenerateName:"calico-apiserver-7776878d6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 20, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7776878d6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"", Pod:"calico-apiserver-7776878d6f-kzg6k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.83.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliad7e863cac6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:27.715535 containerd[1478]: 2025-01-13 20:20:27.666 [INFO][4803] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.83.193/32] ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-kzg6k" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" Jan 13 20:20:27.715535 containerd[1478]: 2025-01-13 20:20:27.666 [INFO][4803] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliad7e863cac6 ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-kzg6k" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" Jan 13 20:20:27.715535 containerd[1478]: 2025-01-13 20:20:27.682 [INFO][4803] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-kzg6k" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" Jan 13 20:20:27.715535 containerd[1478]: 2025-01-13 20:20:27.684 [INFO][4803] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-kzg6k" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0", GenerateName:"calico-apiserver-7776878d6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 20, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7776878d6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225", Pod:"calico-apiserver-7776878d6f-kzg6k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.83.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliad7e863cac6", MAC:"da:ed:88:9e:78:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:27.715535 containerd[1478]: 2025-01-13 20:20:27.710 [INFO][4803] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-kzg6k" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--kzg6k-eth0" Jan 13 20:20:27.743753 containerd[1478]: time="2025-01-13T20:20:27.743370768Z" level=warning msg="Failed to get podSandbox status for container event for 
sandboxID \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:20:27.743753 containerd[1478]: time="2025-01-13T20:20:27.743487848Z" level=info msg="RemovePodSandbox \"f5e215a30fafe332f9552bcac1a2bf5d772ef7ccaf3b4004544dfe88d21fd77e\" returns successfully" Jan 13 20:20:27.745324 containerd[1478]: time="2025-01-13T20:20:27.745184452Z" level=info msg="StopPodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\"" Jan 13 20:20:27.745324 containerd[1478]: time="2025-01-13T20:20:27.745299932Z" level=info msg="TearDown network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" successfully" Jan 13 20:20:27.745324 containerd[1478]: time="2025-01-13T20:20:27.745310652Z" level=info msg="StopPodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" returns successfully" Jan 13 20:20:27.746752 containerd[1478]: time="2025-01-13T20:20:27.746707375Z" level=info msg="RemovePodSandbox for \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\"" Jan 13 20:20:27.746752 containerd[1478]: time="2025-01-13T20:20:27.746754895Z" level=info msg="Forcibly stopping sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\"" Jan 13 20:20:27.746848 containerd[1478]: time="2025-01-13T20:20:27.746833495Z" level=info msg="TearDown network for sandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" successfully" Jan 13 20:20:27.761836 containerd[1478]: time="2025-01-13T20:20:27.760671887Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:20:27.765618 containerd[1478]: time="2025-01-13T20:20:27.763460293Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:20:27.765618 containerd[1478]: time="2025-01-13T20:20:27.763650894Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:27.765618 containerd[1478]: time="2025-01-13T20:20:27.763780534Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:27.773298 systemd-networkd[1377]: cali269b5349490: Link UP Jan 13 20:20:27.773566 systemd-networkd[1377]: cali269b5349490: Gained carrier Jan 13 20:20:27.785105 containerd[1478]: time="2025-01-13T20:20:27.782399377Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:20:27.785105 containerd[1478]: time="2025-01-13T20:20:27.782505617Z" level=info msg="RemovePodSandbox \"b601b4116f75c17b0a47dbe9dbe4ba64ade6e965ae052e37fc3ce2080a91c321\" returns successfully" Jan 13 20:20:27.785105 containerd[1478]: time="2025-01-13T20:20:27.784869182Z" level=info msg="StopPodSandbox for \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\"" Jan 13 20:20:27.785105 containerd[1478]: time="2025-01-13T20:20:27.784999582Z" level=info msg="TearDown network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\" successfully" Jan 13 20:20:27.785105 containerd[1478]: time="2025-01-13T20:20:27.785011222Z" level=info msg="StopPodSandbox for \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\" returns successfully" Jan 13 20:20:27.786281 containerd[1478]: time="2025-01-13T20:20:27.786165225Z" level=info msg="RemovePodSandbox for \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\"" Jan 13 20:20:27.786281 containerd[1478]: 
time="2025-01-13T20:20:27.786219825Z" level=info msg="Forcibly stopping sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\"" Jan 13 20:20:27.786404 containerd[1478]: time="2025-01-13T20:20:27.786311865Z" level=info msg="TearDown network for sandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\" successfully" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.241 [INFO][4834] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.279 [INFO][4834] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0 calico-kube-controllers-d9b896c9c- calico-system 2fb62ec7-4c06-48e1-aa87-7b62ac4da84a 762 0 2025-01-13 20:20:10 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:d9b896c9c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4186-1-0-7-a3f46aeb9c calico-kube-controllers-d9b896c9c-x5cqp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali269b5349490 [] []}} ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Namespace="calico-system" Pod="calico-kube-controllers-d9b896c9c-x5cqp" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.279 [INFO][4834] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Namespace="calico-system" Pod="calico-kube-controllers-d9b896c9c-x5cqp" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" Jan 13 
20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.509 [INFO][4907] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" HandleID="k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.581 [INFO][4907] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" HandleID="k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d7b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4186-1-0-7-a3f46aeb9c", "pod":"calico-kube-controllers-d9b896c9c-x5cqp", "timestamp":"2025-01-13 20:20:27.509698875 +0000 UTC"}, Hostname:"ci-4186-1-0-7-a3f46aeb9c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.585 [INFO][4907] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.655 [INFO][4907] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.656 [INFO][4907] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-1-0-7-a3f46aeb9c' Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.666 [INFO][4907] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.680 [INFO][4907] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.713 [INFO][4907] ipam/ipam.go 489: Trying affinity for 192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.727 [INFO][4907] ipam/ipam.go 155: Attempting to load block cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.731 [INFO][4907] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.733 [INFO][4907] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.83.192/26 handle="k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.738 [INFO][4907] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89 Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.746 [INFO][4907] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.83.192/26 handle="k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.757 [INFO][4907] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.83.194/26] block=192.168.83.192/26 handle="k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.757 [INFO][4907] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.83.194/26] handle="k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.757 [INFO][4907] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:20:27.807314 containerd[1478]: 2025-01-13 20:20:27.758 [INFO][4907] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.194/26] IPv6=[] ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" HandleID="k8s-pod-network.70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" Jan 13 20:20:27.807963 containerd[1478]: 2025-01-13 20:20:27.766 [INFO][4834] cni-plugin/k8s.go 386: Populated endpoint ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Namespace="calico-system" Pod="calico-kube-controllers-d9b896c9c-x5cqp" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0", GenerateName:"calico-kube-controllers-d9b896c9c-", Namespace:"calico-system", SelfLink:"", UID:"2fb62ec7-4c06-48e1-aa87-7b62ac4da84a", ResourceVersion:"762", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 20, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d9b896c9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"", Pod:"calico-kube-controllers-d9b896c9c-x5cqp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.83.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali269b5349490", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:27.807963 containerd[1478]: 2025-01-13 20:20:27.766 [INFO][4834] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.83.194/32] ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Namespace="calico-system" Pod="calico-kube-controllers-d9b896c9c-x5cqp" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" Jan 13 20:20:27.807963 containerd[1478]: 2025-01-13 20:20:27.766 [INFO][4834] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali269b5349490 ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Namespace="calico-system" Pod="calico-kube-controllers-d9b896c9c-x5cqp" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" Jan 13 20:20:27.807963 containerd[1478]: 2025-01-13 20:20:27.772 [INFO][4834] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Namespace="calico-system" Pod="calico-kube-controllers-d9b896c9c-x5cqp" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" Jan 13 20:20:27.807963 containerd[1478]: 2025-01-13 20:20:27.772 [INFO][4834] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Namespace="calico-system" Pod="calico-kube-controllers-d9b896c9c-x5cqp" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0", GenerateName:"calico-kube-controllers-d9b896c9c-", Namespace:"calico-system", SelfLink:"", UID:"2fb62ec7-4c06-48e1-aa87-7b62ac4da84a", ResourceVersion:"762", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 20, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d9b896c9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89", Pod:"calico-kube-controllers-d9b896c9c-x5cqp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.83.194/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali269b5349490", MAC:"62:bf:58:ee:e8:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:27.807963 containerd[1478]: 2025-01-13 20:20:27.800 [INFO][4834] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89" Namespace="calico-system" Pod="calico-kube-controllers-d9b896c9c-x5cqp" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--kube--controllers--d9b896c9c--x5cqp-eth0" Jan 13 20:20:27.810984 containerd[1478]: time="2025-01-13T20:20:27.809809519Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.810984 containerd[1478]: time="2025-01-13T20:20:27.809926039Z" level=info msg="RemovePodSandbox \"2e2a0ef0dda2e103fb3453fa46e2e2c9c452e5324475b1359a9af98713b31e1b\" returns successfully" Jan 13 20:20:27.812273 containerd[1478]: time="2025-01-13T20:20:27.812114364Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" Jan 13 20:20:27.813087 containerd[1478]: time="2025-01-13T20:20:27.812249805Z" level=info msg="TearDown network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" successfully" Jan 13 20:20:27.816482 containerd[1478]: time="2025-01-13T20:20:27.816424734Z" level=info msg="StopPodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" returns successfully" Jan 13 20:20:27.823620 containerd[1478]: time="2025-01-13T20:20:27.823350950Z" level=info msg="RemovePodSandbox for \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" Jan 13 20:20:27.823620 containerd[1478]: time="2025-01-13T20:20:27.823427790Z" level=info msg="Forcibly stopping sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\"" Jan 13 20:20:27.823620 containerd[1478]: time="2025-01-13T20:20:27.823534470Z" level=info msg="TearDown network for sandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" successfully" Jan 13 20:20:27.835767 containerd[1478]: time="2025-01-13T20:20:27.835336497Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.835939 containerd[1478]: time="2025-01-13T20:20:27.835877458Z" level=info msg="RemovePodSandbox \"2954e7c034952dd9f0483c3eb85296f7c502a4a9b4cabba46b4c93551f10485a\" returns successfully" Jan 13 20:20:27.837548 containerd[1478]: time="2025-01-13T20:20:27.837497662Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" Jan 13 20:20:27.838548 containerd[1478]: time="2025-01-13T20:20:27.838500264Z" level=info msg="TearDown network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" successfully" Jan 13 20:20:27.838548 containerd[1478]: time="2025-01-13T20:20:27.838532545Z" level=info msg="StopPodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" returns successfully" Jan 13 20:20:27.840036 containerd[1478]: time="2025-01-13T20:20:27.839823107Z" level=info msg="RemovePodSandbox for \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" Jan 13 20:20:27.840036 containerd[1478]: time="2025-01-13T20:20:27.839864308Z" level=info msg="Forcibly stopping sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\"" Jan 13 20:20:27.840036 containerd[1478]: time="2025-01-13T20:20:27.839953748Z" level=info msg="TearDown network for sandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" successfully" Jan 13 20:20:27.850027 systemd[1]: Started cri-containerd-85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225.scope - libcontainer container 85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225. Jan 13 20:20:27.852406 containerd[1478]: time="2025-01-13T20:20:27.852244616Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.852544 containerd[1478]: time="2025-01-13T20:20:27.852452176Z" level=info msg="RemovePodSandbox \"3c6aa6794b733ea50d662ebeea01229df176b1a4c57419bf333eff0d50f79ba2\" returns successfully" Jan 13 20:20:27.852544 containerd[1478]: time="2025-01-13T20:20:27.853204978Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\"" Jan 13 20:20:27.852544 containerd[1478]: time="2025-01-13T20:20:27.853322778Z" level=info msg="TearDown network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" successfully" Jan 13 20:20:27.852544 containerd[1478]: time="2025-01-13T20:20:27.853333498Z" level=info msg="StopPodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" returns successfully" Jan 13 20:20:27.853929 containerd[1478]: time="2025-01-13T20:20:27.853844779Z" level=info msg="RemovePodSandbox for \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\"" Jan 13 20:20:27.853929 containerd[1478]: time="2025-01-13T20:20:27.853875259Z" level=info msg="Forcibly stopping sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\"" Jan 13 20:20:27.854093 containerd[1478]: time="2025-01-13T20:20:27.854026740Z" level=info msg="TearDown network for sandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" successfully" Jan 13 20:20:27.864015 containerd[1478]: time="2025-01-13T20:20:27.863955362Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.864129 containerd[1478]: time="2025-01-13T20:20:27.864027163Z" level=info msg="RemovePodSandbox \"4ae0625b8fad52a60b4e115bcc149c8e6747a23245d6be4194161405143ceb8c\" returns successfully" Jan 13 20:20:27.866632 containerd[1478]: time="2025-01-13T20:20:27.865341166Z" level=info msg="StopPodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\"" Jan 13 20:20:27.866632 containerd[1478]: time="2025-01-13T20:20:27.865484766Z" level=info msg="TearDown network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" successfully" Jan 13 20:20:27.866632 containerd[1478]: time="2025-01-13T20:20:27.865496406Z" level=info msg="StopPodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" returns successfully" Jan 13 20:20:27.871111 containerd[1478]: time="2025-01-13T20:20:27.870864898Z" level=info msg="RemovePodSandbox for \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\"" Jan 13 20:20:27.871111 containerd[1478]: time="2025-01-13T20:20:27.871051899Z" level=info msg="Forcibly stopping sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\"" Jan 13 20:20:27.871313 containerd[1478]: time="2025-01-13T20:20:27.871286899Z" level=info msg="TearDown network for sandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" successfully" Jan 13 20:20:27.878830 containerd[1478]: time="2025-01-13T20:20:27.877782754Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.878830 containerd[1478]: time="2025-01-13T20:20:27.877992954Z" level=info msg="RemovePodSandbox \"3747fd47d420bc31ed59436511eff5f0ee7ee55db806c4580e1210c94bad42e8\" returns successfully" Jan 13 20:20:27.879453 containerd[1478]: time="2025-01-13T20:20:27.879236277Z" level=info msg="StopPodSandbox for \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\"" Jan 13 20:20:27.879453 containerd[1478]: time="2025-01-13T20:20:27.879354518Z" level=info msg="TearDown network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\" successfully" Jan 13 20:20:27.879453 containerd[1478]: time="2025-01-13T20:20:27.879365358Z" level=info msg="StopPodSandbox for \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\" returns successfully" Jan 13 20:20:27.882283 containerd[1478]: time="2025-01-13T20:20:27.880992841Z" level=info msg="RemovePodSandbox for \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\"" Jan 13 20:20:27.882283 containerd[1478]: time="2025-01-13T20:20:27.881043801Z" level=info msg="Forcibly stopping sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\"" Jan 13 20:20:27.882283 containerd[1478]: time="2025-01-13T20:20:27.881128082Z" level=info msg="TearDown network for sandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\" successfully" Jan 13 20:20:27.882070 systemd-networkd[1377]: cali3e2816f8dad: Link UP Jan 13 20:20:27.884283 systemd-networkd[1377]: cali3e2816f8dad: Gained carrier Jan 13 20:20:27.896893 containerd[1478]: time="2025-01-13T20:20:27.896554877Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.897235 containerd[1478]: time="2025-01-13T20:20:27.897008598Z" level=info msg="RemovePodSandbox \"7d1bfa869bc4f55bd28e0ac1a2dd940f45441d92f8e611eae13efe16e0d1bf7d\" returns successfully" Jan 13 20:20:27.899721 containerd[1478]: time="2025-01-13T20:20:27.899623884Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" Jan 13 20:20:27.901619 containerd[1478]: time="2025-01-13T20:20:27.899859804Z" level=info msg="TearDown network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" successfully" Jan 13 20:20:27.901619 containerd[1478]: time="2025-01-13T20:20:27.901495808Z" level=info msg="StopPodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" returns successfully" Jan 13 20:20:27.902832 containerd[1478]: time="2025-01-13T20:20:27.902797371Z" level=info msg="RemovePodSandbox for \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" Jan 13 20:20:27.902832 containerd[1478]: time="2025-01-13T20:20:27.902839371Z" level=info msg="Forcibly stopping sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\"" Jan 13 20:20:27.903019 containerd[1478]: time="2025-01-13T20:20:27.902923051Z" level=info msg="TearDown network for sandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" successfully" Jan 13 20:20:27.911373 containerd[1478]: time="2025-01-13T20:20:27.911242990Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.914336 containerd[1478]: time="2025-01-13T20:20:27.913378355Z" level=info msg="RemovePodSandbox \"8dfd065cc9b3d96074776c958a550310ccc62230e25c8a87eb65d1b400040715\" returns successfully" Jan 13 20:20:27.915341 containerd[1478]: time="2025-01-13T20:20:27.914965559Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" Jan 13 20:20:27.915341 containerd[1478]: time="2025-01-13T20:20:27.915085759Z" level=info msg="TearDown network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" successfully" Jan 13 20:20:27.915341 containerd[1478]: time="2025-01-13T20:20:27.915098279Z" level=info msg="StopPodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" returns successfully" Jan 13 20:20:27.916194 containerd[1478]: time="2025-01-13T20:20:27.916026441Z" level=info msg="RemovePodSandbox for \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" Jan 13 20:20:27.916194 containerd[1478]: time="2025-01-13T20:20:27.916068641Z" level=info msg="Forcibly stopping sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\"" Jan 13 20:20:27.916194 containerd[1478]: time="2025-01-13T20:20:27.916140801Z" level=info msg="TearDown network for sandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" successfully" Jan 13 20:20:27.919193 containerd[1478]: time="2025-01-13T20:20:27.914903879Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:20:27.919193 containerd[1478]: time="2025-01-13T20:20:27.914965879Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:20:27.919193 containerd[1478]: time="2025-01-13T20:20:27.914980679Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:27.919193 containerd[1478]: time="2025-01-13T20:20:27.916130241Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.191 [INFO][4823] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.299 [INFO][4823] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0 coredns-7db6d8ff4d- kube-system acbb6d2d-5611-4557-91bf-b12ca46c13f5 761 0 2025-01-13 20:19:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4186-1-0-7-a3f46aeb9c coredns-7db6d8ff4d-8n6tx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3e2816f8dad [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8n6tx" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.299 [INFO][4823] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8n6tx" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.576 [INFO][4912] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" 
HandleID="k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.625 [INFO][4912] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" HandleID="k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400035e980), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4186-1-0-7-a3f46aeb9c", "pod":"coredns-7db6d8ff4d-8n6tx", "timestamp":"2025-01-13 20:20:27.575993386 +0000 UTC"}, Hostname:"ci-4186-1-0-7-a3f46aeb9c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.625 [INFO][4912] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.758 [INFO][4912] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.758 [INFO][4912] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-1-0-7-a3f46aeb9c' Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.762 [INFO][4912] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.777 [INFO][4912] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.802 [INFO][4912] ipam/ipam.go 489: Trying affinity for 192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.808 [INFO][4912] ipam/ipam.go 155: Attempting to load block cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.815 [INFO][4912] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.816 [INFO][4912] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.83.192/26 handle="k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.823 [INFO][4912] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4 Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.839 [INFO][4912] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.83.192/26 handle="k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.855 [INFO][4912] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.83.195/26] block=192.168.83.192/26 handle="k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.855 [INFO][4912] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.83.195/26] handle="k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.855 [INFO][4912] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:20:27.919193 containerd[1478]: 2025-01-13 20:20:27.855 [INFO][4912] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.195/26] IPv6=[] ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" HandleID="k8s-pod-network.7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" Jan 13 20:20:27.919896 containerd[1478]: 2025-01-13 20:20:27.862 [INFO][4823] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8n6tx" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"acbb6d2d-5611-4557-91bf-b12ca46c13f5", ResourceVersion:"761", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 19, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"", Pod:"coredns-7db6d8ff4d-8n6tx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.83.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3e2816f8dad", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:27.919896 containerd[1478]: 2025-01-13 20:20:27.863 [INFO][4823] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.83.195/32] ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8n6tx" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" Jan 13 20:20:27.919896 containerd[1478]: 2025-01-13 20:20:27.864 [INFO][4823] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3e2816f8dad ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8n6tx" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" Jan 13 20:20:27.919896 containerd[1478]: 2025-01-13 20:20:27.888 [INFO][4823] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8n6tx" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" Jan 13 20:20:27.919896 containerd[1478]: 2025-01-13 20:20:27.891 [INFO][4823] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8n6tx" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"acbb6d2d-5611-4557-91bf-b12ca46c13f5", ResourceVersion:"761", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 19, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4", Pod:"coredns-7db6d8ff4d-8n6tx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.83.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3e2816f8dad", MAC:"fe:74:2b:90:09:ff", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:27.919896 containerd[1478]: 2025-01-13 20:20:27.910 [INFO][4823] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8n6tx" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--8n6tx-eth0" Jan 13 20:20:27.933369 containerd[1478]: time="2025-01-13T20:20:27.932356038Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.933369 containerd[1478]: time="2025-01-13T20:20:27.932465719Z" level=info msg="RemovePodSandbox \"bb3a12eac33ab67e6a21613c16c8be3faf0c39db92e587e406c3f9eadb5ac587\" returns successfully" Jan 13 20:20:27.934466 containerd[1478]: time="2025-01-13T20:20:27.933721522Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\"" Jan 13 20:20:27.934466 containerd[1478]: time="2025-01-13T20:20:27.933833122Z" level=info msg="TearDown network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" successfully" Jan 13 20:20:27.934466 containerd[1478]: time="2025-01-13T20:20:27.933843322Z" level=info msg="StopPodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" returns successfully" Jan 13 20:20:27.934466 containerd[1478]: time="2025-01-13T20:20:27.934371803Z" level=info msg="RemovePodSandbox for \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\"" Jan 13 20:20:27.934466 containerd[1478]: time="2025-01-13T20:20:27.934397243Z" level=info msg="Forcibly stopping sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\"" Jan 13 20:20:27.934876 containerd[1478]: time="2025-01-13T20:20:27.934516443Z" level=info msg="TearDown network for sandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" successfully" Jan 13 20:20:27.946690 containerd[1478]: time="2025-01-13T20:20:27.946476951Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.946690 containerd[1478]: time="2025-01-13T20:20:27.946579471Z" level=info msg="RemovePodSandbox \"d51f8ce7daeedc2355ae88fa7bef0b0463a27c033968c936e9dcbc7bee9e2ec4\" returns successfully" Jan 13 20:20:27.947290 containerd[1478]: time="2025-01-13T20:20:27.947248952Z" level=info msg="StopPodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\"" Jan 13 20:20:27.947711 containerd[1478]: time="2025-01-13T20:20:27.947661833Z" level=info msg="TearDown network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" successfully" Jan 13 20:20:27.947789 containerd[1478]: time="2025-01-13T20:20:27.947729594Z" level=info msg="StopPodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" returns successfully" Jan 13 20:20:27.949070 containerd[1478]: time="2025-01-13T20:20:27.948873556Z" level=info msg="RemovePodSandbox for \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\"" Jan 13 20:20:27.949070 containerd[1478]: time="2025-01-13T20:20:27.948956116Z" level=info msg="Forcibly stopping sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\"" Jan 13 20:20:27.949242 containerd[1478]: time="2025-01-13T20:20:27.949140757Z" level=info msg="TearDown network for sandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" successfully" Jan 13 20:20:27.966527 containerd[1478]: time="2025-01-13T20:20:27.966302956Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.966527 containerd[1478]: time="2025-01-13T20:20:27.966371236Z" level=info msg="RemovePodSandbox \"316d3809648ceb6631561104cf0c22121f54c22099063b12291a5aef22320c30\" returns successfully" Jan 13 20:20:27.967822 containerd[1478]: time="2025-01-13T20:20:27.967576639Z" level=info msg="StopPodSandbox for \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\"" Jan 13 20:20:27.967822 containerd[1478]: time="2025-01-13T20:20:27.967755319Z" level=info msg="TearDown network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\" successfully" Jan 13 20:20:27.967822 containerd[1478]: time="2025-01-13T20:20:27.967767919Z" level=info msg="StopPodSandbox for \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\" returns successfully" Jan 13 20:20:27.968554 containerd[1478]: time="2025-01-13T20:20:27.968381921Z" level=info msg="RemovePodSandbox for \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\"" Jan 13 20:20:27.969617 containerd[1478]: time="2025-01-13T20:20:27.968662161Z" level=info msg="Forcibly stopping sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\"" Jan 13 20:20:27.969617 containerd[1478]: time="2025-01-13T20:20:27.968768881Z" level=info msg="TearDown network for sandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\" successfully" Jan 13 20:20:27.985826 systemd[1]: Started cri-containerd-70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89.scope - libcontainer container 70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89. Jan 13 20:20:27.992170 containerd[1478]: time="2025-01-13T20:20:27.991584494Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:27.992386 containerd[1478]: time="2025-01-13T20:20:27.992356135Z" level=info msg="RemovePodSandbox \"388cde6b902cf1a12a508ec360a91651a9fe84fb9b4b349f0008b965e5d12edb\" returns successfully" Jan 13 20:20:27.996970 containerd[1478]: time="2025-01-13T20:20:27.996929626Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" Jan 13 20:20:27.997486 containerd[1478]: time="2025-01-13T20:20:27.997458947Z" level=info msg="TearDown network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" successfully" Jan 13 20:20:27.997636 containerd[1478]: time="2025-01-13T20:20:27.997620547Z" level=info msg="StopPodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" returns successfully" Jan 13 20:20:27.998435 containerd[1478]: time="2025-01-13T20:20:27.998385589Z" level=info msg="RemovePodSandbox for \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" Jan 13 20:20:27.998759 containerd[1478]: time="2025-01-13T20:20:27.998533149Z" level=info msg="Forcibly stopping sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\"" Jan 13 20:20:27.999233 containerd[1478]: time="2025-01-13T20:20:27.999124751Z" level=info msg="TearDown network for sandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" successfully" Jan 13 20:20:28.010028 systemd-networkd[1377]: cali5059be056f4: Link UP Jan 13 20:20:28.015186 systemd-networkd[1377]: cali5059be056f4: Gained carrier Jan 13 20:20:28.028897 containerd[1478]: time="2025-01-13T20:20:28.028464737Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:28.029507 containerd[1478]: time="2025-01-13T20:20:28.029424259Z" level=info msg="RemovePodSandbox \"03a2d1abaa65c01021f8252640e9ccf119e7a282e43f635e25aea7025b7968ca\" returns successfully" Jan 13 20:20:28.034293 containerd[1478]: time="2025-01-13T20:20:28.033978269Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" Jan 13 20:20:28.034293 containerd[1478]: time="2025-01-13T20:20:28.034138710Z" level=info msg="TearDown network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" successfully" Jan 13 20:20:28.034293 containerd[1478]: time="2025-01-13T20:20:28.034149550Z" level=info msg="StopPodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" returns successfully" Jan 13 20:20:28.035613 containerd[1478]: time="2025-01-13T20:20:28.035077632Z" level=info msg="RemovePodSandbox for \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" Jan 13 20:20:28.035613 containerd[1478]: time="2025-01-13T20:20:28.035112952Z" level=info msg="Forcibly stopping sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\"" Jan 13 20:20:28.035613 containerd[1478]: time="2025-01-13T20:20:28.035188512Z" level=info msg="TearDown network for sandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" successfully" Jan 13 20:20:28.039045 containerd[1478]: time="2025-01-13T20:20:28.038011278Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:20:28.039521 containerd[1478]: time="2025-01-13T20:20:28.039315441Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:20:28.039763 containerd[1478]: time="2025-01-13T20:20:28.039652082Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:28.041884 containerd[1478]: time="2025-01-13T20:20:28.040871365Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:28.053648 containerd[1478]: time="2025-01-13T20:20:28.052146030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-kzg6k,Uid:3b18b5d2-f018-4c67-a9be-1b6f49d4b5e4,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225\"" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.352 [INFO][4877] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.446 [INFO][4877] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0 calico-apiserver-7776878d6f- calico-apiserver 189c7963-e6bf-46b5-b6d5-9d268e857385 764 0 2025-01-13 20:20:10 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7776878d6f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4186-1-0-7-a3f46aeb9c calico-apiserver-7776878d6f-zsmfk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5059be056f4 [] []}} ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-zsmfk" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.446 [INFO][4877] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-zsmfk" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.606 [INFO][4932] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" HandleID="k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.642 [INFO][4932] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" HandleID="k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003992c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4186-1-0-7-a3f46aeb9c", "pod":"calico-apiserver-7776878d6f-zsmfk", "timestamp":"2025-01-13 20:20:27.606228615 +0000 UTC"}, Hostname:"ci-4186-1-0-7-a3f46aeb9c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.642 [INFO][4932] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.856 [INFO][4932] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.856 [INFO][4932] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-1-0-7-a3f46aeb9c' Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.862 [INFO][4932] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.883 [INFO][4932] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.901 [INFO][4932] ipam/ipam.go 489: Trying affinity for 192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.913 [INFO][4932] ipam/ipam.go 155: Attempting to load block cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.921 [INFO][4932] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.922 [INFO][4932] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.83.192/26 handle="k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.928 [INFO][4932] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612 Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.946 [INFO][4932] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.83.192/26 handle="k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.972 [INFO][4932] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.83.196/26] block=192.168.83.192/26 handle="k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.972 [INFO][4932] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.83.196/26] handle="k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.972 [INFO][4932] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:20:28.060051 containerd[1478]: 2025-01-13 20:20:27.972 [INFO][4932] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.196/26] IPv6=[] ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" HandleID="k8s-pod-network.48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" Jan 13 20:20:28.060748 containerd[1478]: 2025-01-13 20:20:27.988 [INFO][4877] cni-plugin/k8s.go 386: Populated endpoint ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-zsmfk" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0", GenerateName:"calico-apiserver-7776878d6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"189c7963-e6bf-46b5-b6d5-9d268e857385", ResourceVersion:"764", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 20, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7776878d6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"", Pod:"calico-apiserver-7776878d6f-zsmfk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.83.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5059be056f4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:28.060748 containerd[1478]: 2025-01-13 20:20:27.991 [INFO][4877] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.83.196/32] ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-zsmfk" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" Jan 13 20:20:28.060748 containerd[1478]: 2025-01-13 20:20:27.991 [INFO][4877] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5059be056f4 ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-zsmfk" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" Jan 13 20:20:28.060748 containerd[1478]: 2025-01-13 20:20:28.020 [INFO][4877] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-zsmfk" 
WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" Jan 13 20:20:28.060748 containerd[1478]: 2025-01-13 20:20:28.025 [INFO][4877] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-zsmfk" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0", GenerateName:"calico-apiserver-7776878d6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"189c7963-e6bf-46b5-b6d5-9d268e857385", ResourceVersion:"764", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 20, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7776878d6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612", Pod:"calico-apiserver-7776878d6f-zsmfk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.83.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5059be056f4", MAC:"2a:0a:b2:cd:d2:47", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:28.060748 containerd[1478]: 2025-01-13 20:20:28.046 [INFO][4877] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612" Namespace="calico-apiserver" Pod="calico-apiserver-7776878d6f-zsmfk" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-calico--apiserver--7776878d6f--zsmfk-eth0" Jan 13 20:20:28.073040 containerd[1478]: time="2025-01-13T20:20:28.070815472Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 20:20:28.073296 containerd[1478]: time="2025-01-13T20:20:28.073060717Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:20:28.073422 containerd[1478]: time="2025-01-13T20:20:28.073315038Z" level=info msg="RemovePodSandbox \"43e2ec9318437e8dfcba4ef1c3073f43576c31faa75a661695d0d26c7f76a125\" returns successfully" Jan 13 20:20:28.081791 containerd[1478]: time="2025-01-13T20:20:28.081379896Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\"" Jan 13 20:20:28.082915 containerd[1478]: time="2025-01-13T20:20:28.082804419Z" level=info msg="TearDown network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" successfully" Jan 13 20:20:28.092122 containerd[1478]: time="2025-01-13T20:20:28.091948120Z" level=info msg="StopPodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" returns successfully" Jan 13 20:20:28.096536 systemd[1]: Started cri-containerd-7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4.scope - libcontainer container 7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4. 
Jan 13 20:20:28.103491 containerd[1478]: time="2025-01-13T20:20:28.101504461Z" level=info msg="RemovePodSandbox for \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\"" Jan 13 20:20:28.103491 containerd[1478]: time="2025-01-13T20:20:28.101549581Z" level=info msg="Forcibly stopping sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\"" Jan 13 20:20:28.103491 containerd[1478]: time="2025-01-13T20:20:28.101650582Z" level=info msg="TearDown network for sandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" successfully" Jan 13 20:20:28.105265 systemd-networkd[1377]: caliba7b05590b2: Link UP Jan 13 20:20:28.106317 systemd-networkd[1377]: caliba7b05590b2: Gained carrier Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.261 [INFO][4866] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.335 [INFO][4866] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0 coredns-7db6d8ff4d- kube-system ff1a3779-f857-4921-b0c5-fdad56861f50 760 0 2025-01-13 20:19:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4186-1-0-7-a3f46aeb9c coredns-7db6d8ff4d-khp7b eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliba7b05590b2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Namespace="kube-system" Pod="coredns-7db6d8ff4d-khp7b" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.336 [INFO][4866] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Namespace="kube-system" Pod="coredns-7db6d8ff4d-khp7b" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.655 [INFO][4924] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" HandleID="k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.692 [INFO][4924] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" HandleID="k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000314de0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4186-1-0-7-a3f46aeb9c", "pod":"coredns-7db6d8ff4d-khp7b", "timestamp":"2025-01-13 20:20:27.655534807 +0000 UTC"}, Hostname:"ci-4186-1-0-7-a3f46aeb9c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.694 [INFO][4924] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.973 [INFO][4924] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.973 [INFO][4924] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-1-0-7-a3f46aeb9c' Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:27.979 [INFO][4924] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.004 [INFO][4924] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.033 [INFO][4924] ipam/ipam.go 489: Trying affinity for 192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.037 [INFO][4924] ipam/ipam.go 155: Attempting to load block cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.051 [INFO][4924] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.051 [INFO][4924] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.83.192/26 handle="k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.055 [INFO][4924] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3 Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.065 [INFO][4924] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.83.192/26 handle="k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.084 [INFO][4924] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.83.197/26] block=192.168.83.192/26 handle="k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.084 [INFO][4924] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.83.197/26] handle="k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.084 [INFO][4924] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:20:28.171795 containerd[1478]: 2025-01-13 20:20:28.085 [INFO][4924] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.197/26] IPv6=[] ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" HandleID="k8s-pod-network.764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" Jan 13 20:20:28.172521 containerd[1478]: 2025-01-13 20:20:28.088 [INFO][4866] cni-plugin/k8s.go 386: Populated endpoint ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Namespace="kube-system" Pod="coredns-7db6d8ff4d-khp7b" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"ff1a3779-f857-4921-b0c5-fdad56861f50", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 19, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"", Pod:"coredns-7db6d8ff4d-khp7b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.83.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliba7b05590b2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:28.172521 containerd[1478]: 2025-01-13 20:20:28.088 [INFO][4866] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.83.197/32] ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Namespace="kube-system" Pod="coredns-7db6d8ff4d-khp7b" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" Jan 13 20:20:28.172521 containerd[1478]: 2025-01-13 20:20:28.088 [INFO][4866] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliba7b05590b2 ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Namespace="kube-system" Pod="coredns-7db6d8ff4d-khp7b" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" Jan 13 20:20:28.172521 containerd[1478]: 2025-01-13 20:20:28.107 [INFO][4866] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Namespace="kube-system" Pod="coredns-7db6d8ff4d-khp7b" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" Jan 13 20:20:28.172521 containerd[1478]: 2025-01-13 20:20:28.116 [INFO][4866] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Namespace="kube-system" Pod="coredns-7db6d8ff4d-khp7b" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"ff1a3779-f857-4921-b0c5-fdad56861f50", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 19, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3", Pod:"coredns-7db6d8ff4d-khp7b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.83.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliba7b05590b2", MAC:"2e:69:1d:f4:f4:e5", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:28.172521 containerd[1478]: 2025-01-13 20:20:28.143 [INFO][4866] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3" Namespace="kube-system" Pod="coredns-7db6d8ff4d-khp7b" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-coredns--7db6d8ff4d--khp7b-eth0" Jan 13 20:20:28.189638 containerd[1478]: time="2025-01-13T20:20:28.188814458Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:20:28.189638 containerd[1478]: time="2025-01-13T20:20:28.188894938Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:20:28.189638 containerd[1478]: time="2025-01-13T20:20:28.188910898Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:28.189638 containerd[1478]: time="2025-01-13T20:20:28.189004138Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:28.192254 containerd[1478]: time="2025-01-13T20:20:28.192180625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d9b896c9c-x5cqp,Uid:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,Namespace:calico-system,Attempt:6,} returns sandbox id \"70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89\"" Jan 13 20:20:28.216377 containerd[1478]: time="2025-01-13T20:20:28.215388078Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:20:28.219888 containerd[1478]: time="2025-01-13T20:20:28.219746448Z" level=info msg="RemovePodSandbox \"ba7d3b994fec2d9147d7151824ce0c0bf1b04d70b1cd62a124eb2308220d787f\" returns successfully" Jan 13 20:20:28.222568 containerd[1478]: time="2025-01-13T20:20:28.222523334Z" level=info msg="StopPodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\"" Jan 13 20:20:28.223073 containerd[1478]: time="2025-01-13T20:20:28.223011615Z" level=info msg="TearDown network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" successfully" Jan 13 20:20:28.223073 containerd[1478]: time="2025-01-13T20:20:28.223029775Z" level=info msg="StopPodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" returns successfully" Jan 13 20:20:28.223782 containerd[1478]: time="2025-01-13T20:20:28.223760417Z" level=info msg="RemovePodSandbox for \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\"" Jan 13 20:20:28.223919 containerd[1478]: time="2025-01-13T20:20:28.223849897Z" level=info msg="Forcibly stopping sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\"" Jan 13 20:20:28.224046 containerd[1478]: time="2025-01-13T20:20:28.224014777Z" 
level=info msg="TearDown network for sandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" successfully" Jan 13 20:20:28.228139 systemd-networkd[1377]: cali29e3b9d4c8e: Link UP Jan 13 20:20:28.229202 systemd-networkd[1377]: cali29e3b9d4c8e: Gained carrier Jan 13 20:20:28.238465 containerd[1478]: time="2025-01-13T20:20:28.238263129Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:20:28.238465 containerd[1478]: time="2025-01-13T20:20:28.238330489Z" level=info msg="RemovePodSandbox \"0abc946fef712475e7eff0dccd9ed1a913bae22227091e8fc18d043176f727b9\" returns successfully" Jan 13 20:20:28.241749 containerd[1478]: time="2025-01-13T20:20:28.240453254Z" level=info msg="StopPodSandbox for \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\"" Jan 13 20:20:28.241749 containerd[1478]: time="2025-01-13T20:20:28.240583014Z" level=info msg="TearDown network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\" successfully" Jan 13 20:20:28.241749 containerd[1478]: time="2025-01-13T20:20:28.241685177Z" level=info msg="StopPodSandbox for \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\" returns successfully" Jan 13 20:20:28.242539 containerd[1478]: time="2025-01-13T20:20:28.242509259Z" level=info msg="RemovePodSandbox for \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\"" Jan 13 20:20:28.243725 containerd[1478]: time="2025-01-13T20:20:28.243697061Z" level=info msg="Forcibly stopping sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\"" Jan 13 20:20:28.244073 containerd[1478]: time="2025-01-13T20:20:28.243929462Z" level=info msg="TearDown network for sandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\" successfully" 
Jan 13 20:20:28.261587 containerd[1478]: time="2025-01-13T20:20:28.261123861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8n6tx,Uid:acbb6d2d-5611-4557-91bf-b12ca46c13f5,Namespace:kube-system,Attempt:6,} returns sandbox id \"7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4\"" Jan 13 20:20:28.272716 containerd[1478]: time="2025-01-13T20:20:28.272661327Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:20:28.278039 containerd[1478]: time="2025-01-13T20:20:28.273957850Z" level=info msg="RemovePodSandbox \"a9a131fc547a4ae11ce4066e4f4e6cf3ce4eb80e8e9a4ce509e03e18f2b079c7\" returns successfully" Jan 13 20:20:28.299545 containerd[1478]: time="2025-01-13T20:20:28.299497027Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" Jan 13 20:20:28.301390 containerd[1478]: time="2025-01-13T20:20:28.301161911Z" level=info msg="TearDown network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" successfully" Jan 13 20:20:28.301698 containerd[1478]: time="2025-01-13T20:20:28.301485032Z" level=info msg="StopPodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" returns successfully" Jan 13 20:20:28.304224 containerd[1478]: time="2025-01-13T20:20:28.304038597Z" level=info msg="CreateContainer within sandbox \"7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 20:20:28.306743 containerd[1478]: time="2025-01-13T20:20:28.305726801Z" level=info msg="RemovePodSandbox for \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" Jan 13 20:20:28.306743 containerd[1478]: time="2025-01-13T20:20:28.305787601Z" level=info 
msg="Forcibly stopping sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\"" Jan 13 20:20:28.306743 containerd[1478]: time="2025-01-13T20:20:28.305950642Z" level=info msg="TearDown network for sandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" successfully" Jan 13 20:20:28.314405 systemd[1]: Started cri-containerd-48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612.scope - libcontainer container 48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612. Jan 13 20:20:28.321378 containerd[1478]: time="2025-01-13T20:20:28.321314436Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:20:28.321761 containerd[1478]: time="2025-01-13T20:20:28.321737837Z" level=info msg="RemovePodSandbox \"faa71ac13b659a4f2bbb3be3b4d93940bdcc94d90fa322e4f7efc6c694babfc0\" returns successfully" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:27.294 [INFO][4855] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:27.369 [INFO][4855] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0 csi-node-driver- calico-system d410432e-4da0-436d-8d3b-2586cacab46b 674 0 2025-01-13 20:20:10 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4186-1-0-7-a3f46aeb9c csi-node-driver-vhlrh eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] 
cali29e3b9d4c8e [] []}} ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Namespace="calico-system" Pod="csi-node-driver-vhlrh" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:27.370 [INFO][4855] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Namespace="calico-system" Pod="csi-node-driver-vhlrh" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:27.692 [INFO][4923] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" HandleID="k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:27.738 [INFO][4923] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" HandleID="k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000334de0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4186-1-0-7-a3f46aeb9c", "pod":"csi-node-driver-vhlrh", "timestamp":"2025-01-13 20:20:27.692196611 +0000 UTC"}, Hostname:"ci-4186-1-0-7-a3f46aeb9c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:27.738 [INFO][4923] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.085 [INFO][4923] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.092 [INFO][4923] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-1-0-7-a3f46aeb9c' Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.104 [INFO][4923] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.133 [INFO][4923] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.160 [INFO][4923] ipam/ipam.go 489: Trying affinity for 192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.165 [INFO][4923] ipam/ipam.go 155: Attempting to load block cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.177 [INFO][4923] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.83.192/26 host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.177 [INFO][4923] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.83.192/26 handle="k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.183 [INFO][4923] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.198 [INFO][4923] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.83.192/26 handle="k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" 
host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.218 [INFO][4923] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.83.198/26] block=192.168.83.192/26 handle="k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.219 [INFO][4923] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.83.198/26] handle="k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" host="ci-4186-1-0-7-a3f46aeb9c" Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.219 [INFO][4923] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 20:20:28.328036 containerd[1478]: 2025-01-13 20:20:28.219 [INFO][4923] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.198/26] IPv6=[] ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" HandleID="k8s-pod-network.17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Workload="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" Jan 13 20:20:28.328898 containerd[1478]: 2025-01-13 20:20:28.224 [INFO][4855] cni-plugin/k8s.go 386: Populated endpoint ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Namespace="calico-system" Pod="csi-node-driver-vhlrh" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d410432e-4da0-436d-8d3b-2586cacab46b", ResourceVersion:"674", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 20, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"", Pod:"csi-node-driver-vhlrh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.83.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali29e3b9d4c8e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:28.328898 containerd[1478]: 2025-01-13 20:20:28.225 [INFO][4855] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.83.198/32] ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Namespace="calico-system" Pod="csi-node-driver-vhlrh" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" Jan 13 20:20:28.328898 containerd[1478]: 2025-01-13 20:20:28.225 [INFO][4855] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali29e3b9d4c8e ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Namespace="calico-system" Pod="csi-node-driver-vhlrh" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" Jan 13 20:20:28.328898 containerd[1478]: 2025-01-13 20:20:28.229 [INFO][4855] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Namespace="calico-system" Pod="csi-node-driver-vhlrh" 
WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" Jan 13 20:20:28.328898 containerd[1478]: 2025-01-13 20:20:28.230 [INFO][4855] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Namespace="calico-system" Pod="csi-node-driver-vhlrh" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d410432e-4da0-436d-8d3b-2586cacab46b", ResourceVersion:"674", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 20, 20, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-1-0-7-a3f46aeb9c", ContainerID:"17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a", Pod:"csi-node-driver-vhlrh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.83.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali29e3b9d4c8e", MAC:"82:13:a3:cf:7a:16", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 20:20:28.328898 containerd[1478]: 2025-01-13 20:20:28.300 [INFO][4855] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a" Namespace="calico-system" Pod="csi-node-driver-vhlrh" WorkloadEndpoint="ci--4186--1--0--7--a3f46aeb9c-k8s-csi--node--driver--vhlrh-eth0" Jan 13 20:20:28.332281 containerd[1478]: time="2025-01-13T20:20:28.330662137Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" Jan 13 20:20:28.332281 containerd[1478]: time="2025-01-13T20:20:28.330800858Z" level=info msg="TearDown network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" successfully" Jan 13 20:20:28.332281 containerd[1478]: time="2025-01-13T20:20:28.330825578Z" level=info msg="StopPodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" returns successfully" Jan 13 20:20:28.343750 containerd[1478]: time="2025-01-13T20:20:28.343692847Z" level=info msg="RemovePodSandbox for \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" Jan 13 20:20:28.343878 containerd[1478]: time="2025-01-13T20:20:28.343753887Z" level=info msg="Forcibly stopping sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\"" Jan 13 20:20:28.343916 containerd[1478]: time="2025-01-13T20:20:28.343875647Z" level=info msg="TearDown network for sandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" successfully" Jan 13 20:20:28.364192 containerd[1478]: time="2025-01-13T20:20:28.362976450Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:20:28.364192 containerd[1478]: time="2025-01-13T20:20:28.363041930Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:20:28.364192 containerd[1478]: time="2025-01-13T20:20:28.363056930Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:28.364192 containerd[1478]: time="2025-01-13T20:20:28.363137530Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:28.368830 containerd[1478]: time="2025-01-13T20:20:28.368775943Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 20:20:28.368996 containerd[1478]: time="2025-01-13T20:20:28.368849223Z" level=info msg="RemovePodSandbox \"e5c2788211e9ab8990b57cd0660a9c88c203ac98a650c6f7161de5df1e9da5e0\" returns successfully" Jan 13 20:20:28.372075 containerd[1478]: time="2025-01-13T20:20:28.372032510Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\"" Jan 13 20:20:28.372519 containerd[1478]: time="2025-01-13T20:20:28.372471431Z" level=info msg="TearDown network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" successfully" Jan 13 20:20:28.373115 containerd[1478]: time="2025-01-13T20:20:28.373090153Z" level=info msg="StopPodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" returns successfully" Jan 13 20:20:28.374027 containerd[1478]: time="2025-01-13T20:20:28.374001275Z" level=info msg="RemovePodSandbox for \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\"" Jan 13 20:20:28.374239 containerd[1478]: time="2025-01-13T20:20:28.374219995Z" level=info msg="Forcibly stopping sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\"" Jan 13 
20:20:28.374878 containerd[1478]: time="2025-01-13T20:20:28.374855517Z" level=info msg="TearDown network for sandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" successfully" Jan 13 20:20:28.395051 containerd[1478]: time="2025-01-13T20:20:28.394134600Z" level=info msg="CreateContainer within sandbox \"7b2351687ccf0475a36202259bcc27dd791b2a2b1ed47c6929e36847a8a54df4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a32c32ecb91894e5bf4b2ac78f83ce5f6ac156413b44a89425b9aaee575d7071\"" Jan 13 20:20:28.399721 containerd[1478]: time="2025-01-13T20:20:28.399682413Z" level=info msg="StartContainer for \"a32c32ecb91894e5bf4b2ac78f83ce5f6ac156413b44a89425b9aaee575d7071\"" Jan 13 20:20:28.410790 containerd[1478]: time="2025-01-13T20:20:28.410739998Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:28.410911 containerd[1478]: time="2025-01-13T20:20:28.410807078Z" level=info msg="RemovePodSandbox \"e0b7cd4428a34bc9aaf932fdc670212dec7589c297fa0146d8e722342c1a4e24\" returns successfully" Jan 13 20:20:28.413958 containerd[1478]: time="2025-01-13T20:20:28.413741884Z" level=info msg="StopPodSandbox for \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\"" Jan 13 20:20:28.414581 containerd[1478]: time="2025-01-13T20:20:28.414544726Z" level=info msg="TearDown network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" successfully" Jan 13 20:20:28.414874 containerd[1478]: time="2025-01-13T20:20:28.414814607Z" level=info msg="StopPodSandbox for \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" returns successfully" Jan 13 20:20:28.415888 containerd[1478]: time="2025-01-13T20:20:28.415756329Z" level=info msg="RemovePodSandbox for \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\"" Jan 13 20:20:28.415888 containerd[1478]: time="2025-01-13T20:20:28.415790889Z" level=info msg="Forcibly stopping sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\"" Jan 13 20:20:28.416010 containerd[1478]: time="2025-01-13T20:20:28.415872409Z" level=info msg="TearDown network for sandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" successfully" Jan 13 20:20:28.427405 containerd[1478]: time="2025-01-13T20:20:28.426534313Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:28.427405 containerd[1478]: time="2025-01-13T20:20:28.427266995Z" level=info msg="RemovePodSandbox \"5f2671ee521a655919a4d477952467d801806ec2b9d43e23f9708122c8851f64\" returns successfully" Jan 13 20:20:28.430339 containerd[1478]: time="2025-01-13T20:20:28.429981961Z" level=info msg="StopPodSandbox for \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\"" Jan 13 20:20:28.430764 containerd[1478]: time="2025-01-13T20:20:28.430634362Z" level=info msg="TearDown network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\" successfully" Jan 13 20:20:28.430764 containerd[1478]: time="2025-01-13T20:20:28.430658682Z" level=info msg="StopPodSandbox for \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\" returns successfully" Jan 13 20:20:28.433337 containerd[1478]: time="2025-01-13T20:20:28.433293208Z" level=info msg="RemovePodSandbox for \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\"" Jan 13 20:20:28.433337 containerd[1478]: time="2025-01-13T20:20:28.433335728Z" level=info msg="Forcibly stopping sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\"" Jan 13 20:20:28.433552 containerd[1478]: time="2025-01-13T20:20:28.433488289Z" level=info msg="TearDown network for sandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\" successfully" Jan 13 20:20:28.454917 containerd[1478]: time="2025-01-13T20:20:28.454843577Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 20:20:28.455041 containerd[1478]: time="2025-01-13T20:20:28.454940417Z" level=info msg="RemovePodSandbox \"6f32dc52676231851531a6681d5055fcd6636fe069f6d084179e3fd316f116a4\" returns successfully" Jan 13 20:20:28.465267 containerd[1478]: time="2025-01-13T20:20:28.465128320Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 20:20:28.465267 containerd[1478]: time="2025-01-13T20:20:28.465204680Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 20:20:28.465267 containerd[1478]: time="2025-01-13T20:20:28.465220640Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:28.466480 containerd[1478]: time="2025-01-13T20:20:28.465942202Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 20:20:28.528133 systemd[1]: Started cri-containerd-764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3.scope - libcontainer container 764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3. Jan 13 20:20:28.535976 systemd[1]: Started cri-containerd-17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a.scope - libcontainer container 17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a. Jan 13 20:20:28.540992 systemd[1]: Started cri-containerd-a32c32ecb91894e5bf4b2ac78f83ce5f6ac156413b44a89425b9aaee575d7071.scope - libcontainer container a32c32ecb91894e5bf4b2ac78f83ce5f6ac156413b44a89425b9aaee575d7071. 
Jan 13 20:20:28.631928 containerd[1478]: time="2025-01-13T20:20:28.631869455Z" level=info msg="StartContainer for \"a32c32ecb91894e5bf4b2ac78f83ce5f6ac156413b44a89425b9aaee575d7071\" returns successfully"
Jan 13 20:20:28.645351 containerd[1478]: time="2025-01-13T20:20:28.645293806Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-khp7b,Uid:ff1a3779-f857-4921-b0c5-fdad56861f50,Namespace:kube-system,Attempt:6,} returns sandbox id \"764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3\""
Jan 13 20:20:28.651701 containerd[1478]: time="2025-01-13T20:20:28.651653540Z" level=info msg="CreateContainer within sandbox \"764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Jan 13 20:20:28.677692 containerd[1478]: time="2025-01-13T20:20:28.677542998Z" level=info msg="CreateContainer within sandbox \"764b0a07e30cab71243b8b4d44c97238473f285b2b8a868fd2ab9814485eaef3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b753e3814d37465d44978de392be6b088e383d0e581b421938cc00605a0c5248\""
Jan 13 20:20:28.679162 containerd[1478]: time="2025-01-13T20:20:28.679042842Z" level=info msg="StartContainer for \"b753e3814d37465d44978de392be6b088e383d0e581b421938cc00605a0c5248\""
Jan 13 20:20:28.701814 containerd[1478]: time="2025-01-13T20:20:28.701559292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vhlrh,Uid:d410432e-4da0-436d-8d3b-2586cacab46b,Namespace:calico-system,Attempt:6,} returns sandbox id \"17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a\""
Jan 13 20:20:28.766452 containerd[1478]: time="2025-01-13T20:20:28.765926197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7776878d6f-zsmfk,Uid:189c7963-e6bf-46b5-b6d5-9d268e857385,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612\""
Jan 13 20:20:28.778017 systemd[1]: Started cri-containerd-b753e3814d37465d44978de392be6b088e383d0e581b421938cc00605a0c5248.scope - libcontainer container b753e3814d37465d44978de392be6b088e383d0e581b421938cc00605a0c5248.
Jan 13 20:20:28.835899 containerd[1478]: time="2025-01-13T20:20:28.835837355Z" level=info msg="StartContainer for \"b753e3814d37465d44978de392be6b088e383d0e581b421938cc00605a0c5248\" returns successfully"
Jan 13 20:20:29.128931 systemd-networkd[1377]: cali269b5349490: Gained IPv6LL
Jan 13 20:20:29.130422 systemd-networkd[1377]: caliad7e863cac6: Gained IPv6LL
Jan 13 20:20:29.189664 kernel: bpftool[5471]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
Jan 13 20:20:29.192803 systemd-networkd[1377]: cali3e2816f8dad: Gained IPv6LL
Jan 13 20:20:29.257762 systemd-networkd[1377]: cali5059be056f4: Gained IPv6LL
Jan 13 20:20:29.273277 kubelet[2817]: I0113 20:20:29.273201 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-8n6tx" podStartSLOduration=45.273179572 podStartE2EDuration="45.273179572s" podCreationTimestamp="2025-01-13 20:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:20:29.270245565 +0000 UTC m=+62.182905457" watchObservedRunningTime="2025-01-13 20:20:29.273179572 +0000 UTC m=+62.185839424"
Jan 13 20:20:29.325829 kubelet[2817]: I0113 20:20:29.325576 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-khp7b" podStartSLOduration=45.325556168 podStartE2EDuration="45.325556168s" podCreationTimestamp="2025-01-13 20:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 20:20:29.300227792 +0000 UTC m=+62.212887644" watchObservedRunningTime="2025-01-13 20:20:29.325556168 +0000 UTC m=+62.238216100"
Jan 13 20:20:29.467552 systemd-networkd[1377]: vxlan.calico: Link UP
Jan 13 20:20:29.467558 systemd-networkd[1377]: vxlan.calico: Gained carrier
Jan 13 20:20:29.640976 systemd-networkd[1377]: cali29e3b9d4c8e: Gained IPv6LL
Jan 13 20:20:29.897704 systemd-networkd[1377]: caliba7b05590b2: Gained IPv6LL
Jan 13 20:20:30.601792 systemd-networkd[1377]: vxlan.calico: Gained IPv6LL
Jan 13 20:20:31.340951 containerd[1478]: time="2025-01-13T20:20:31.340893845Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:31.342609 containerd[1478]: time="2025-01-13T20:20:31.341719887Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=39298409"
Jan 13 20:20:31.343474 containerd[1478]: time="2025-01-13T20:20:31.343377731Z" level=info msg="ImageCreate event name:\"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:31.347681 containerd[1478]: time="2025-01-13T20:20:31.347101179Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:31.348669 containerd[1478]: time="2025-01-13T20:20:31.348627822Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"40668079\" in 3.27771739s"
Jan 13 20:20:31.348818 containerd[1478]: time="2025-01-13T20:20:31.348799502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\""
Jan 13 20:20:31.351136 containerd[1478]: time="2025-01-13T20:20:31.351101667Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\""
Jan 13 20:20:31.354115 containerd[1478]: time="2025-01-13T20:20:31.354060074Z" level=info msg="CreateContainer within sandbox \"85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jan 13 20:20:31.385139 containerd[1478]: time="2025-01-13T20:20:31.385083181Z" level=info msg="CreateContainer within sandbox \"85874591874a1af1b2cc5bebfa15d70ee2bd630dbad8ac1a254d3a06cdcb4225\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ec21098f23aa09ce8967991eb262e96bf4939e498441b7fa5996ceb8e1d0a835\""
Jan 13 20:20:31.387091 containerd[1478]: time="2025-01-13T20:20:31.386167303Z" level=info msg="StartContainer for \"ec21098f23aa09ce8967991eb262e96bf4939e498441b7fa5996ceb8e1d0a835\""
Jan 13 20:20:31.428066 systemd[1]: Started cri-containerd-ec21098f23aa09ce8967991eb262e96bf4939e498441b7fa5996ceb8e1d0a835.scope - libcontainer container ec21098f23aa09ce8967991eb262e96bf4939e498441b7fa5996ceb8e1d0a835.
Jan 13 20:20:31.479563 containerd[1478]: time="2025-01-13T20:20:31.479463426Z" level=info msg="StartContainer for \"ec21098f23aa09ce8967991eb262e96bf4939e498441b7fa5996ceb8e1d0a835\" returns successfully"
Jan 13 20:20:32.281947 kubelet[2817]: I0113 20:20:32.280756 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7776878d6f-kzg6k" podStartSLOduration=18.99917936 podStartE2EDuration="22.280731918s" podCreationTimestamp="2025-01-13 20:20:10 +0000 UTC" firstStartedPulling="2025-01-13 20:20:28.068864868 +0000 UTC m=+60.981524720" lastFinishedPulling="2025-01-13 20:20:31.350417426 +0000 UTC m=+64.263077278" observedRunningTime="2025-01-13 20:20:32.279754836 +0000 UTC m=+65.192414728" watchObservedRunningTime="2025-01-13 20:20:32.280731918 +0000 UTC m=+65.193391770"
Jan 13 20:20:33.268855 kubelet[2817]: I0113 20:20:33.268724 2817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 20:20:33.659936 containerd[1478]: time="2025-01-13T20:20:33.659773341Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:33.661754 containerd[1478]: time="2025-01-13T20:20:33.661591345Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=31953828"
Jan 13 20:20:33.664048 containerd[1478]: time="2025-01-13T20:20:33.662920068Z" level=info msg="ImageCreate event name:\"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:33.666085 containerd[1478]: time="2025-01-13T20:20:33.666041394Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:33.666746 containerd[1478]: time="2025-01-13T20:20:33.666710236Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"33323450\" in 2.314893407s"
Jan 13 20:20:33.666837 containerd[1478]: time="2025-01-13T20:20:33.666748516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\""
Jan 13 20:20:33.668850 containerd[1478]: time="2025-01-13T20:20:33.668814280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\""
Jan 13 20:20:33.697080 containerd[1478]: time="2025-01-13T20:20:33.697044180Z" level=info msg="CreateContainer within sandbox \"70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Jan 13 20:20:33.717344 containerd[1478]: time="2025-01-13T20:20:33.717241463Z" level=info msg="CreateContainer within sandbox \"70505f87acbe1969ae6eb0cb4082949fdeccc0a88b8a252600893cf16f27dd89\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0\""
Jan 13 20:20:33.718414 containerd[1478]: time="2025-01-13T20:20:33.718363705Z" level=info msg="StartContainer for \"fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0\""
Jan 13 20:20:33.760844 systemd[1]: Started cri-containerd-fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0.scope - libcontainer container fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0.
Jan 13 20:20:33.802429 containerd[1478]: time="2025-01-13T20:20:33.802327884Z" level=info msg="StartContainer for \"fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0\" returns successfully"
Jan 13 20:20:34.340028 kubelet[2817]: I0113 20:20:34.339930 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-d9b896c9c-x5cqp" podStartSLOduration=18.872161061 podStartE2EDuration="24.339907016s" podCreationTimestamp="2025-01-13 20:20:10 +0000 UTC" firstStartedPulling="2025-01-13 20:20:28.200402484 +0000 UTC m=+61.113062296" lastFinishedPulling="2025-01-13 20:20:33.668148399 +0000 UTC m=+66.580808251" observedRunningTime="2025-01-13 20:20:34.297887008 +0000 UTC m=+67.210546860" watchObservedRunningTime="2025-01-13 20:20:34.339907016 +0000 UTC m=+67.252566828"
Jan 13 20:20:35.222131 containerd[1478]: time="2025-01-13T20:20:35.222082181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:35.224440 containerd[1478]: time="2025-01-13T20:20:35.224360386Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7464730"
Jan 13 20:20:35.225740 containerd[1478]: time="2025-01-13T20:20:35.225676909Z" level=info msg="ImageCreate event name:\"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:35.231120 containerd[1478]: time="2025-01-13T20:20:35.229950117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:35.231120 containerd[1478]: time="2025-01-13T20:20:35.230978400Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"8834384\" in 1.561857399s"
Jan 13 20:20:35.231120 containerd[1478]: time="2025-01-13T20:20:35.231012040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\""
Jan 13 20:20:35.234696 containerd[1478]: time="2025-01-13T20:20:35.234652047Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\""
Jan 13 20:20:35.238953 containerd[1478]: time="2025-01-13T20:20:35.238308175Z" level=info msg="CreateContainer within sandbox \"17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Jan 13 20:20:35.267020 containerd[1478]: time="2025-01-13T20:20:35.266966034Z" level=info msg="CreateContainer within sandbox \"17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e1540588edea07ce3c35e5c51b26d129a784ff5e4144843b7259a48eb8a444ac\""
Jan 13 20:20:35.270626 containerd[1478]: time="2025-01-13T20:20:35.267807836Z" level=info msg="StartContainer for \"e1540588edea07ce3c35e5c51b26d129a784ff5e4144843b7259a48eb8a444ac\""
Jan 13 20:20:35.316867 systemd[1]: Started cri-containerd-e1540588edea07ce3c35e5c51b26d129a784ff5e4144843b7259a48eb8a444ac.scope - libcontainer container e1540588edea07ce3c35e5c51b26d129a784ff5e4144843b7259a48eb8a444ac.
Jan 13 20:20:35.353156 containerd[1478]: time="2025-01-13T20:20:35.353082253Z" level=info msg="StartContainer for \"e1540588edea07ce3c35e5c51b26d129a784ff5e4144843b7259a48eb8a444ac\" returns successfully"
Jan 13 20:20:35.603183 containerd[1478]: time="2025-01-13T20:20:35.602921651Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:35.607632 containerd[1478]: time="2025-01-13T20:20:35.605777017Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77"
Jan 13 20:20:35.608399 containerd[1478]: time="2025-01-13T20:20:35.608324582Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"40668079\" in 373.190094ms"
Jan 13 20:20:35.608561 containerd[1478]: time="2025-01-13T20:20:35.608539343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\""
Jan 13 20:20:35.611038 containerd[1478]: time="2025-01-13T20:20:35.610991148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\""
Jan 13 20:20:35.612911 containerd[1478]: time="2025-01-13T20:20:35.612872032Z" level=info msg="CreateContainer within sandbox \"48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jan 13 20:20:35.641913 containerd[1478]: time="2025-01-13T20:20:35.641842812Z" level=info msg="CreateContainer within sandbox \"48813a14fad31814ec6a082c539a14182df17ad3b336c7bbee46235c68903612\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f26624a7a798d8f8867bcd39ab1425543815cb7775ea9ab703bd855991f2c26f\""
Jan 13 20:20:35.645668 containerd[1478]: time="2025-01-13T20:20:35.642503133Z" level=info msg="StartContainer for \"f26624a7a798d8f8867bcd39ab1425543815cb7775ea9ab703bd855991f2c26f\""
Jan 13 20:20:35.679852 systemd[1]: Started cri-containerd-f26624a7a798d8f8867bcd39ab1425543815cb7775ea9ab703bd855991f2c26f.scope - libcontainer container f26624a7a798d8f8867bcd39ab1425543815cb7775ea9ab703bd855991f2c26f.
Jan 13 20:20:35.728053 containerd[1478]: time="2025-01-13T20:20:35.728013071Z" level=info msg="StartContainer for \"f26624a7a798d8f8867bcd39ab1425543815cb7775ea9ab703bd855991f2c26f\" returns successfully"
Jan 13 20:20:36.316144 kubelet[2817]: I0113 20:20:36.316065 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7776878d6f-zsmfk" podStartSLOduration=19.477588908 podStartE2EDuration="26.316044164s" podCreationTimestamp="2025-01-13 20:20:10 +0000 UTC" firstStartedPulling="2025-01-13 20:20:28.77138393 +0000 UTC m=+61.684043782" lastFinishedPulling="2025-01-13 20:20:35.609839186 +0000 UTC m=+68.522499038" observedRunningTime="2025-01-13 20:20:36.3141264 +0000 UTC m=+69.226786292" watchObservedRunningTime="2025-01-13 20:20:36.316044164 +0000 UTC m=+69.228704016"
Jan 13 20:20:37.299096 kubelet[2817]: I0113 20:20:37.299059 2817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 20:20:37.308375 containerd[1478]: time="2025-01-13T20:20:37.307639752Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:37.309716 containerd[1478]: time="2025-01-13T20:20:37.309654756Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=9883368"
Jan 13 20:20:37.312640 containerd[1478]: time="2025-01-13T20:20:37.312000521Z" level=info msg="ImageCreate event name:\"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:37.315156 containerd[1478]: time="2025-01-13T20:20:37.315112407Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 20:20:37.315780 containerd[1478]: time="2025-01-13T20:20:37.315738569Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11252974\" in 1.704703301s"
Jan 13 20:20:37.315780 containerd[1478]: time="2025-01-13T20:20:37.315778129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\""
Jan 13 20:20:37.322042 containerd[1478]: time="2025-01-13T20:20:37.321994221Z" level=info msg="CreateContainer within sandbox \"17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Jan 13 20:20:37.349914 containerd[1478]: time="2025-01-13T20:20:37.349746158Z" level=info msg="CreateContainer within sandbox \"17f941d86062cf71d4f04eb94b5b666908f7efa2ab4627ce007de96c62b8de2a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"b2be2a7342fe9d97ece35e27badc638d68b863436401028847f1d88101c92c7f\""
Jan 13 20:20:37.353730 containerd[1478]: time="2025-01-13T20:20:37.352708044Z" level=info msg="StartContainer for \"b2be2a7342fe9d97ece35e27badc638d68b863436401028847f1d88101c92c7f\""
Jan 13 20:20:37.407907 systemd[1]: Started cri-containerd-b2be2a7342fe9d97ece35e27badc638d68b863436401028847f1d88101c92c7f.scope - libcontainer container b2be2a7342fe9d97ece35e27badc638d68b863436401028847f1d88101c92c7f.
Jan 13 20:20:37.442768 containerd[1478]: time="2025-01-13T20:20:37.442712306Z" level=info msg="StartContainer for \"b2be2a7342fe9d97ece35e27badc638d68b863436401028847f1d88101c92c7f\" returns successfully"
Jan 13 20:20:38.327902 kubelet[2817]: I0113 20:20:38.327822 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-vhlrh" podStartSLOduration=19.71750547 podStartE2EDuration="28.327801137s" podCreationTimestamp="2025-01-13 20:20:10 +0000 UTC" firstStartedPulling="2025-01-13 20:20:28.708435828 +0000 UTC m=+61.621095640" lastFinishedPulling="2025-01-13 20:20:37.318731495 +0000 UTC m=+70.231391307" observedRunningTime="2025-01-13 20:20:38.326789495 +0000 UTC m=+71.239449347" watchObservedRunningTime="2025-01-13 20:20:38.327801137 +0000 UTC m=+71.240460989"
Jan 13 20:20:38.370916 kubelet[2817]: I0113 20:20:38.370881 2817 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Jan 13 20:20:38.370916 kubelet[2817]: I0113 20:20:38.370920 2817 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Jan 13 20:21:10.546657 kubelet[2817]: I0113 20:21:10.545512 2817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 20:21:11.660491 systemd[1]: Started sshd@7-138.199.153.203:22-8.221.136.6:26554.service - OpenSSH per-connection server daemon (8.221.136.6:26554).
Jan 13 20:21:14.893000 sshd[5896]: kex_exchange_identification: read: Connection reset by peer
Jan 13 20:21:14.893000 sshd[5896]: Connection reset by 8.221.136.6 port 26554
Jan 13 20:21:14.893853 systemd[1]: sshd@7-138.199.153.203:22-8.221.136.6:26554.service: Deactivated successfully.
Jan 13 20:21:15.151903 systemd[1]: Started sshd@8-138.199.153.203:22-8.221.136.6:26558.service - OpenSSH per-connection server daemon (8.221.136.6:26558).
Jan 13 20:21:16.139561 sshd[5900]: Invalid user from 8.221.136.6 port 26558
Jan 13 20:21:16.381352 sshd[5900]: Connection closed by invalid user 8.221.136.6 port 26558 [preauth]
Jan 13 20:21:16.384549 systemd[1]: sshd@8-138.199.153.203:22-8.221.136.6:26558.service: Deactivated successfully.
Jan 13 20:21:20.384205 systemd[1]: run-containerd-runc-k8s.io-fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0-runc.3wF2ab.mount: Deactivated successfully.
Jan 13 20:21:28.461803 containerd[1478]: time="2025-01-13T20:21:28.461619760Z" level=info msg="StopPodSandbox for \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\""
Jan 13 20:21:28.463060 containerd[1478]: time="2025-01-13T20:21:28.462208761Z" level=info msg="TearDown network for sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\" successfully"
Jan 13 20:21:28.463060 containerd[1478]: time="2025-01-13T20:21:28.462228721Z" level=info msg="StopPodSandbox for \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\" returns successfully"
Jan 13 20:21:28.464231 containerd[1478]: time="2025-01-13T20:21:28.464186564Z" level=info msg="RemovePodSandbox for \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\""
Jan 13 20:21:28.464331 containerd[1478]: time="2025-01-13T20:21:28.464268284Z" level=info msg="Forcibly stopping sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\""
Jan 13 20:21:28.464458 containerd[1478]: time="2025-01-13T20:21:28.464437685Z" level=info msg="TearDown network for sandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\" successfully"
Jan 13 20:21:28.469976 containerd[1478]: time="2025-01-13T20:21:28.469800853Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:21:28.469976 containerd[1478]: time="2025-01-13T20:21:28.469892453Z" level=info msg="RemovePodSandbox \"857aecf133b52543398e4f6115a3d29102a006e1dda7434790c93cc2c3a56287\" returns successfully"
Jan 13 20:21:28.471646 containerd[1478]: time="2025-01-13T20:21:28.471309895Z" level=info msg="StopPodSandbox for \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\""
Jan 13 20:21:28.471646 containerd[1478]: time="2025-01-13T20:21:28.471519095Z" level=info msg="TearDown network for sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\" successfully"
Jan 13 20:21:28.471646 containerd[1478]: time="2025-01-13T20:21:28.471537255Z" level=info msg="StopPodSandbox for \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\" returns successfully"
Jan 13 20:21:28.472165 containerd[1478]: time="2025-01-13T20:21:28.472135936Z" level=info msg="RemovePodSandbox for \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\""
Jan 13 20:21:28.472382 containerd[1478]: time="2025-01-13T20:21:28.472355656Z" level=info msg="Forcibly stopping sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\""
Jan 13 20:21:28.472661 containerd[1478]: time="2025-01-13T20:21:28.472557417Z" level=info msg="TearDown network for sandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\" successfully"
Jan 13 20:21:28.476936 containerd[1478]: time="2025-01-13T20:21:28.476678823Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:21:28.476936 containerd[1478]: time="2025-01-13T20:21:28.476797543Z" level=info msg="RemovePodSandbox \"177029645a3e01cad3bdb5289fe20954fbf97371a3fe90286f4238b70e4c3a06\" returns successfully"
Jan 13 20:21:28.477797 containerd[1478]: time="2025-01-13T20:21:28.477764905Z" level=info msg="StopPodSandbox for \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\""
Jan 13 20:21:28.477928 containerd[1478]: time="2025-01-13T20:21:28.477907105Z" level=info msg="TearDown network for sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\" successfully"
Jan 13 20:21:28.477973 containerd[1478]: time="2025-01-13T20:21:28.477926985Z" level=info msg="StopPodSandbox for \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\" returns successfully"
Jan 13 20:21:28.480224 containerd[1478]: time="2025-01-13T20:21:28.478545066Z" level=info msg="RemovePodSandbox for \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\""
Jan 13 20:21:28.480224 containerd[1478]: time="2025-01-13T20:21:28.478582346Z" level=info msg="Forcibly stopping sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\""
Jan 13 20:21:28.480224 containerd[1478]: time="2025-01-13T20:21:28.478726306Z" level=info msg="TearDown network for sandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\" successfully"
Jan 13 20:21:28.482559 containerd[1478]: time="2025-01-13T20:21:28.482385951Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:21:28.482559 containerd[1478]: time="2025-01-13T20:21:28.482465592Z" level=info msg="RemovePodSandbox \"ca2635379c68aefa299f098cb5b4101c0c66bc24c979701e0608fedbcbf34c23\" returns successfully"
Jan 13 20:21:28.483178 containerd[1478]: time="2025-01-13T20:21:28.483147153Z" level=info msg="StopPodSandbox for \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\""
Jan 13 20:21:28.483335 containerd[1478]: time="2025-01-13T20:21:28.483310193Z" level=info msg="TearDown network for sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\" successfully"
Jan 13 20:21:28.483367 containerd[1478]: time="2025-01-13T20:21:28.483335713Z" level=info msg="StopPodSandbox for \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\" returns successfully"
Jan 13 20:21:28.484061 containerd[1478]: time="2025-01-13T20:21:28.484023834Z" level=info msg="RemovePodSandbox for \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\""
Jan 13 20:21:28.484140 containerd[1478]: time="2025-01-13T20:21:28.484090274Z" level=info msg="Forcibly stopping sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\""
Jan 13 20:21:28.484239 containerd[1478]: time="2025-01-13T20:21:28.484215674Z" level=info msg="TearDown network for sandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\" successfully"
Jan 13 20:21:28.488830 containerd[1478]: time="2025-01-13T20:21:28.488775001Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:21:28.488947 containerd[1478]: time="2025-01-13T20:21:28.488864361Z" level=info msg="RemovePodSandbox \"8a64a9830240611ea094e4e3861c0dffa3b2b6e341169e410b2ca01f543e4ed8\" returns successfully"
Jan 13 20:21:28.489423 containerd[1478]: time="2025-01-13T20:21:28.489357242Z" level=info msg="StopPodSandbox for \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\""
Jan 13 20:21:28.489506 containerd[1478]: time="2025-01-13T20:21:28.489471242Z" level=info msg="TearDown network for sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\" successfully"
Jan 13 20:21:28.489506 containerd[1478]: time="2025-01-13T20:21:28.489482402Z" level=info msg="StopPodSandbox for \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\" returns successfully"
Jan 13 20:21:28.490010 containerd[1478]: time="2025-01-13T20:21:28.489951643Z" level=info msg="RemovePodSandbox for \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\""
Jan 13 20:21:28.490010 containerd[1478]: time="2025-01-13T20:21:28.489984603Z" level=info msg="Forcibly stopping sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\""
Jan 13 20:21:28.490150 containerd[1478]: time="2025-01-13T20:21:28.490126083Z" level=info msg="TearDown network for sandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\" successfully"
Jan 13 20:21:28.496628 containerd[1478]: time="2025-01-13T20:21:28.495109211Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:21:28.496628 containerd[1478]: time="2025-01-13T20:21:28.495229331Z" level=info msg="RemovePodSandbox \"80cc597c70a7f0f797b5dba54d541f7e5de22ab9de4e7282a2bd1a8bde86e276\" returns successfully"
Jan 13 20:21:28.496628 containerd[1478]: time="2025-01-13T20:21:28.495761091Z" level=info msg="StopPodSandbox for \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\""
Jan 13 20:21:28.496628 containerd[1478]: time="2025-01-13T20:21:28.495888092Z" level=info msg="TearDown network for sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\" successfully"
Jan 13 20:21:28.496628 containerd[1478]: time="2025-01-13T20:21:28.495910052Z" level=info msg="StopPodSandbox for \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\" returns successfully"
Jan 13 20:21:28.497080 containerd[1478]: time="2025-01-13T20:21:28.497022933Z" level=info msg="RemovePodSandbox for \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\""
Jan 13 20:21:28.497080 containerd[1478]: time="2025-01-13T20:21:28.497060413Z" level=info msg="Forcibly stopping sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\""
Jan 13 20:21:28.497184 containerd[1478]: time="2025-01-13T20:21:28.497156414Z" level=info msg="TearDown network for sandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\" successfully"
Jan 13 20:21:28.503998 containerd[1478]: time="2025-01-13T20:21:28.503924744Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 20:21:28.504332 containerd[1478]: time="2025-01-13T20:21:28.504300064Z" level=info msg="RemovePodSandbox \"8cf7553ff0b3aa5495eede8432cdde3a487f25e3f4e2072aaea7ac702b8a6e33\" returns successfully"
Jan 13 20:21:41.022115 systemd[1]: run-containerd-runc-k8s.io-96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388-runc.XXmReH.mount: Deactivated successfully.
Jan 13 20:22:41.027803 systemd[1]: run-containerd-runc-k8s.io-96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388-runc.X5jGCE.mount: Deactivated successfully.
Jan 13 20:22:50.373173 systemd[1]: run-containerd-runc-k8s.io-fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0-runc.ebo5Mo.mount: Deactivated successfully.
Jan 13 20:23:00.532404 update_engine[1459]: I20250113 20:23:00.530450 1459 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Jan 13 20:23:00.532404 update_engine[1459]: I20250113 20:23:00.530527 1459 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Jan 13 20:23:00.532404 update_engine[1459]: I20250113 20:23:00.530846 1459 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Jan 13 20:23:00.533031 update_engine[1459]: I20250113 20:23:00.533000 1459 omaha_request_params.cc:62] Current group set to beta
Jan 13 20:23:00.534828 update_engine[1459]: I20250113 20:23:00.534772 1459 update_attempter.cc:499] Already updated boot flags. Skipping.
Jan 13 20:23:00.534986 update_engine[1459]: I20250113 20:23:00.534968 1459 update_attempter.cc:643] Scheduling an action processor start.
Jan 13 20:23:00.535434 update_engine[1459]: I20250113 20:23:00.535041 1459 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Jan 13 20:23:00.540788 update_engine[1459]: I20250113 20:23:00.538895 1459 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Jan 13 20:23:00.540788 update_engine[1459]: I20250113 20:23:00.539022 1459 omaha_request_action.cc:271] Posting an Omaha request to disabled
Jan 13 20:23:00.540788 update_engine[1459]: I20250113 20:23:00.539031 1459 omaha_request_action.cc:272] Request:
Jan 13 20:23:00.540788 update_engine[1459]:
Jan 13 20:23:00.540788 update_engine[1459]:
Jan 13 20:23:00.540788 update_engine[1459]:
Jan 13 20:23:00.540788 update_engine[1459]:
Jan 13 20:23:00.540788 update_engine[1459]:
Jan 13 20:23:00.540788 update_engine[1459]:
Jan 13 20:23:00.540788 update_engine[1459]:
Jan 13 20:23:00.540788 update_engine[1459]:
Jan 13 20:23:00.540788 update_engine[1459]: I20250113 20:23:00.539039 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 13 20:23:00.543467 update_engine[1459]: I20250113 20:23:00.543427 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 13 20:23:00.544051 update_engine[1459]: I20250113 20:23:00.544016 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 13 20:23:00.545086 update_engine[1459]: E20250113 20:23:00.545053 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 13 20:23:00.545256 update_engine[1459]: I20250113 20:23:00.545225 1459 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Jan 13 20:23:00.545465 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Jan 13 20:23:10.443642 update_engine[1459]: I20250113 20:23:10.442950 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 13 20:23:10.443642 update_engine[1459]: I20250113 20:23:10.443254 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 13 20:23:10.443642 update_engine[1459]: I20250113 20:23:10.443543 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 13 20:23:10.444564 update_engine[1459]: E20250113 20:23:10.444412 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 13 20:23:10.444564 update_engine[1459]: I20250113 20:23:10.444501 1459 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Jan 13 20:23:20.442201 update_engine[1459]: I20250113 20:23:20.442085 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 13 20:23:20.443074 update_engine[1459]: I20250113 20:23:20.442399 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 13 20:23:20.443074 update_engine[1459]: I20250113 20:23:20.442800 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 13 20:23:20.443342 update_engine[1459]: E20250113 20:23:20.443285 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 13 20:23:20.443431 update_engine[1459]: I20250113 20:23:20.443362 1459 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Jan 13 20:23:30.437629 update_engine[1459]: I20250113 20:23:30.437485 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 13 20:23:30.438033 update_engine[1459]: I20250113 20:23:30.437991 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 13 20:23:30.438485 update_engine[1459]: I20250113 20:23:30.438340 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 13 20:23:30.438928 update_engine[1459]: E20250113 20:23:30.438831 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 13 20:23:30.439026 update_engine[1459]: I20250113 20:23:30.438930 1459 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Jan 13 20:23:30.439026 update_engine[1459]: I20250113 20:23:30.438943 1459 omaha_request_action.cc:617] Omaha request response:
Jan 13 20:23:30.439073 update_engine[1459]: E20250113 20:23:30.439057 1459 omaha_request_action.cc:636] Omaha request network transfer failed.
Jan 13 20:23:30.439095 update_engine[1459]: I20250113 20:23:30.439085 1459 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Jan 13 20:23:30.439117 update_engine[1459]: I20250113 20:23:30.439093 1459 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jan 13 20:23:30.439117 update_engine[1459]: I20250113 20:23:30.439101 1459 update_attempter.cc:306] Processing Done.
Jan 13 20:23:30.439157 update_engine[1459]: E20250113 20:23:30.439121 1459 update_attempter.cc:619] Update failed.
Jan 13 20:23:30.439157 update_engine[1459]: I20250113 20:23:30.439129 1459 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Jan 13 20:23:30.439157 update_engine[1459]: I20250113 20:23:30.439137 1459 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Jan 13 20:23:30.439157 update_engine[1459]: I20250113 20:23:30.439146 1459 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Jan 13 20:23:30.439381 update_engine[1459]: I20250113 20:23:30.439239 1459 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Jan 13 20:23:30.439381 update_engine[1459]: I20250113 20:23:30.439272 1459 omaha_request_action.cc:271] Posting an Omaha request to disabled
Jan 13 20:23:30.439381 update_engine[1459]: I20250113 20:23:30.439282 1459 omaha_request_action.cc:272] Request:
Jan 13 20:23:30.439381 update_engine[1459]:
Jan 13 20:23:30.439381 update_engine[1459]:
Jan 13 20:23:30.439381 update_engine[1459]:
Jan 13 20:23:30.439381 update_engine[1459]:
Jan 13 20:23:30.439381 update_engine[1459]:
Jan 13 20:23:30.439381 update_engine[1459]:
Jan 13 20:23:30.439381 update_engine[1459]: I20250113 20:23:30.439291 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 13 20:23:30.439580 update_engine[1459]: I20250113 20:23:30.439493 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 13 20:23:30.439837 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Jan 13 20:23:30.440121 update_engine[1459]: I20250113 20:23:30.439829 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 13 20:23:30.440543 update_engine[1459]: E20250113 20:23:30.440342 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 13 20:23:30.440543 update_engine[1459]: I20250113 20:23:30.440400 1459 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Jan 13 20:23:30.440543 update_engine[1459]: I20250113 20:23:30.440411 1459 omaha_request_action.cc:617] Omaha request response:
Jan 13 20:23:30.440543 update_engine[1459]: I20250113 20:23:30.440418 1459 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jan 13 20:23:30.440543 update_engine[1459]: I20250113 20:23:30.440423 1459 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jan 13 20:23:30.440543 update_engine[1459]: I20250113 20:23:30.440429 1459 update_attempter.cc:306] Processing Done.
Jan 13 20:23:30.440543 update_engine[1459]: I20250113 20:23:30.440434 1459 update_attempter.cc:310] Error event sent.
Jan 13 20:23:30.440543 update_engine[1459]: I20250113 20:23:30.440443 1459 update_check_scheduler.cc:74] Next update check in 47m15s
Jan 13 20:23:30.441027 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Jan 13 20:23:41.021392 systemd[1]: run-containerd-runc-k8s.io-96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388-runc.wsw4V0.mount: Deactivated successfully.
Jan 13 20:23:50.372074 systemd[1]: run-containerd-runc-k8s.io-fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0-runc.QU6CX6.mount: Deactivated successfully.
Jan 13 20:24:19.306968 systemd[1]: Started sshd@9-138.199.153.203:22-139.178.89.65:59522.service - OpenSSH per-connection server daemon (139.178.89.65:59522).
Jan 13 20:24:20.312095 sshd[6296]: Accepted publickey for core from 139.178.89.65 port 59522 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:20.322038 sshd-session[6296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:20.331677 systemd-logind[1457]: New session 8 of user core.
Jan 13 20:24:20.336828 systemd[1]: Started session-8.scope - Session 8 of User core.
Jan 13 20:24:21.099108 sshd[6298]: Connection closed by 139.178.89.65 port 59522
Jan 13 20:24:21.100974 sshd-session[6296]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:21.106815 systemd[1]: sshd@9-138.199.153.203:22-139.178.89.65:59522.service: Deactivated successfully.
Jan 13 20:24:21.112243 systemd[1]: session-8.scope: Deactivated successfully.
Jan 13 20:24:21.113529 systemd-logind[1457]: Session 8 logged out. Waiting for processes to exit.
Jan 13 20:24:21.114679 systemd-logind[1457]: Removed session 8.
Jan 13 20:24:26.274381 systemd[1]: Started sshd@10-138.199.153.203:22-139.178.89.65:60444.service - OpenSSH per-connection server daemon (139.178.89.65:60444).
Jan 13 20:24:27.261860 sshd[6330]: Accepted publickey for core from 139.178.89.65 port 60444 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:27.262449 sshd-session[6330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:27.269932 systemd-logind[1457]: New session 9 of user core.
Jan 13 20:24:27.275947 systemd[1]: Started session-9.scope - Session 9 of User core.
Jan 13 20:24:28.033435 sshd[6334]: Connection closed by 139.178.89.65 port 60444
Jan 13 20:24:28.035166 sshd-session[6330]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:28.043763 systemd[1]: sshd@10-138.199.153.203:22-139.178.89.65:60444.service: Deactivated successfully.
Jan 13 20:24:28.047026 systemd[1]: session-9.scope: Deactivated successfully.
Jan 13 20:24:28.050985 systemd-logind[1457]: Session 9 logged out. Waiting for processes to exit.
Jan 13 20:24:28.053366 systemd-logind[1457]: Removed session 9.
Jan 13 20:24:28.213039 systemd[1]: Started sshd@11-138.199.153.203:22-139.178.89.65:60452.service - OpenSSH per-connection server daemon (139.178.89.65:60452).
Jan 13 20:24:29.203814 sshd[6346]: Accepted publickey for core from 139.178.89.65 port 60452 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:29.209773 sshd-session[6346]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:29.218215 systemd-logind[1457]: New session 10 of user core.
Jan 13 20:24:29.222155 systemd[1]: Started session-10.scope - Session 10 of User core.
Jan 13 20:24:30.015090 sshd[6368]: Connection closed by 139.178.89.65 port 60452
Jan 13 20:24:30.015618 sshd-session[6346]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:30.019994 systemd[1]: sshd@11-138.199.153.203:22-139.178.89.65:60452.service: Deactivated successfully.
Jan 13 20:24:30.022243 systemd[1]: session-10.scope: Deactivated successfully.
Jan 13 20:24:30.024802 systemd-logind[1457]: Session 10 logged out. Waiting for processes to exit.
Jan 13 20:24:30.026466 systemd-logind[1457]: Removed session 10.
Jan 13 20:24:30.197185 systemd[1]: Started sshd@12-138.199.153.203:22-139.178.89.65:60458.service - OpenSSH per-connection server daemon (139.178.89.65:60458).
Jan 13 20:24:31.184670 sshd[6377]: Accepted publickey for core from 139.178.89.65 port 60458 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:31.187307 sshd-session[6377]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:31.192220 systemd-logind[1457]: New session 11 of user core.
Jan 13 20:24:31.198089 systemd[1]: Started session-11.scope - Session 11 of User core.
Jan 13 20:24:31.954053 sshd[6379]: Connection closed by 139.178.89.65 port 60458
Jan 13 20:24:31.955006 sshd-session[6377]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:31.961217 systemd[1]: sshd@12-138.199.153.203:22-139.178.89.65:60458.service: Deactivated successfully.
Jan 13 20:24:31.966554 systemd[1]: session-11.scope: Deactivated successfully.
Jan 13 20:24:31.969148 systemd-logind[1457]: Session 11 logged out. Waiting for processes to exit.
Jan 13 20:24:31.970782 systemd-logind[1457]: Removed session 11.
Jan 13 20:24:37.132851 systemd[1]: Started sshd@13-138.199.153.203:22-139.178.89.65:58870.service - OpenSSH per-connection server daemon (139.178.89.65:58870).
Jan 13 20:24:38.133351 sshd[6395]: Accepted publickey for core from 139.178.89.65 port 58870 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:38.135724 sshd-session[6395]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:38.142941 systemd-logind[1457]: New session 12 of user core.
Jan 13 20:24:38.151228 systemd[1]: Started session-12.scope - Session 12 of User core.
Jan 13 20:24:38.914931 sshd[6397]: Connection closed by 139.178.89.65 port 58870
Jan 13 20:24:38.915476 sshd-session[6395]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:38.921373 systemd[1]: sshd@13-138.199.153.203:22-139.178.89.65:58870.service: Deactivated successfully.
Jan 13 20:24:38.925244 systemd[1]: session-12.scope: Deactivated successfully.
Jan 13 20:24:38.926486 systemd-logind[1457]: Session 12 logged out. Waiting for processes to exit.
Jan 13 20:24:38.928227 systemd-logind[1457]: Removed session 12.
Jan 13 20:24:39.094041 systemd[1]: Started sshd@14-138.199.153.203:22-139.178.89.65:58878.service - OpenSSH per-connection server daemon (139.178.89.65:58878).
Jan 13 20:24:40.094205 sshd[6408]: Accepted publickey for core from 139.178.89.65 port 58878 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:40.094893 sshd-session[6408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:40.100970 systemd-logind[1457]: New session 13 of user core.
Jan 13 20:24:40.104804 systemd[1]: Started session-13.scope - Session 13 of User core.
Jan 13 20:24:41.017904 sshd[6410]: Connection closed by 139.178.89.65 port 58878
Jan 13 20:24:41.019345 sshd-session[6408]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:41.027531 systemd[1]: sshd@14-138.199.153.203:22-139.178.89.65:58878.service: Deactivated successfully.
Jan 13 20:24:41.032219 systemd[1]: session-13.scope: Deactivated successfully.
Jan 13 20:24:41.033691 systemd-logind[1457]: Session 13 logged out. Waiting for processes to exit.
Jan 13 20:24:41.034796 systemd-logind[1457]: Removed session 13.
Jan 13 20:24:41.200079 systemd[1]: Started sshd@15-138.199.153.203:22-139.178.89.65:58892.service - OpenSSH per-connection server daemon (139.178.89.65:58892).
Jan 13 20:24:42.209993 sshd[6440]: Accepted publickey for core from 139.178.89.65 port 58892 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:42.213280 sshd-session[6440]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:42.219286 systemd-logind[1457]: New session 14 of user core.
Jan 13 20:24:42.224807 systemd[1]: Started session-14.scope - Session 14 of User core.
Jan 13 20:24:44.926813 sshd[6442]: Connection closed by 139.178.89.65 port 58892
Jan 13 20:24:44.929973 sshd-session[6440]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:44.936510 systemd[1]: sshd@15-138.199.153.203:22-139.178.89.65:58892.service: Deactivated successfully.
Jan 13 20:24:44.940559 systemd[1]: session-14.scope: Deactivated successfully.
Jan 13 20:24:44.944864 systemd-logind[1457]: Session 14 logged out. Waiting for processes to exit.
Jan 13 20:24:44.946120 systemd-logind[1457]: Removed session 14.
Jan 13 20:24:45.096061 systemd[1]: Started sshd@16-138.199.153.203:22-139.178.89.65:50314.service - OpenSSH per-connection server daemon (139.178.89.65:50314).
Jan 13 20:24:46.094612 sshd[6458]: Accepted publickey for core from 139.178.89.65 port 50314 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:46.095227 sshd-session[6458]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:46.101660 systemd-logind[1457]: New session 15 of user core.
Jan 13 20:24:46.105866 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 13 20:24:46.994768 sshd[6462]: Connection closed by 139.178.89.65 port 50314
Jan 13 20:24:46.995584 sshd-session[6458]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:47.000674 systemd-logind[1457]: Session 15 logged out. Waiting for processes to exit.
Jan 13 20:24:47.000993 systemd[1]: sshd@16-138.199.153.203:22-139.178.89.65:50314.service: Deactivated successfully.
Jan 13 20:24:47.004516 systemd[1]: session-15.scope: Deactivated successfully.
Jan 13 20:24:47.007327 systemd-logind[1457]: Removed session 15.
Jan 13 20:24:47.170663 systemd[1]: Started sshd@17-138.199.153.203:22-139.178.89.65:50316.service - OpenSSH per-connection server daemon (139.178.89.65:50316).
Jan 13 20:24:48.153934 sshd[6471]: Accepted publickey for core from 139.178.89.65 port 50316 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:48.156590 sshd-session[6471]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:48.165375 systemd-logind[1457]: New session 16 of user core.
Jan 13 20:24:48.171854 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 13 20:24:48.916574 sshd[6473]: Connection closed by 139.178.89.65 port 50316
Jan 13 20:24:48.917309 sshd-session[6471]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:48.924212 systemd-logind[1457]: Session 16 logged out. Waiting for processes to exit.
Jan 13 20:24:48.926520 systemd[1]: sshd@17-138.199.153.203:22-139.178.89.65:50316.service: Deactivated successfully.
Jan 13 20:24:48.932677 systemd[1]: session-16.scope: Deactivated successfully.
Jan 13 20:24:48.935675 systemd-logind[1457]: Removed session 16.
Jan 13 20:24:54.093050 systemd[1]: Started sshd@18-138.199.153.203:22-139.178.89.65:41410.service - OpenSSH per-connection server daemon (139.178.89.65:41410).
Jan 13 20:24:55.067122 sshd[6506]: Accepted publickey for core from 139.178.89.65 port 41410 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:24:55.071338 sshd-session[6506]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:24:55.079653 systemd-logind[1457]: New session 17 of user core.
Jan 13 20:24:55.101177 systemd[1]: Started session-17.scope - Session 17 of User core.
Jan 13 20:24:55.823614 sshd[6508]: Connection closed by 139.178.89.65 port 41410
Jan 13 20:24:55.823141 sshd-session[6506]: pam_unix(sshd:session): session closed for user core
Jan 13 20:24:55.829645 systemd[1]: sshd@18-138.199.153.203:22-139.178.89.65:41410.service: Deactivated successfully.
Jan 13 20:24:55.835294 systemd[1]: session-17.scope: Deactivated successfully.
Jan 13 20:24:55.837854 systemd-logind[1457]: Session 17 logged out. Waiting for processes to exit.
Jan 13 20:24:55.839343 systemd-logind[1457]: Removed session 17.
Jan 13 20:25:00.996673 systemd[1]: Started sshd@19-138.199.153.203:22-139.178.89.65:41424.service - OpenSSH per-connection server daemon (139.178.89.65:41424).
Jan 13 20:25:01.976553 sshd[6519]: Accepted publickey for core from 139.178.89.65 port 41424 ssh2: RSA SHA256:mP9Np05W8ayjbouGSSYjPkGP0Fk3DK/yb5iC6Sb3lHc
Jan 13 20:25:01.979908 sshd-session[6519]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 20:25:01.987898 systemd-logind[1457]: New session 18 of user core.
Jan 13 20:25:01.993985 systemd[1]: Started session-18.scope - Session 18 of User core.
Jan 13 20:25:02.737370 sshd[6521]: Connection closed by 139.178.89.65 port 41424
Jan 13 20:25:02.736830 sshd-session[6519]: pam_unix(sshd:session): session closed for user core
Jan 13 20:25:02.741405 systemd[1]: sshd@19-138.199.153.203:22-139.178.89.65:41424.service: Deactivated successfully.
Jan 13 20:25:02.744506 systemd[1]: session-18.scope: Deactivated successfully.
Jan 13 20:25:02.746198 systemd-logind[1457]: Session 18 logged out. Waiting for processes to exit.
Jan 13 20:25:02.747873 systemd-logind[1457]: Removed session 18.
Jan 13 20:25:11.030168 systemd[1]: run-containerd-runc-k8s.io-96b9690d96e824ba117600c7f0a7137d27ed11f91390112f21873ab8d07e5388-runc.tSRt2o.mount: Deactivated successfully.
Jan 13 20:25:28.707535 systemd[1]: run-containerd-runc-k8s.io-fd26ebc55caa2dda30d6bed315b3e143bbdf416051bc10f91578fcd4e56291e0-runc.Tg1wLY.mount: Deactivated successfully.
Jan 13 20:25:34.609950 systemd[1]: cri-containerd-f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a.scope: Deactivated successfully.
Jan 13 20:25:34.611819 systemd[1]: cri-containerd-f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a.scope: Consumed 6.709s CPU time, 21.8M memory peak, 0B memory swap peak.
Jan 13 20:25:34.637930 containerd[1478]: time="2025-01-13T20:25:34.637032214Z" level=info msg="shim disconnected" id=f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a namespace=k8s.io
Jan 13 20:25:34.638182 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a-rootfs.mount: Deactivated successfully.
Jan 13 20:25:34.639716 containerd[1478]: time="2025-01-13T20:25:34.638700685Z" level=warning msg="cleaning up after shim disconnected" id=f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a namespace=k8s.io
Jan 13 20:25:34.639716 containerd[1478]: time="2025-01-13T20:25:34.638756446Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:25:34.736419 systemd[1]: cri-containerd-d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84.scope: Deactivated successfully.
Jan 13 20:25:34.739345 systemd[1]: cri-containerd-d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84.scope: Consumed 5.991s CPU time.
Jan 13 20:25:34.768655 containerd[1478]: time="2025-01-13T20:25:34.768567035Z" level=info msg="shim disconnected" id=d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84 namespace=k8s.io
Jan 13 20:25:34.768655 containerd[1478]: time="2025-01-13T20:25:34.768643436Z" level=warning msg="cleaning up after shim disconnected" id=d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84 namespace=k8s.io
Jan 13 20:25:34.768655 containerd[1478]: time="2025-01-13T20:25:34.768656397Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:25:34.769327 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84-rootfs.mount: Deactivated successfully.
Jan 13 20:25:34.987226 kubelet[2817]: E0113 20:25:34.987083 2817 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40346->10.0.0.2:2379: read: connection timed out"
Jan 13 20:25:35.192347 kubelet[2817]: I0113 20:25:35.192023 2817 scope.go:117] "RemoveContainer" containerID="f8ccbd0a8728367c341e17fad7f36507a333d203f9bc449f6ec6caf7a545f82a"
Jan 13 20:25:35.197129 kubelet[2817]: I0113 20:25:35.196940 2817 scope.go:117] "RemoveContainer" containerID="d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84"
Jan 13 20:25:35.197629 containerd[1478]: time="2025-01-13T20:25:35.197434647Z" level=info msg="CreateContainer within sandbox \"69ac0d500ad38637117ba17350f7fca996956941613199042e2f35cfde8e9c77\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Jan 13 20:25:35.212610 containerd[1478]: time="2025-01-13T20:25:35.212547572Z" level=info msg="CreateContainer within sandbox \"60889e28563ac19dd18f35de621660bd1465b36f720b72cec6c03b837e342729\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jan 13 20:25:35.227874 containerd[1478]: time="2025-01-13T20:25:35.227699179Z" level=info msg="CreateContainer within sandbox \"69ac0d500ad38637117ba17350f7fca996956941613199042e2f35cfde8e9c77\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"0a4188afbcd9f071428153968dde7225cde81143ed3a016e2c6abad8f007e832\""
Jan 13 20:25:35.230633 containerd[1478]: time="2025-01-13T20:25:35.228445473Z" level=info msg="StartContainer for \"0a4188afbcd9f071428153968dde7225cde81143ed3a016e2c6abad8f007e832\""
Jan 13 20:25:35.240676 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3771547606.mount: Deactivated successfully.
Jan 13 20:25:35.242504 containerd[1478]: time="2025-01-13T20:25:35.242161852Z" level=info msg="CreateContainer within sandbox \"60889e28563ac19dd18f35de621660bd1465b36f720b72cec6c03b837e342729\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab\""
Jan 13 20:25:35.243211 containerd[1478]: time="2025-01-13T20:25:35.242861865Z" level=info msg="StartContainer for \"2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab\""
Jan 13 20:25:35.273865 systemd[1]: Started cri-containerd-0a4188afbcd9f071428153968dde7225cde81143ed3a016e2c6abad8f007e832.scope - libcontainer container 0a4188afbcd9f071428153968dde7225cde81143ed3a016e2c6abad8f007e832.
Jan 13 20:25:35.284360 systemd[1]: Started cri-containerd-2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab.scope - libcontainer container 2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab.
Jan 13 20:25:35.324922 containerd[1478]: time="2025-01-13T20:25:35.324778613Z" level=info msg="StartContainer for \"0a4188afbcd9f071428153968dde7225cde81143ed3a016e2c6abad8f007e832\" returns successfully"
Jan 13 20:25:35.344181 containerd[1478]: time="2025-01-13T20:25:35.344119338Z" level=info msg="StartContainer for \"2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab\" returns successfully"
Jan 13 20:25:39.216771 kubelet[2817]: E0113 20:25:39.210169 2817 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40176->10.0.0.2:2379: read: connection timed out" event=<
Jan 13 20:25:39.216771 kubelet[2817]: &Event{ObjectMeta:{calico-kube-controllers-d9b896c9c-x5cqp.181a5a554e06cdb5 calico-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:calico-kube-controllers-d9b896c9c-x5cqp,UID:2fb62ec7-4c06-48e1-aa87-7b62ac4da84a,APIVersion:v1,ResourceVersion:752,FieldPath:spec.containers{calico-kube-controllers},},Reason:Unhealthy,Message:Liveness probe failed: Error verifying datastore: Get "https://10.96.0.1:443/apis/crd.projectcalico.org/v1/clusterinformations/default": context deadline exceeded
Jan 13 20:25:39.216771 kubelet[2817]: ,Source:EventSource{Component:kubelet,Host:ci-4186-1-0-7-a3f46aeb9c,},FirstTimestamp:2025-01-13 20:25:28.732536245 +0000 UTC m=+361.645196097,LastTimestamp:2025-01-13 20:25:28.732536245 +0000 UTC m=+361.645196097,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4186-1-0-7-a3f46aeb9c,}
Jan 13 20:25:39.216771 kubelet[2817]: >
Jan 13 20:25:39.662906 systemd[1]: cri-containerd-2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab.scope: Deactivated successfully.
Jan 13 20:25:39.692951 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab-rootfs.mount: Deactivated successfully.
Jan 13 20:25:39.701983 containerd[1478]: time="2025-01-13T20:25:39.701674491Z" level=info msg="shim disconnected" id=2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab namespace=k8s.io
Jan 13 20:25:39.701983 containerd[1478]: time="2025-01-13T20:25:39.701746892Z" level=warning msg="cleaning up after shim disconnected" id=2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab namespace=k8s.io
Jan 13 20:25:39.701983 containerd[1478]: time="2025-01-13T20:25:39.701757332Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:25:40.081295 systemd[1]: cri-containerd-e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015.scope: Deactivated successfully.
Jan 13 20:25:40.083762 systemd[1]: cri-containerd-e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015.scope: Consumed 3.113s CPU time, 18.1M memory peak, 0B memory swap peak.
Jan 13 20:25:40.110575 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015-rootfs.mount: Deactivated successfully.
Jan 13 20:25:40.111801 containerd[1478]: time="2025-01-13T20:25:40.111417863Z" level=info msg="shim disconnected" id=e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015 namespace=k8s.io
Jan 13 20:25:40.111801 containerd[1478]: time="2025-01-13T20:25:40.111492184Z" level=warning msg="cleaning up after shim disconnected" id=e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015 namespace=k8s.io
Jan 13 20:25:40.111801 containerd[1478]: time="2025-01-13T20:25:40.111514505Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 20:25:40.219625 kubelet[2817]: I0113 20:25:40.219555 2817 scope.go:117] "RemoveContainer" containerID="e83fbe58e72cd88a28efdb6097d48ab6e42dcfd649c01b81166dadf26b733015"
Jan 13 20:25:40.222571 kubelet[2817]: I0113 20:25:40.222523 2817 scope.go:117] "RemoveContainer" containerID="d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84"
Jan 13 20:25:40.222984 containerd[1478]: time="2025-01-13T20:25:40.222949144Z" level=info msg="CreateContainer within sandbox \"8d14acc280406613d063aa18dd33ab40af07eaef2250f28d026c52bc3dd230bc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jan 13 20:25:40.223462 kubelet[2817]: I0113 20:25:40.223443 2817 scope.go:117] "RemoveContainer" containerID="2ad3e65d491bb9d67d6f816fbb41da4a3ae36d3b9ed4890dae18e13627657dab"
Jan 13 20:25:40.224553 kubelet[2817]: E0113 20:25:40.224490 2817 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7bc55997bb-8xkf8_tigera-operator(165dbc9f-0adf-4a2f-b0a7-d975c523a53f)\"" pod="tigera-operator/tigera-operator-7bc55997bb-8xkf8" podUID="165dbc9f-0adf-4a2f-b0a7-d975c523a53f"
Jan 13 20:25:40.227284 containerd[1478]: time="2025-01-13T20:25:40.227128220Z" level=info msg="RemoveContainer for \"d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84\""
Jan 13 20:25:40.233528 containerd[1478]: time="2025-01-13T20:25:40.233476856Z" level=info msg="RemoveContainer for \"d853fcf395080df1cd38ab3f891cdb2f8a6ddf60a830ab56643c86fac4450f84\" returns successfully"
Jan 13 20:25:40.244492 containerd[1478]: time="2025-01-13T20:25:40.244299254Z" level=info msg="CreateContainer within sandbox \"8d14acc280406613d063aa18dd33ab40af07eaef2250f28d026c52bc3dd230bc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"4a2ba0305bf4f819a3e5a88f3728e8d3c1b29accd331a007720e0007cfa6eef2\""
Jan 13 20:25:40.245016 containerd[1478]: time="2025-01-13T20:25:40.244990427Z" level=info msg="StartContainer for \"4a2ba0305bf4f819a3e5a88f3728e8d3c1b29accd331a007720e0007cfa6eef2\""
Jan 13 20:25:40.277874 systemd[1]: Started cri-containerd-4a2ba0305bf4f819a3e5a88f3728e8d3c1b29accd331a007720e0007cfa6eef2.scope - libcontainer container 4a2ba0305bf4f819a3e5a88f3728e8d3c1b29accd331a007720e0007cfa6eef2.
Jan 13 20:25:40.321482 containerd[1478]: time="2025-01-13T20:25:40.321287383Z" level=info msg="StartContainer for \"4a2ba0305bf4f819a3e5a88f3728e8d3c1b29accd331a007720e0007cfa6eef2\" returns successfully"