May 13 23:43:10.863484 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
May 13 23:43:10.863508 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 13 22:16:18 -00 2025
May 13 23:43:10.863518 kernel: KASLR enabled
May 13 23:43:10.863524 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
May 13 23:43:10.863529 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
May 13 23:43:10.863535 kernel: random: crng init done
May 13 23:43:10.863542 kernel: secureboot: Secure boot disabled
May 13 23:43:10.863547 kernel: ACPI: Early table checksum verification disabled
May 13 23:43:10.863553 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
May 13 23:43:10.863560 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
May 13 23:43:10.863566 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863572 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863577 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863583 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863590 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863598 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863604 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863610 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863616 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:43:10.863622 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
May 13 23:43:10.863628 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
May 13 23:43:10.863633 kernel: NUMA: Failed to initialise from firmware
May 13 23:43:10.863640 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
May 13 23:43:10.863646 kernel: NUMA: NODE_DATA [mem 0x13966f800-0x139674fff]
May 13 23:43:10.863651 kernel: Zone ranges:
May 13 23:43:10.863659 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
May 13 23:43:10.863665 kernel: DMA32 empty
May 13 23:43:10.863671 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
May 13 23:43:10.863677 kernel: Movable zone start for each node
May 13 23:43:10.863683 kernel: Early memory node ranges
May 13 23:43:10.863689 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
May 13 23:43:10.863695 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
May 13 23:43:10.863701 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
May 13 23:43:10.863707 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
May 13 23:43:10.863713 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
May 13 23:43:10.863719 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
May 13 23:43:10.863725 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
May 13 23:43:10.863732 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
May 13 23:43:10.863738 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
May 13 23:43:10.863744 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
May 13 23:43:10.863753 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
May 13 23:43:10.863759 kernel: psci: probing for conduit method from ACPI.
May 13 23:43:10.863766 kernel: psci: PSCIv1.1 detected in firmware.
May 13 23:43:10.863774 kernel: psci: Using standard PSCI v0.2 function IDs
May 13 23:43:10.863780 kernel: psci: Trusted OS migration not required
May 13 23:43:10.863786 kernel: psci: SMC Calling Convention v1.1
May 13 23:43:10.863793 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
May 13 23:43:10.863799 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976
May 13 23:43:10.863806 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096
May 13 23:43:10.863812 kernel: pcpu-alloc: [0] 0 [0] 1
May 13 23:43:10.863818 kernel: Detected PIPT I-cache on CPU0
May 13 23:43:10.863825 kernel: CPU features: detected: GIC system register CPU interface
May 13 23:43:10.863831 kernel: CPU features: detected: Hardware dirty bit management
May 13 23:43:10.863839 kernel: CPU features: detected: Spectre-v4
May 13 23:43:10.863846 kernel: CPU features: detected: Spectre-BHB
May 13 23:43:10.863852 kernel: CPU features: kernel page table isolation forced ON by KASLR
May 13 23:43:10.863858 kernel: CPU features: detected: Kernel page table isolation (KPTI)
May 13 23:43:10.863865 kernel: CPU features: detected: ARM erratum 1418040
May 13 23:43:10.863871 kernel: CPU features: detected: SSBS not fully self-synchronizing
May 13 23:43:10.863877 kernel: alternatives: applying boot alternatives
May 13 23:43:10.863885 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=3174b2682629aa8ad4069807ed6fd62c10f62266ee1e150a1104f2a2fb6489b5
May 13 23:43:10.863892 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 13 23:43:10.863898 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 13 23:43:10.863905 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 13 23:43:10.863912 kernel: Fallback order for Node 0: 0
May 13 23:43:10.863919 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
May 13 23:43:10.863925 kernel: Policy zone: Normal
May 13 23:43:10.863932 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 13 23:43:10.863938 kernel: software IO TLB: area num 2.
May 13 23:43:10.863944 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
May 13 23:43:10.863951 kernel: Memory: 3883704K/4096000K available (10368K kernel code, 2186K rwdata, 8100K rodata, 38464K init, 897K bss, 212296K reserved, 0K cma-reserved)
May 13 23:43:10.863958 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
May 13 23:43:10.863964 kernel: rcu: Preemptible hierarchical RCU implementation.
May 13 23:43:10.863972 kernel: rcu: RCU event tracing is enabled.
May 13 23:43:10.863978 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
May 13 23:43:10.863985 kernel: Trampoline variant of Tasks RCU enabled.
May 13 23:43:10.863992 kernel: Tracing variant of Tasks RCU enabled.
May 13 23:43:10.863999 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 13 23:43:10.864006 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
May 13 23:43:10.864012 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
May 13 23:43:10.864018 kernel: GICv3: 256 SPIs implemented
May 13 23:43:10.864024 kernel: GICv3: 0 Extended SPIs implemented
May 13 23:43:10.864031 kernel: Root IRQ handler: gic_handle_irq
May 13 23:43:10.864037 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
May 13 23:43:10.864043 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
May 13 23:43:10.864050 kernel: ITS [mem 0x08080000-0x0809ffff]
May 13 23:43:10.864056 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
May 13 23:43:10.864136 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
May 13 23:43:10.864147 kernel: GICv3: using LPI property table @0x00000001000e0000
May 13 23:43:10.864154 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
May 13 23:43:10.864160 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 13 23:43:10.864166 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 13 23:43:10.864173 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
May 13 23:43:10.864179 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
May 13 23:43:10.864186 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
May 13 23:43:10.864192 kernel: Console: colour dummy device 80x25
May 13 23:43:10.864200 kernel: ACPI: Core revision 20230628
May 13 23:43:10.864207 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
May 13 23:43:10.864216 kernel: pid_max: default: 32768 minimum: 301
May 13 23:43:10.864223 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
May 13 23:43:10.864230 kernel: landlock: Up and running.
May 13 23:43:10.864236 kernel: SELinux: Initializing.
May 13 23:43:10.864243 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 23:43:10.864249 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 23:43:10.864256 kernel: ACPI PPTT: PPTT table found, but unable to locate core 1 (1)
May 13 23:43:10.864263 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 13 23:43:10.864270 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 13 23:43:10.864278 kernel: rcu: Hierarchical SRCU implementation.
May 13 23:43:10.864284 kernel: rcu: Max phase no-delay instances is 400.
May 13 23:43:10.864291 kernel: Platform MSI: ITS@0x8080000 domain created
May 13 23:43:10.864297 kernel: PCI/MSI: ITS@0x8080000 domain created
May 13 23:43:10.864304 kernel: Remapping and enabling EFI services.
May 13 23:43:10.864321 kernel: smp: Bringing up secondary CPUs ...
May 13 23:43:10.864329 kernel: Detected PIPT I-cache on CPU1
May 13 23:43:10.864335 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
May 13 23:43:10.864342 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
May 13 23:43:10.864351 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 13 23:43:10.864358 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
May 13 23:43:10.864370 kernel: smp: Brought up 1 node, 2 CPUs
May 13 23:43:10.864379 kernel: SMP: Total of 2 processors activated.
May 13 23:43:10.864386 kernel: CPU features: detected: 32-bit EL0 Support
May 13 23:43:10.864393 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
May 13 23:43:10.864400 kernel: CPU features: detected: Common not Private translations
May 13 23:43:10.864407 kernel: CPU features: detected: CRC32 instructions
May 13 23:43:10.864414 kernel: CPU features: detected: Enhanced Virtualization Traps
May 13 23:43:10.864421 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
May 13 23:43:10.864429 kernel: CPU features: detected: LSE atomic instructions
May 13 23:43:10.864436 kernel: CPU features: detected: Privileged Access Never
May 13 23:43:10.864443 kernel: CPU features: detected: RAS Extension Support
May 13 23:43:10.864450 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
May 13 23:43:10.864457 kernel: CPU: All CPU(s) started at EL1
May 13 23:43:10.864464 kernel: alternatives: applying system-wide alternatives
May 13 23:43:10.864471 kernel: devtmpfs: initialized
May 13 23:43:10.864479 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 13 23:43:10.864487 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
May 13 23:43:10.864493 kernel: pinctrl core: initialized pinctrl subsystem
May 13 23:43:10.864500 kernel: SMBIOS 3.0.0 present.
May 13 23:43:10.864507 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
May 13 23:43:10.864514 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 13 23:43:10.864521 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
May 13 23:43:10.864534 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
May 13 23:43:10.864542 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
May 13 23:43:10.864551 kernel: audit: initializing netlink subsys (disabled)
May 13 23:43:10.864558 kernel: audit: type=2000 audit(0.011:1): state=initialized audit_enabled=0 res=1
May 13 23:43:10.864565 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 13 23:43:10.864572 kernel: cpuidle: using governor menu
May 13 23:43:10.864579 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
May 13 23:43:10.864586 kernel: ASID allocator initialised with 32768 entries
May 13 23:43:10.864593 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 13 23:43:10.864600 kernel: Serial: AMBA PL011 UART driver
May 13 23:43:10.864607 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
May 13 23:43:10.864615 kernel: Modules: 0 pages in range for non-PLT usage
May 13 23:43:10.864622 kernel: Modules: 509232 pages in range for PLT usage
May 13 23:43:10.864629 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 13 23:43:10.864636 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
May 13 23:43:10.864643 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
May 13 23:43:10.864650 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
May 13 23:43:10.864657 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 13 23:43:10.864664 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
May 13 23:43:10.864671 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
May 13 23:43:10.864679 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
May 13 23:43:10.864686 kernel: ACPI: Added _OSI(Module Device)
May 13 23:43:10.864693 kernel: ACPI: Added _OSI(Processor Device)
May 13 23:43:10.864700 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 13 23:43:10.864707 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 13 23:43:10.864714 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 13 23:43:10.864721 kernel: ACPI: Interpreter enabled
May 13 23:43:10.864728 kernel: ACPI: Using GIC for interrupt routing
May 13 23:43:10.864734 kernel: ACPI: MCFG table detected, 1 entries
May 13 23:43:10.864743 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
May 13 23:43:10.864751 kernel: printk: console [ttyAMA0] enabled
May 13 23:43:10.864757 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 13 23:43:10.864920 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 13 23:43:10.864999 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
May 13 23:43:10.865102 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
May 13 23:43:10.865183 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
May 13 23:43:10.865255 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
May 13 23:43:10.865264 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
May 13 23:43:10.865271 kernel: PCI host bridge to bus 0000:00
May 13 23:43:10.865364 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
May 13 23:43:10.865429 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
May 13 23:43:10.865493 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
May 13 23:43:10.865554 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 13 23:43:10.865640 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
May 13 23:43:10.865722 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
May 13 23:43:10.865792 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
May 13 23:43:10.865861 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
May 13 23:43:10.865936 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.866041 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
May 13 23:43:10.866153 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.866232 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
May 13 23:43:10.866336 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.866423 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
May 13 23:43:10.866501 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.866570 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
May 13 23:43:10.866650 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.866724 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
May 13 23:43:10.866799 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.866869 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
May 13 23:43:10.866943 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.867013 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
May 13 23:43:10.867116 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.867195 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
May 13 23:43:10.867270 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
May 13 23:43:10.867359 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
May 13 23:43:10.867439 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
May 13 23:43:10.867509 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
May 13 23:43:10.867585 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
May 13 23:43:10.867661 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
May 13 23:43:10.867732 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
May 13 23:43:10.867801 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
May 13 23:43:10.867877 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
May 13 23:43:10.867947 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
May 13 23:43:10.868029 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
May 13 23:43:10.868135 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
May 13 23:43:10.868216 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
May 13 23:43:10.868297 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
May 13 23:43:10.868397 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
May 13 23:43:10.868479 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
May 13 23:43:10.868552 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff]
May 13 23:43:10.868622 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
May 13 23:43:10.868702 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
May 13 23:43:10.868773 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
May 13 23:43:10.868844 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
May 13 23:43:10.868923 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
May 13 23:43:10.868994 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
May 13 23:43:10.869141 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
May 13 23:43:10.869230 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
May 13 23:43:10.869302 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
May 13 23:43:10.869417 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
May 13 23:43:10.869486 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
May 13 23:43:10.869556 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
May 13 23:43:10.869621 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
May 13 23:43:10.869688 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
May 13 23:43:10.869762 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
May 13 23:43:10.869829 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
May 13 23:43:10.869895 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
May 13 23:43:10.869965 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
May 13 23:43:10.870031 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
May 13 23:43:10.870130 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
May 13 23:43:10.870205 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
May 13 23:43:10.870274 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
May 13 23:43:10.870386 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
May 13 23:43:10.870465 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
May 13 23:43:10.870534 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
May 13 23:43:10.871418 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
May 13 23:43:10.871534 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
May 13 23:43:10.871602 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
May 13 23:43:10.872202 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
May 13 23:43:10.872297 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
May 13 23:43:10.872390 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
May 13 23:43:10.872461 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
May 13 23:43:10.872533 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
May 13 23:43:10.872600 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
May 13 23:43:10.872667 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
May 13 23:43:10.872737 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
May 13 23:43:10.872804 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
May 13 23:43:10.872878 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
May 13 23:43:10.872947 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
May 13 23:43:10.873017 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
May 13 23:43:10.874021 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
May 13 23:43:10.874218 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
May 13 23:43:10.874290 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
May 13 23:43:10.874419 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
May 13 23:43:10.874496 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
May 13 23:43:10.874566 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
May 13 23:43:10.874634 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
May 13 23:43:10.874703 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
May 13 23:43:10.874769 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
May 13 23:43:10.874838 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
May 13 23:43:10.874910 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
May 13 23:43:10.874981 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
May 13 23:43:10.875047 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
May 13 23:43:10.877263 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
May 13 23:43:10.877410 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
May 13 23:43:10.877497 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
May 13 23:43:10.877568 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
May 13 23:43:10.877649 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
May 13 23:43:10.877716 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
May 13 23:43:10.877786 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
May 13 23:43:10.877854 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
May 13 23:43:10.877935 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
May 13 23:43:10.878001 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
May 13 23:43:10.880147 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
May 13 23:43:10.880275 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
May 13 23:43:10.880405 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
May 13 23:43:10.880492 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
May 13 23:43:10.880564 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
May 13 23:43:10.880634 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
May 13 23:43:10.880706 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
May 13 23:43:10.880774 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
May 13 23:43:10.880845 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
May 13 23:43:10.880913 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
May 13 23:43:10.880986 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
May 13 23:43:10.884721 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
May 13 23:43:10.884874 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
May 13 23:43:10.884948 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
May 13 23:43:10.885022 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
May 13 23:43:10.885118 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
May 13 23:43:10.885193 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
May 13 23:43:10.885263 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
May 13 23:43:10.885397 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
May 13 23:43:10.885478 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
May 13 23:43:10.885547 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
May 13 23:43:10.885615 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
May 13 23:43:10.885684 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
May 13 23:43:10.885765 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
May 13 23:43:10.885837 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
May 13 23:43:10.885909 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
May 13 23:43:10.885978 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
May 13 23:43:10.886046 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
May 13 23:43:10.886141 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
May 13 23:43:10.886225 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
May 13 23:43:10.886300 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
May 13 23:43:10.886416 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
May 13 23:43:10.886486 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
May 13 23:43:10.886554 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
May 13 23:43:10.886631 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
May 13 23:43:10.886701 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff]
May 13 23:43:10.886773 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
May 13 23:43:10.886841 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
May 13 23:43:10.886908 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
May 13 23:43:10.886979 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
May 13 23:43:10.887057 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
May 13 23:43:10.889209 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
May 13 23:43:10.889290 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
May 13 23:43:10.889383 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
May 13 23:43:10.889456 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
May 13 23:43:10.889524 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
May 13 23:43:10.889603 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
May 13 23:43:10.889683 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
May 13 23:43:10.889753 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
May 13 23:43:10.889825 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
May 13 23:43:10.889892 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
May 13 23:43:10.889960 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
May 13 23:43:10.890029 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
May 13 23:43:10.892194 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
May 13 23:43:10.892299 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
May 13 23:43:10.892392 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
May 13 23:43:10.892461 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
May 13 23:43:10.892530 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
May 13 23:43:10.892596 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
May 13 23:43:10.892698 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
May 13 23:43:10.892767 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
May 13 23:43:10.892836 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
May 13 23:43:10.892901 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
May 13 23:43:10.892966 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
May 13 23:43:10.893039 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
May 13 23:43:10.893208 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
May 13 23:43:10.893277 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
May 13 23:43:10.893401 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
May 13 23:43:10.893472 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
May 13 23:43:10.893552 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
May 13 23:43:10.893623 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
May 13 23:43:10.893688 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
May 13 23:43:10.893751 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
May 13 23:43:10.893823 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
May 13 23:43:10.893888 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
May 13 23:43:10.893953 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
May 13 23:43:10.894032 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
May 13 23:43:10.894119 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
May 13 23:43:10.894189 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
May 13 23:43:10.894264 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
May 13 23:43:10.894347 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
May 13 23:43:10.894412 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
May 13 23:43:10.894481 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
May 13 23:43:10.894543 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
May 13 23:43:10.894604 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
May 13 23:43:10.894671 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
May 13 23:43:10.894739 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
May 13 23:43:10.894800 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
May 13 23:43:10.894869 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
May 13 23:43:10.894934 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
May 13 23:43:10.894996 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
May 13 23:43:10.895005 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
May 13 23:43:10.895013 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
May 13 23:43:10.895023 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
May 13 23:43:10.895031 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
May 13 23:43:10.895039 kernel: iommu: Default domain type: Translated
May 13 23:43:10.895046 kernel: iommu: DMA domain TLB invalidation policy: strict mode
May 13 23:43:10.895054 kernel: efivars: Registered efivars operations
May 13 23:43:10.895061 kernel: vgaarb: loaded
May 13 23:43:10.897534 kernel: clocksource: Switched to clocksource arch_sys_counter
May 13 23:43:10.897548 kernel: VFS: Disk quotas dquot_6.6.0
May 13 23:43:10.897557 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 13 23:43:10.897571 kernel: pnp: PnP ACPI init
May 13 23:43:10.897723 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
May 13 23:43:10.897738 kernel: pnp: PnP ACPI: found 1 devices
May 13 23:43:10.897746 kernel: NET: Registered PF_INET protocol family
May 13 23:43:10.897753 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 13 23:43:10.897762 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 13 23:43:10.897769 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 13 23:43:10.897777 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 13 23:43:10.897785 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 13 23:43:10.897795 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 13 23:43:10.897803 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 23:43:10.897811 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 23:43:10.897818 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 13 23:43:10.897908 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
May 13 23:43:10.897919 kernel: PCI: CLS 0 bytes, default 64
May 13 23:43:10.897927 kernel: kvm [1]: HYP mode not available
May 13 23:43:10.897934 kernel: Initialise system trusted keyrings
May 13 23:43:10.897942 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 13 23:43:10.897951 kernel: Key type asymmetric registered
May 13 23:43:10.897959 kernel: Asymmetric key parser 'x509' registered
May 13 23:43:10.897967 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
May 13 23:43:10.897974 kernel: io scheduler mq-deadline registered
May 13 23:43:10.897981 kernel: io scheduler kyber registered
May 13 23:43:10.897988 kernel: io scheduler bfq registered
May 13 23:43:10.897997 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
May 13 23:43:10.899163 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
May 13 23:43:10.899288 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
May 13 23:43:10.899385 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.899461 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
May 13 23:43:10.899529 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
May 13 23:43:10.899594 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.899664 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52
May 13 23:43:10.899735 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52
May 13 23:43:10.899801 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.899870 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53
May 13 23:43:10.899936 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53
May 13 23:43:10.900004 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.901113 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54
May 13 23:43:10.901216 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54
May 13 23:43:10.901286 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.901407 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55
May 13 23:43:10.901483 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55
May 13 23:43:10.901552 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.901626 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56
May 13 23:43:10.901703 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56
May 13 23:43:10.901772 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.901846 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57
May 13 23:43:10.901915 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57
May 13 23:43:10.901984 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.901996 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38
May 13 23:43:10.904085 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58
May 13 23:43:10.904218 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58
May 13 23:43:10.904291 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 13 23:43:10.904302 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
May 13 23:43:10.904344 kernel: ACPI: button: Power Button [PWRB]
May 13 23:43:10.904354 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
May 13 23:43:10.904447 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
May 13 23:43:10.904527 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
May 13 23:43:10.904543 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 13 23:43:10.904551 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
May 13 23:43:10.904625 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001)
May 13 23:43:10.904635 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A
May 13 23:43:10.904643 kernel: thunder_xcv, ver 1.0
May 13 23:43:10.904650 kernel: thunder_bgx, ver 1.0
May 13 23:43:10.904657 kernel: nicpf, ver 1.0
May 13 23:43:10.904665 kernel: nicvf, ver 1.0
May 13 23:43:10.904752 kernel: rtc-efi rtc-efi.0: registered as rtc0
May 13 23:43:10.904824 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-13T23:43:10 UTC (1747179790)
May 13 23:43:10.904833 kernel: hid: raw HID events driver (C) Jiri Kosina
May 13 23:43:10.904841 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available
May 13 23:43:10.904848 kernel: watchdog: Delayed init of the lockup detector failed: -19
May 13 23:43:10.904856 kernel: watchdog: Hard watchdog permanently disabled
May 13 23:43:10.904864 kernel: NET: Registered PF_INET6 protocol family
May 13 23:43:10.904871 kernel: Segment Routing with IPv6
May 13 23:43:10.904879 kernel: In-situ OAM (IOAM) with IPv6
May 13 23:43:10.904888 kernel: NET: Registered PF_PACKET protocol family
May 13 23:43:10.904896 kernel: Key type dns_resolver registered
May 13 23:43:10.904903 kernel: registered taskstats version 1
May 13 23:43:10.904911 kernel: Loading compiled-in X.509 certificates
May 13 23:43:10.904918 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 568a15bbab977599d8f910f319ba50c03c8a57bd'
May 13 23:43:10.904925 kernel: Key type .fscrypt registered
May 13 23:43:10.904933 kernel: Key type fscrypt-provisioning registered
May 13 23:43:10.904940 kernel: ima: No TPM chip found, activating TPM-bypass!
May 13 23:43:10.904947 kernel: ima: Allocated hash algorithm: sha1
May 13 23:43:10.904957 kernel: ima: No architecture policies found
May 13 23:43:10.904964 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
May 13 23:43:10.904972 kernel: clk: Disabling unused clocks
May 13 23:43:10.904979 kernel: Freeing unused kernel memory: 38464K
May 13 23:43:10.904988 kernel: Run /init as init process
May 13 23:43:10.904997 kernel: with arguments:
May 13 23:43:10.905006 kernel: /init
May 13 23:43:10.905014 kernel: with environment:
May 13 23:43:10.905021 kernel: HOME=/
May 13 23:43:10.905030 kernel: TERM=linux
May 13 23:43:10.905040 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 13 23:43:10.905049 systemd[1]: Successfully made /usr/ read-only.
May 13 23:43:10.905062 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 13 23:43:10.905085 systemd[1]: Detected virtualization kvm.
May 13 23:43:10.905096 systemd[1]: Detected architecture arm64.
May 13 23:43:10.905104 systemd[1]: Running in initrd.
May 13 23:43:10.905116 systemd[1]: No hostname configured, using default hostname.
May 13 23:43:10.905127 systemd[1]: Hostname set to <localhost>.
May 13 23:43:10.905136 systemd[1]: Initializing machine ID from VM UUID.
May 13 23:43:10.905145 systemd[1]: Queued start job for default target initrd.target.
May 13 23:43:10.905154 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:43:10.905162 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:43:10.905173 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 13 23:43:10.905182 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 23:43:10.905193 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 13 23:43:10.905203 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 13 23:43:10.905214 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 13 23:43:10.905226 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 13 23:43:10.905236 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:43:10.905245 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 23:43:10.905253 systemd[1]: Reached target paths.target - Path Units.
May 13 23:43:10.905262 systemd[1]: Reached target slices.target - Slice Units.
May 13 23:43:10.905270 systemd[1]: Reached target swap.target - Swaps.
May 13 23:43:10.905278 systemd[1]: Reached target timers.target - Timer Units.
May 13 23:43:10.905289 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 13 23:43:10.905298 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 23:43:10.905317 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 13 23:43:10.905327 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 13 23:43:10.905335 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:43:10.905344 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 23:43:10.905357 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:43:10.905366 systemd[1]: Reached target sockets.target - Socket Units.
May 13 23:43:10.905376 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 13 23:43:10.905384 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 23:43:10.905392 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 13 23:43:10.905402 systemd[1]: Starting systemd-fsck-usr.service...
May 13 23:43:10.905411 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 23:43:10.905421 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 23:43:10.905431 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:43:10.905471 systemd-journald[237]: Collecting audit messages is disabled.
May 13 23:43:10.905493 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 13 23:43:10.905503 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:43:10.905515 systemd[1]: Finished systemd-fsck-usr.service.
May 13 23:43:10.905525 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 13 23:43:10.905535 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 13 23:43:10.905544 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 23:43:10.905554 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:43:10.905565 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:43:10.905575 systemd-journald[237]: Journal started
May 13 23:43:10.905595 systemd-journald[237]: Runtime Journal (/run/log/journal/c3fb02349eba47f3952a38800ffa7061) is 8M, max 76.6M, 68.6M free.
May 13 23:43:10.896302 systemd-modules-load[239]: Inserted module 'overlay'
May 13 23:43:10.910118 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 23:43:10.916093 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 13 23:43:10.919200 kernel: Bridge firewalling registered
May 13 23:43:10.919257 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 23:43:10.919402 systemd-modules-load[239]: Inserted module 'br_netfilter'
May 13 23:43:10.921575 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 23:43:10.922824 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:43:10.934281 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 23:43:10.943305 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:43:10.947383 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 13 23:43:10.951117 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:43:10.960099 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 23:43:10.963262 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 13 23:43:10.973424 dracut-cmdline[269]: dracut-dracut-053
May 13 23:43:10.978171 dracut-cmdline[269]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=3174b2682629aa8ad4069807ed6fd62c10f62266ee1e150a1104f2a2fb6489b5
May 13 23:43:11.007141 systemd-resolved[274]: Positive Trust Anchors:
May 13 23:43:11.007831 systemd-resolved[274]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 13 23:43:11.007867 systemd-resolved[274]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 13 23:43:11.017780 systemd-resolved[274]: Defaulting to hostname 'linux'.
May 13 23:43:11.018848 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 13 23:43:11.019546 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 13 23:43:11.071095 kernel: SCSI subsystem initialized
May 13 23:43:11.075097 kernel: Loading iSCSI transport class v2.0-870.
May 13 23:43:11.083098 kernel: iscsi: registered transport (tcp)
May 13 23:43:11.096184 kernel: iscsi: registered transport (qla4xxx)
May 13 23:43:11.096271 kernel: QLogic iSCSI HBA Driver
May 13 23:43:11.143594 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 13 23:43:11.145516 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 13 23:43:11.173407 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 13 23:43:11.173476 kernel: device-mapper: uevent: version 1.0.3
May 13 23:43:11.174260 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
May 13 23:43:11.225735 kernel: raid6: neonx8 gen() 11932 MB/s
May 13 23:43:11.242111 kernel: raid6: neonx4 gen() 14430 MB/s
May 13 23:43:11.259121 kernel: raid6: neonx2 gen() 13164 MB/s
May 13 23:43:11.276137 kernel: raid6: neonx1 gen() 9757 MB/s
May 13 23:43:11.293117 kernel: raid6: int64x8 gen() 5742 MB/s
May 13 23:43:11.310142 kernel: raid6: int64x4 gen() 6695 MB/s
May 13 23:43:11.327131 kernel: raid6: int64x2 gen() 5935 MB/s
May 13 23:43:11.344136 kernel: raid6: int64x1 gen() 4810 MB/s
May 13 23:43:11.344224 kernel: raid6: using algorithm neonx4 gen() 14430 MB/s
May 13 23:43:11.361157 kernel: raid6: .... xor() 11887 MB/s, rmw enabled
May 13 23:43:11.361249 kernel: raid6: using neon recovery algorithm
May 13 23:43:11.366281 kernel: xor: measuring software checksum speed
May 13 23:43:11.366409 kernel: 8regs : 21613 MB/sec
May 13 23:43:11.366429 kernel: 32regs : 21704 MB/sec
May 13 23:43:11.366444 kernel: arm64_neon : 26665 MB/sec
May 13 23:43:11.367742 kernel: xor: using function: arm64_neon (26665 MB/sec)
May 13 23:43:11.419129 kernel: Btrfs loaded, zoned=no, fsverity=no
May 13 23:43:11.433663 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 13 23:43:11.436986 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 23:43:11.464016 systemd-udevd[456]: Using default interface naming scheme 'v255'.
May 13 23:43:11.468168 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 23:43:11.473624 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 13 23:43:11.502412 dracut-pre-trigger[466]: rd.md=0: removing MD RAID activation
May 13 23:43:11.538136 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 23:43:11.541605 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 23:43:11.610595 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:43:11.614156 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 13 23:43:11.639600 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 13 23:43:11.642618 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 23:43:11.643950 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:43:11.645382 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 23:43:11.650678 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 13 23:43:11.674289 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 13 23:43:11.708480 kernel: scsi host0: Virtio SCSI HBA
May 13 23:43:11.714267 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
May 13 23:43:11.714385 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
May 13 23:43:11.748122 kernel: ACPI: bus type USB registered
May 13 23:43:11.748173 kernel: usbcore: registered new interface driver usbfs
May 13 23:43:11.753386 kernel: usbcore: registered new interface driver hub
May 13 23:43:11.754083 kernel: usbcore: registered new device driver usb
May 13 23:43:11.759540 kernel: sr 0:0:0:0: Power-on or device reset occurred
May 13 23:43:11.759423 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 13 23:43:11.759625 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:43:11.762832 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:43:11.766404 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
May 13 23:43:11.766583 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
May 13 23:43:11.763476 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:43:11.763624 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:43:11.765541 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:43:11.772154 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
May 13 23:43:11.768291 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:43:11.793327 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
May 13 23:43:11.793550 kernel: sd 0:0:0:1: Power-on or device reset occurred
May 13 23:43:11.793673 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
May 13 23:43:11.793761 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
May 13 23:43:11.795120 kernel: sd 0:0:0:1: [sda] Write Protect is off
May 13 23:43:11.795332 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
May 13 23:43:11.795435 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
May 13 23:43:11.797047 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:43:11.800017 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
May 13 23:43:11.800273 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
May 13 23:43:11.800471 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
May 13 23:43:11.800575 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
May 13 23:43:11.801395 kernel: hub 1-0:1.0: USB hub found
May 13 23:43:11.802087 kernel: hub 1-0:1.0: 4 ports detected
May 13 23:43:11.804611 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
May 13 23:43:11.804627 kernel: GPT:17805311 != 80003071
May 13 23:43:11.804637 kernel: GPT:Alternate GPT header not at the end of the disk.
May 13 23:43:11.804646 kernel: GPT:17805311 != 80003071
May 13 23:43:11.804655 kernel: GPT: Use GNU Parted to correct GPT errors.
May 13 23:43:11.804672 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 13 23:43:11.804681 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
May 13 23:43:11.802298 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:43:11.809332 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
May 13 23:43:11.809528 kernel: hub 2-0:1.0: USB hub found May 13 23:43:11.809633 kernel: hub 2-0:1.0: 4 ports detected May 13 23:43:11.831417 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:43:11.861095 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (524) May 13 23:43:11.871127 kernel: BTRFS: device fsid ee830c17-a93d-4109-bd12-3fec8ef6763d devid 1 transid 41 /dev/sda3 scanned by (udev-worker) (512) May 13 23:43:11.879909 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. May 13 23:43:11.893138 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. May 13 23:43:11.901794 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 13 23:43:11.916263 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. May 13 23:43:11.916937 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. May 13 23:43:11.919936 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 13 23:43:11.942025 disk-uuid[575]: Primary Header is updated. May 13 23:43:11.942025 disk-uuid[575]: Secondary Entries is updated. May 13 23:43:11.942025 disk-uuid[575]: Secondary Header is updated. May 13 23:43:11.949122 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:43:12.046146 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd May 13 23:43:12.180091 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 May 13 23:43:12.181134 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 May 13 23:43:12.181470 kernel: usbcore: registered new interface driver usbhid May 13 23:43:12.182103 kernel: usbhid: USB HID core driver May 13 23:43:12.287123 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd May 13 23:43:12.418105 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 May 13 23:43:12.471535 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 May 13 23:43:12.961956 disk-uuid[576]: The operation has completed successfully. May 13 23:43:12.962730 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:43:13.024976 systemd[1]: disk-uuid.service: Deactivated successfully. May 13 23:43:13.025112 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 13 23:43:13.049053 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 13 23:43:13.068109 sh[590]: Success May 13 23:43:13.083092 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" May 13 23:43:13.144215 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 13 23:43:13.150218 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 13 23:43:13.156484 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 13 23:43:13.177106 kernel: BTRFS info (device dm-0): first mount of filesystem ee830c17-a93d-4109-bd12-3fec8ef6763d May 13 23:43:13.177194 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm May 13 23:43:13.177216 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 13 23:43:13.179389 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 13 23:43:13.179418 kernel: BTRFS info (device dm-0): using free space tree May 13 23:43:13.186088 kernel: BTRFS info (device dm-0): enabling ssd optimizations May 13 23:43:13.189121 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 13 23:43:13.191527 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 13 23:43:13.193064 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 13 23:43:13.196237 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 13 23:43:13.227268 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:43:13.227345 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 13 23:43:13.228104 kernel: BTRFS info (device sda6): using free space tree May 13 23:43:13.233084 kernel: BTRFS info (device sda6): enabling ssd optimizations May 13 23:43:13.233204 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:43:13.238137 kernel: BTRFS info (device sda6): last unmount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:43:13.239562 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 13 23:43:13.243806 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 13 23:43:13.315662 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 23:43:13.319319 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 23:43:13.355288 ignition[681]: Ignition 2.20.0 May 13 23:43:13.355849 ignition[681]: Stage: fetch-offline May 13 23:43:13.355887 ignition[681]: no configs at "/usr/lib/ignition/base.d" May 13 23:43:13.356216 systemd-networkd[770]: lo: Link UP May 13 23:43:13.355895 ignition[681]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:43:13.356220 systemd-networkd[770]: lo: Gained carrier May 13 23:43:13.356575 ignition[681]: parsed url from cmdline: "" May 13 23:43:13.358365 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 13 23:43:13.356578 ignition[681]: no config URL provided May 13 23:43:13.360865 systemd-networkd[770]: Enumeration completed May 13 23:43:13.356585 ignition[681]: reading system config file "/usr/lib/ignition/user.ign" May 13 23:43:13.361445 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 23:43:13.356595 ignition[681]: no config at "/usr/lib/ignition/user.ign" May 13 23:43:13.361458 systemd-networkd[770]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:43:13.356601 ignition[681]: failed to fetch config: resource requires networking May 13 23:43:13.361462 systemd-networkd[770]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 13 23:43:13.356783 ignition[681]: Ignition finished successfully May 13 23:43:13.362088 systemd-networkd[770]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:43:13.362092 systemd-networkd[770]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:43:13.362606 systemd-networkd[770]: eth0: Link UP May 13 23:43:13.362609 systemd-networkd[770]: eth0: Gained carrier May 13 23:43:13.362616 systemd-networkd[770]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:43:13.364435 systemd[1]: Reached target network.target - Network. May 13 23:43:13.368451 systemd-networkd[770]: eth1: Link UP May 13 23:43:13.368456 systemd-networkd[770]: eth1: Gained carrier May 13 23:43:13.368469 systemd-networkd[770]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:43:13.369788 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 13 23:43:13.393883 systemd-networkd[770]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 23:43:13.395948 ignition[777]: Ignition 2.20.0 May 13 23:43:13.395959 ignition[777]: Stage: fetch May 13 23:43:13.396225 ignition[777]: no configs at "/usr/lib/ignition/base.d" May 13 23:43:13.396235 ignition[777]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:43:13.396349 ignition[777]: parsed url from cmdline: "" May 13 23:43:13.396353 ignition[777]: no config URL provided May 13 23:43:13.396361 ignition[777]: reading system config file "/usr/lib/ignition/user.ign" May 13 23:43:13.396369 ignition[777]: no config at "/usr/lib/ignition/user.ign" May 13 23:43:13.396457 ignition[777]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 May 13 23:43:13.397292 ignition[777]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable May 13 23:43:13.491208 systemd-networkd[770]: eth0: DHCPv4 address 138.199.236.81/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 13 23:43:13.597533 ignition[777]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 May 13 23:43:13.604908 ignition[777]: GET result: OK May 13 23:43:13.605098 ignition[777]: parsing config with SHA512: db8c667124b52388aac48339a990ad1c28e1943766096a40cfd741a1aa18b26e3095efa30815d5103fb9acfb16ea692d1aa7d3caefc608999c832c06d2a9d1cc May 13 23:43:13.613868 unknown[777]: fetched base config from "system" May 13 23:43:13.613881 unknown[777]: fetched base config from "system" May 13 23:43:13.614400 ignition[777]: fetch: fetch complete May 13 23:43:13.613888 unknown[777]: fetched user config from "hetzner" May 13 23:43:13.614406 ignition[777]: fetch: fetch passed May 13 23:43:13.617100 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 13 23:43:13.614483 ignition[777]: Ignition finished successfully May 13 23:43:13.620216 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 13 23:43:13.646328 ignition[785]: Ignition 2.20.0 May 13 23:43:13.646342 ignition[785]: Stage: kargs May 13 23:43:13.646573 ignition[785]: no configs at "/usr/lib/ignition/base.d" May 13 23:43:13.646585 ignition[785]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:43:13.650300 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
May 13 23:43:13.648569 ignition[785]: kargs: kargs passed May 13 23:43:13.648645 ignition[785]: Ignition finished successfully May 13 23:43:13.652209 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 13 23:43:13.682809 ignition[792]: Ignition 2.20.0 May 13 23:43:13.682827 ignition[792]: Stage: disks May 13 23:43:13.682998 ignition[792]: no configs at "/usr/lib/ignition/base.d" May 13 23:43:13.683008 ignition[792]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:43:13.683926 ignition[792]: disks: disks passed May 13 23:43:13.683973 ignition[792]: Ignition finished successfully May 13 23:43:13.686479 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 13 23:43:13.687575 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 13 23:43:13.688382 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 13 23:43:13.689513 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 23:43:13.690615 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:43:13.691267 systemd[1]: Reached target basic.target - Basic System. May 13 23:43:13.694229 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 13 23:43:13.720099 systemd-fsck[801]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks May 13 23:43:13.724718 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 13 23:43:13.732276 systemd[1]: Mounting sysroot.mount - /sysroot... May 13 23:43:13.795169 kernel: EXT4-fs (sda9): mounted filesystem 9f8d74e6-c079-469f-823a-18a62077a2c7 r/w with ordered data mode. Quota mode: none. May 13 23:43:13.796436 systemd[1]: Mounted sysroot.mount - /sysroot. May 13 23:43:13.797995 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 13 23:43:13.800926 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:43:13.805200 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 13 23:43:13.811405 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 13 23:43:13.818269 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 13 23:43:13.818333 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 13 23:43:13.821150 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 13 23:43:13.825249 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (809) May 13 23:43:13.827212 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:43:13.827256 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 13 23:43:13.827267 kernel: BTRFS info (device sda6): using free space tree May 13 23:43:13.827851 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 13 23:43:13.836632 kernel: BTRFS info (device sda6): enabling ssd optimizations May 13 23:43:13.836696 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:43:13.841853 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 23:43:13.882967 initrd-setup-root[839]: cut: /sysroot/etc/passwd: No such file or directory May 13 23:43:13.885101 coreos-metadata[811]: May 13 23:43:13.884 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 May 13 23:43:13.889121 coreos-metadata[811]: May 13 23:43:13.887 INFO Fetch successful May 13 23:43:13.889121 coreos-metadata[811]: May 13 23:43:13.888 INFO wrote hostname ci-4284-0-0-n-732e99817a to /sysroot/etc/hostname May 13 23:43:13.892194 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 13 23:43:13.897417 initrd-setup-root[847]: cut: /sysroot/etc/group: No such file or directory May 13 23:43:13.902114 initrd-setup-root[854]: cut: /sysroot/etc/shadow: No such file or directory May 13 23:43:13.907227 initrd-setup-root[861]: cut: /sysroot/etc/gshadow: No such file or directory May 13 23:43:14.009056 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 13 23:43:14.011209 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 13 23:43:14.014296 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 13 23:43:14.035111 kernel: BTRFS info (device sda6): last unmount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:43:14.054519 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 13 23:43:14.064995 ignition[930]: INFO : Ignition 2.20.0 May 13 23:43:14.064995 ignition[930]: INFO : Stage: mount May 13 23:43:14.064995 ignition[930]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:43:14.064995 ignition[930]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:43:14.068803 ignition[930]: INFO : mount: mount passed May 13 23:43:14.068803 ignition[930]: INFO : Ignition finished successfully May 13 23:43:14.070021 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 13 23:43:14.073194 systemd[1]: Starting ignition-files.service - Ignition (files)... May 13 23:43:14.176117 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 13 23:43:14.178969 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:43:14.201139 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (940) May 13 23:43:14.203225 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:43:14.203283 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 13 23:43:14.203324 kernel: BTRFS info (device sda6): using free space tree May 13 23:43:14.208116 kernel: BTRFS info (device sda6): enabling ssd optimizations May 13 23:43:14.208189 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:43:14.210676 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 23:43:14.243193 ignition[957]: INFO : Ignition 2.20.0 May 13 23:43:14.243193 ignition[957]: INFO : Stage: files May 13 23:43:14.245569 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:43:14.245569 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:43:14.245569 ignition[957]: DEBUG : files: compiled without relabeling support, skipping May 13 23:43:14.250351 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 13 23:43:14.250351 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 13 23:43:14.250351 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 13 23:43:14.252867 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 13 23:43:14.253847 unknown[957]: wrote ssh authorized keys file for user: core May 13 23:43:14.254704 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 13 23:43:14.258526 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" May 13 23:43:14.258526 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 May 13 23:43:14.392881 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 13 23:43:15.130703 systemd-networkd[770]: eth1: Gained IPv6LL May 13 23:43:15.194325 systemd-networkd[770]: eth0: Gained IPv6LL May 13 23:43:15.329167 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" May 13 23:43:15.331150 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" May 13 23:43:15.344432 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-arm64.raw: attempt #1 May 13 23:43:15.895592 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 13 23:43:17.049520 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" May 13 23:43:17.049520 ignition[957]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 13 23:43:17.055765 ignition[957]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:43:17.055765 ignition[957]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:43:17.055765 ignition[957]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 13 23:43:17.055765 ignition[957]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 13 23:43:17.055765 ignition[957]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 13 23:43:17.055765 ignition[957]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 13 23:43:17.055765 ignition[957]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 13 23:43:17.055765 ignition[957]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" May 13 23:43:17.055765 ignition[957]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" May 13 23:43:17.055765 ignition[957]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" May 13 23:43:17.055765 ignition[957]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" May 13 23:43:17.055765 ignition[957]: INFO : files: files passed May 13 23:43:17.055765 ignition[957]: INFO : Ignition finished successfully May 13 23:43:17.055116 systemd[1]: Finished ignition-files.service - Ignition (files). May 13 23:43:17.058219 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 13 23:43:17.066276 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 13 23:43:17.079362 systemd[1]: ignition-quench.service: Deactivated successfully. 
May 13 23:43:17.087889 initrd-setup-root-after-ignition[985]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 23:43:17.087889 initrd-setup-root-after-ignition[985]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 13 23:43:17.079463 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 13 23:43:17.092720 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 23:43:17.087093 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 23:43:17.088805 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 13 23:43:17.093738 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 13 23:43:17.152534 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 13 23:43:17.152697 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 13 23:43:17.154391 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 13 23:43:17.155874 systemd[1]: Reached target initrd.target - Initrd Default Target. May 13 23:43:17.157257 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 13 23:43:17.158218 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 13 23:43:17.187258 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 23:43:17.189625 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 13 23:43:17.207527 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 13 23:43:17.208288 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 23:43:17.209949 systemd[1]: Stopped target timers.target - Timer Units. May 13 23:43:17.211931 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 13 23:43:17.212076 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 23:43:17.214319 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 13 23:43:17.214894 systemd[1]: Stopped target basic.target - Basic System. May 13 23:43:17.216190 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 13 23:43:17.217257 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 13 23:43:17.218015 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 13 23:43:17.218987 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 13 23:43:17.220062 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 13 23:43:17.221318 systemd[1]: Stopped target sysinit.target - System Initialization. May 13 23:43:17.222818 systemd[1]: Stopped target local-fs.target - Local File Systems. May 13 23:43:17.223972 systemd[1]: Stopped target swap.target - Swaps. May 13 23:43:17.224996 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 13 23:43:17.225135 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 13 23:43:17.226529 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 13 23:43:17.227153 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 23:43:17.228138 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
May 13 23:43:17.229152 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 23:43:17.229831 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 13 23:43:17.229945 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 13 23:43:17.231510 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 13 23:43:17.231625 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 23:43:17.232884 systemd[1]: ignition-files.service: Deactivated successfully. May 13 23:43:17.232974 systemd[1]: Stopped ignition-files.service - Ignition (files). May 13 23:43:17.234058 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 13 23:43:17.234170 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 13 23:43:17.237804 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 13 23:43:17.238363 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 13 23:43:17.238490 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 13 23:43:17.242337 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 13 23:43:17.246641 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 13 23:43:17.246794 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:43:17.247637 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 13 23:43:17.247738 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 13 23:43:17.253948 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 13 23:43:17.254846 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 13 23:43:17.267603 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 13 23:43:17.270500 ignition[1010]: INFO : Ignition 2.20.0 May 13 23:43:17.270500 ignition[1010]: INFO : Stage: umount May 13 23:43:17.271600 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:43:17.271600 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:43:17.276192 ignition[1010]: INFO : umount: umount passed May 13 23:43:17.276192 ignition[1010]: INFO : Ignition finished successfully May 13 23:43:17.275528 systemd[1]: ignition-mount.service: Deactivated successfully. May 13 23:43:17.275647 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 13 23:43:17.276465 systemd[1]: ignition-disks.service: Deactivated successfully. May 13 23:43:17.276516 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 13 23:43:17.277853 systemd[1]: ignition-kargs.service: Deactivated successfully. May 13 23:43:17.277892 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 13 23:43:17.283328 systemd[1]: ignition-fetch.service: Deactivated successfully. May 13 23:43:17.283393 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 13 23:43:17.285586 systemd[1]: Stopped target network.target - Network. May 13 23:43:17.287938 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 13 23:43:17.288023 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 13 23:43:17.290327 systemd[1]: Stopped target paths.target - Path Units. May 13 23:43:17.294553 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. 
May 13 23:43:17.299686 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 23:43:17.301174 systemd[1]: Stopped target slices.target - Slice Units. May 13 23:43:17.302210 systemd[1]: Stopped target sockets.target - Socket Units. May 13 23:43:17.303686 systemd[1]: iscsid.socket: Deactivated successfully. May 13 23:43:17.303730 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 13 23:43:17.305168 systemd[1]: iscsiuio.socket: Deactivated successfully. May 13 23:43:17.305199 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 23:43:17.306915 systemd[1]: ignition-setup.service: Deactivated successfully. May 13 23:43:17.306968 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 13 23:43:17.309158 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 13 23:43:17.309200 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 13 23:43:17.310901 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 13 23:43:17.311882 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 13 23:43:17.317688 systemd[1]: systemd-resolved.service: Deactivated successfully. May 13 23:43:17.318143 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 13 23:43:17.325804 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 13 23:43:17.327839 systemd[1]: sysroot-boot.service: Deactivated successfully. May 13 23:43:17.328013 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 13 23:43:17.329149 systemd[1]: systemd-networkd.service: Deactivated successfully. May 13 23:43:17.329242 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 13 23:43:17.331515 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 13 23:43:17.332534 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 13 23:43:17.332643 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 13 23:43:17.334594 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 13 23:43:17.334666 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 13 23:43:17.336799 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 13 23:43:17.337389 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 13 23:43:17.337447 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 23:43:17.338195 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 13 23:43:17.338323 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 13 23:43:17.340647 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 13 23:43:17.340690 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 13 23:43:17.341730 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 13 23:43:17.341807 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 23:43:17.347927 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:43:17.351715 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 13 23:43:17.351798 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. 
May 13 23:43:17.366217 systemd[1]: systemd-udevd.service: Deactivated successfully. May 13 23:43:17.366469 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:43:17.369025 systemd[1]: network-cleanup.service: Deactivated successfully. May 13 23:43:17.370120 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 13 23:43:17.371903 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 13 23:43:17.371984 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 13 23:43:17.372701 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 13 23:43:17.372732 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 13 23:43:17.373760 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 13 23:43:17.373812 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 13 23:43:17.375566 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 13 23:43:17.375620 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 13 23:43:17.377099 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 23:43:17.377155 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:43:17.380225 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 13 23:43:17.381031 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 13 23:43:17.381112 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:43:17.384867 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:43:17.384927 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:43:17.388023 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 13 23:43:17.388118 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 13 23:43:17.396490 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 13 23:43:17.396689 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 13 23:43:17.398768 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 13 23:43:17.401634 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 13 23:43:17.428349 systemd[1]: Switching root. May 13 23:43:17.472559 systemd-journald[237]: Journal stopped May 13 23:43:18.409352 systemd-journald[237]: Received SIGTERM from PID 1 (systemd). May 13 23:43:18.409417 kernel: SELinux: policy capability network_peer_controls=1 May 13 23:43:18.409429 kernel: SELinux: policy capability open_perms=1 May 13 23:43:18.409445 kernel: SELinux: policy capability extended_socket_class=1 May 13 23:43:18.409454 kernel: SELinux: policy capability always_check_network=0 May 13 23:43:18.409463 kernel: SELinux: policy capability cgroup_seclabel=1 May 13 23:43:18.409472 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 13 23:43:18.409489 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 13 23:43:18.409501 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 13 23:43:18.409510 kernel: audit: type=1403 audit(1747179797.580:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 13 23:43:18.409520 systemd[1]: Successfully loaded SELinux policy in 36.101ms. May 13 23:43:18.409542 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.082ms. 
May 13 23:43:18.409553 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 23:43:18.409564 systemd[1]: Detected virtualization kvm. May 13 23:43:18.409574 systemd[1]: Detected architecture arm64. May 13 23:43:18.409584 systemd[1]: Detected first boot. May 13 23:43:18.409594 systemd[1]: Hostname set to <ci-4284-0-0-n-732e99817a>. May 13 23:43:18.409604 systemd[1]: Initializing machine ID from VM UUID. May 13 23:43:18.409614 zram_generator::config[1054]: No configuration found. May 13 23:43:18.409625 kernel: NET: Registered PF_VSOCK protocol family May 13 23:43:18.409635 systemd[1]: Populated /etc with preset unit settings. May 13 23:43:18.409646 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 13 23:43:18.409664 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 13 23:43:18.409678 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 13 23:43:18.409688 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 13 23:43:18.409698 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 13 23:43:18.409712 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 13 23:43:18.409723 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 13 23:43:18.409734 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 13 23:43:18.409744 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 13 23:43:18.409756 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 13 23:43:18.409766 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 13 23:43:18.409776 systemd[1]: Created slice user.slice - User and Session Slice. May 13 23:43:18.409785 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 23:43:18.409795 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 23:43:18.409805 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 13 23:43:18.409815 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 13 23:43:18.409827 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 13 23:43:18.409837 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 13 23:43:18.409847 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... May 13 23:43:18.409858 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 23:43:18.409868 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 13 23:43:18.409878 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 13 23:43:18.409891 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 13 23:43:18.409901 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 13 23:43:18.409911 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 23:43:18.409921 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 23:43:18.409931 systemd[1]: Reached target slices.target - Slice Units. May 13 23:43:18.409941 systemd[1]: Reached target swap.target - Swaps. May 13 23:43:18.409957 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 13 23:43:18.409967 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 13 23:43:18.409978 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 13 23:43:18.409989 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 23:43:18.410000 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 23:43:18.410011 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 13 23:43:18.410026 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 13 23:43:18.410040 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 13 23:43:18.410052 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 13 23:43:18.410072 systemd[1]: Mounting media.mount - External Media Directory... May 13 23:43:18.410086 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 13 23:43:18.410096 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 13 23:43:18.410106 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 13 23:43:18.410116 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 13 23:43:18.410127 systemd[1]: Reached target machines.target - Containers. May 13 23:43:18.410136 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 13 23:43:18.410147 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:43:18.410159 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 23:43:18.410169 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 13 23:43:18.410180 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:43:18.410190 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 23:43:18.410200 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:43:18.410210 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 13 23:43:18.410220 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:43:18.410230 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 13 23:43:18.410240 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 13 23:43:18.410252 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 13 23:43:18.410263 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 13 23:43:18.410276 systemd[1]: Stopped systemd-fsck-usr.service. 
May 13 23:43:18.410286 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:43:18.410305 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 23:43:18.410317 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 23:43:18.410327 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 13 23:43:18.410338 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 13 23:43:18.410350 kernel: fuse: init (API version 7.39) May 13 23:43:18.410360 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 13 23:43:18.410370 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 23:43:18.410384 systemd[1]: verity-setup.service: Deactivated successfully. May 13 23:43:18.410399 systemd[1]: Stopped verity-setup.service. May 13 23:43:18.410411 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 13 23:43:18.410421 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 13 23:43:18.410431 systemd[1]: Mounted media.mount - External Media Directory. May 13 23:43:18.410441 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 13 23:43:18.410451 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 13 23:43:18.410465 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 13 23:43:18.412111 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 23:43:18.412133 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 13 23:43:18.412145 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 13 23:43:18.412156 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:43:18.412166 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:43:18.412177 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:43:18.412187 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:43:18.412198 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 13 23:43:18.412213 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 13 23:43:18.412224 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 23:43:18.412234 kernel: loop: module loaded May 13 23:43:18.412246 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 13 23:43:18.412256 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 13 23:43:18.412266 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 23:43:18.412276 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:43:18.412287 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:43:18.412311 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 13 23:43:18.412328 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 13 23:43:18.412338 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. 
May 13 23:43:18.412377 systemd-journald[1118]: Collecting audit messages is disabled. May 13 23:43:18.412404 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 13 23:43:18.412415 systemd[1]: Reached target network-pre.target - Preparation for Network. May 13 23:43:18.412426 systemd-journald[1118]: Journal started May 13 23:43:18.412452 systemd-journald[1118]: Runtime Journal (/run/log/journal/c3fb02349eba47f3952a38800ffa7061) is 8M, max 76.6M, 68.6M free. May 13 23:43:18.145463 systemd[1]: Queued start job for default target multi-user.target. May 13 23:43:18.161868 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. May 13 23:43:18.162410 systemd[1]: systemd-journald.service: Deactivated successfully. May 13 23:43:18.414643 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 13 23:43:18.415109 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 23:43:18.418081 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 13 23:43:18.422305 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 13 23:43:18.431151 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 13 23:43:18.431223 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:43:18.434349 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 13 23:43:18.438213 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:43:18.443578 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 13 23:43:18.445121 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:43:18.452086 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 13 23:43:18.456118 systemd[1]: Started systemd-journald.service - Journal Service. May 13 23:43:18.458831 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 13 23:43:18.472929 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 13 23:43:18.479180 kernel: ACPI: bus type drm_connector registered May 13 23:43:18.484497 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 23:43:18.486922 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 23:43:18.498514 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 13 23:43:18.519545 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 23:43:18.524984 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:43:18.528230 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 13 23:43:18.532838 kernel: loop0: detected capacity change from 0 to 8 May 13 23:43:18.532168 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 13 23:43:18.537678 systemd-journald[1118]: Time spent on flushing to /var/log/journal/c3fb02349eba47f3952a38800ffa7061 is 45.720ms for 1142 entries. 
May 13 23:43:18.537678 systemd-journald[1118]: System Journal (/var/log/journal/c3fb02349eba47f3952a38800ffa7061) is 8M, max 584.8M, 576.8M free. May 13 23:43:18.596812 systemd-journald[1118]: Received client request to flush runtime journal. May 13 23:43:18.596879 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 13 23:43:18.596902 kernel: loop1: detected capacity change from 0 to 201592 May 13 23:43:18.540682 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 13 23:43:18.550231 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... May 13 23:43:18.560489 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 13 23:43:18.566304 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 13 23:43:18.596938 udevadm[1186]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. May 13 23:43:18.601052 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 13 23:43:18.616626 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 13 23:43:18.622025 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 13 23:43:18.628279 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 23:43:18.646104 kernel: loop2: detected capacity change from 0 to 126448 May 13 23:43:18.686992 systemd-tmpfiles[1195]: ACLs are not supported, ignoring. May 13 23:43:18.687446 systemd-tmpfiles[1195]: ACLs are not supported, ignoring. May 13 23:43:18.696164 kernel: loop3: detected capacity change from 0 to 103832 May 13 23:43:18.696860 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:43:18.736129 kernel: loop4: detected capacity change from 0 to 8 May 13 23:43:18.742099 kernel: loop5: detected capacity change from 0 to 201592 May 13 23:43:18.764099 kernel: loop6: detected capacity change from 0 to 126448 May 13 23:43:18.786109 kernel: loop7: detected capacity change from 0 to 103832 May 13 23:43:18.810284 (sd-merge)[1200]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. May 13 23:43:18.810742 (sd-merge)[1200]: Merged extensions into '/usr'. May 13 23:43:18.815910 systemd[1]: Reload requested from client PID 1146 ('systemd-sysext') (unit systemd-sysext.service)... May 13 23:43:18.816063 systemd[1]: Reloading... May 13 23:43:18.932141 zram_generator::config[1240]: No configuration found. May 13 23:43:19.034792 ldconfig[1142]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 13 23:43:19.062871 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:43:19.124056 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 13 23:43:19.124549 systemd[1]: Reloading finished in 307 ms. May 13 23:43:19.147116 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 13 23:43:19.150263 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 13 23:43:19.171369 systemd[1]: Starting ensure-sysext.service... 
May 13 23:43:19.173393 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 23:43:19.202201 systemd-tmpfiles[1266]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 13 23:43:19.202422 systemd-tmpfiles[1266]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 13 23:43:19.203025 systemd-tmpfiles[1266]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 13 23:43:19.203249 systemd-tmpfiles[1266]: ACLs are not supported, ignoring. May 13 23:43:19.203340 systemd-tmpfiles[1266]: ACLs are not supported, ignoring. May 13 23:43:19.206233 systemd[1]: Reload requested from client PID 1265 ('systemctl') (unit ensure-sysext.service)... May 13 23:43:19.206250 systemd[1]: Reloading... May 13 23:43:19.206629 systemd-tmpfiles[1266]: Detected autofs mount point /boot during canonicalization of boot. May 13 23:43:19.206634 systemd-tmpfiles[1266]: Skipping /boot May 13 23:43:19.216004 systemd-tmpfiles[1266]: Detected autofs mount point /boot during canonicalization of boot. May 13 23:43:19.216163 systemd-tmpfiles[1266]: Skipping /boot May 13 23:43:19.288149 zram_generator::config[1295]: No configuration found. May 13 23:43:19.393956 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:43:19.454484 systemd[1]: Reloading finished in 247 ms. May 13 23:43:19.469212 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 13 23:43:19.479223 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 23:43:19.488257 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 23:43:19.493351 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 13 23:43:19.498577 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 13 23:43:19.505896 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 23:43:19.509024 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:43:19.514373 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 13 23:43:19.527197 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 13 23:43:19.532017 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:43:19.536397 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:43:19.544704 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:43:19.553577 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:43:19.554269 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:43:19.554432 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:43:19.558112 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
May 13 23:43:19.569186 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 13 23:43:19.571892 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:43:19.572223 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:43:19.575950 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:43:19.576405 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:43:19.589380 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:43:19.593042 systemd-udevd[1338]: Using default interface naming scheme 'v255'. May 13 23:43:19.596422 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:43:19.606588 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:43:19.607884 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:43:19.608083 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:43:19.614144 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 13 23:43:19.625980 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:43:19.630980 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 23:43:19.631736 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:43:19.631857 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:43:19.641535 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:43:19.647984 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:43:19.648181 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:43:19.649581 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:43:19.652147 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:43:19.658598 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 13 23:43:19.662861 systemd[1]: Finished ensure-sysext.service. May 13 23:43:19.664448 augenrules[1372]: No rules May 13 23:43:19.666106 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 13 23:43:19.667500 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:43:19.668282 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:43:19.669449 systemd[1]: audit-rules.service: Deactivated successfully. May 13 23:43:19.669621 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 23:43:19.671593 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 23:43:19.671746 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 23:43:19.686319 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
May 13 23:43:19.687253 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:43:19.687357 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:43:19.692357 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 13 23:43:19.695183 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 23:43:19.702335 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 13 23:43:19.873755 systemd-networkd[1392]: lo: Link UP May 13 23:43:19.873768 systemd-networkd[1392]: lo: Gained carrier May 13 23:43:19.897875 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 13 23:43:19.899380 systemd[1]: Reached target time-set.target - System Time Set. May 13 23:43:19.901243 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. May 13 23:43:19.905715 systemd-resolved[1337]: Positive Trust Anchors: May 13 23:43:19.906012 systemd-resolved[1337]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 23:43:19.906109 systemd-resolved[1337]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 23:43:19.912618 systemd-resolved[1337]: Using system hostname 'ci-4284-0-0-n-732e99817a'. May 13 23:43:19.916772 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 23:43:19.917700 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 23:43:19.930770 systemd-networkd[1392]: Enumeration completed May 13 23:43:19.930874 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 23:43:19.932379 systemd[1]: Reached target network.target - Network. May 13 23:43:19.935374 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 13 23:43:19.938436 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 13 23:43:19.965240 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 13 23:43:19.974258 systemd-networkd[1392]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:43:19.974270 systemd-networkd[1392]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:43:19.974919 systemd-networkd[1392]: eth1: Link UP May 13 23:43:19.974923 systemd-networkd[1392]: eth1: Gained carrier May 13 23:43:19.974940 systemd-networkd[1392]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
May 13 23:43:19.985102 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1376) May 13 23:43:20.004183 systemd-networkd[1392]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 23:43:20.007513 systemd-timesyncd[1394]: Network configuration changed, trying to establish connection. May 13 23:43:20.044388 systemd-networkd[1392]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:43:20.044401 systemd-networkd[1392]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:43:20.045162 systemd-timesyncd[1394]: Network configuration changed, trying to establish connection. May 13 23:43:20.045360 systemd-networkd[1392]: eth0: Link UP May 13 23:43:20.045368 systemd-networkd[1392]: eth0: Gained carrier May 13 23:43:20.045383 systemd-networkd[1392]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:43:20.046467 systemd-timesyncd[1394]: Network configuration changed, trying to establish connection. May 13 23:43:20.049108 kernel: mousedev: PS/2 mouse device common for all mice May 13 23:43:20.094277 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 13 23:43:20.097424 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 13 23:43:20.113720 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 May 13 23:43:20.113799 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 13 23:43:20.113815 kernel: [drm] features: -context_init May 13 23:43:20.115114 kernel: [drm] number of scanouts: 1 May 13 23:43:20.115195 kernel: [drm] number of cap sets: 0 May 13 23:43:20.118101 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 May 13 23:43:20.125660 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 13 23:43:20.128020 kernel: Console: switching to colour frame buffer device 160x50 May 13 23:43:20.133154 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device May 13 23:43:20.136715 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. May 13 23:43:20.136841 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:43:20.138460 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:43:20.140564 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:43:20.144011 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:43:20.145228 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:43:20.145275 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:43:20.145343 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
May 13 23:43:20.148165 systemd-networkd[1392]: eth0: DHCPv4 address 138.199.236.81/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 13 23:43:20.148579 systemd-timesyncd[1394]: Network configuration changed, trying to establish connection. May 13 23:43:20.148954 systemd-timesyncd[1394]: Network configuration changed, trying to establish connection. May 13 23:43:20.169561 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:43:20.169746 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:43:20.174417 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:43:20.174615 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:43:20.178265 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:43:20.178467 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:43:20.179691 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:43:20.179747 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:43:20.207477 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:43:20.216961 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:43:20.217185 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:43:20.219220 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:43:20.299853 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:43:20.363613 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 13 23:43:20.368529 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 13 23:43:20.399776 lvm[1456]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 13 23:43:20.428801 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 13 23:43:20.430405 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 23:43:20.431477 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:43:20.432712 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 13 23:43:20.433972 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 13 23:43:20.435498 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 13 23:43:20.436659 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information. May 13 23:43:20.437915 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 13 23:43:20.439128 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 13 23:43:20.439181 systemd[1]: Reached target paths.target - Path Units. May 13 23:43:20.439991 systemd[1]: Reached target timers.target - Timer Units. May 13 23:43:20.442740 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 13 23:43:20.445803 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 13 23:43:20.449903 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 13 23:43:20.450932 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 13 23:43:20.451655 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 13 23:43:20.460060 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 13 23:43:20.461199 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 13 23:43:20.463208 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... May 13 23:43:20.464443 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 13 23:43:20.465117 systemd[1]: Reached target sockets.target - Socket Units. May 13 23:43:20.465629 systemd[1]: Reached target basic.target - Basic System. May 13 23:43:20.466184 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 13 23:43:20.466211 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 13 23:43:20.469188 systemd[1]: Starting containerd.service - containerd container runtime... May 13 23:43:20.474612 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 13 23:43:20.477998 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 13 23:43:20.482406 lvm[1460]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 13 23:43:20.484202 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 13 23:43:20.489868 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 13 23:43:20.492175 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 13 23:43:20.496402 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 13 23:43:20.500129 jq[1464]: false May 13 23:43:20.499723 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 13 23:43:20.504649 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. May 13 23:43:20.511388 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 13 23:43:20.517340 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 13 23:43:20.525044 systemd[1]: Starting systemd-logind.service - User Login Management... May 13 23:43:20.526734 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 13 23:43:20.528326 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 13 23:43:20.536499 systemd[1]: Starting update-engine.service - Update Engine... May 13 23:43:20.541320 dbus-daemon[1463]: [system] SELinux support is enabled May 13 23:43:20.544156 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
May 13 23:43:20.547234 extend-filesystems[1465]: Found loop4 May 13 23:43:20.547234 extend-filesystems[1465]: Found loop5 May 13 23:43:20.547234 extend-filesystems[1465]: Found loop6 May 13 23:43:20.547234 extend-filesystems[1465]: Found loop7 May 13 23:43:20.547234 extend-filesystems[1465]: Found sda May 13 23:43:20.547234 extend-filesystems[1465]: Found sda1 May 13 23:43:20.547234 extend-filesystems[1465]: Found sda2 May 13 23:43:20.547234 extend-filesystems[1465]: Found sda3 May 13 23:43:20.547234 extend-filesystems[1465]: Found usr May 13 23:43:20.547234 extend-filesystems[1465]: Found sda4 May 13 23:43:20.547234 extend-filesystems[1465]: Found sda6 May 13 23:43:20.547234 extend-filesystems[1465]: Found sda7 May 13 23:43:20.547234 extend-filesystems[1465]: Found sda9 May 13 23:43:20.547234 extend-filesystems[1465]: Checking size of /dev/sda9 May 13 23:43:20.576001 coreos-metadata[1462]: May 13 23:43:20.573 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 May 13 23:43:20.576001 coreos-metadata[1462]: May 13 23:43:20.573 INFO Fetch successful May 13 23:43:20.576001 coreos-metadata[1462]: May 13 23:43:20.573 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 May 13 23:43:20.576001 coreos-metadata[1462]: May 13 23:43:20.573 INFO Fetch successful May 13 23:43:20.551816 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 13 23:43:20.563598 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 13 23:43:20.566994 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 13 23:43:20.567207 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 13 23:43:20.568565 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 13 23:43:20.569132 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 13 23:43:20.599169 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 13 23:43:20.599229 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 13 23:43:20.600489 extend-filesystems[1465]: Resized partition /dev/sda9 May 13 23:43:20.603635 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 13 23:43:20.605187 extend-filesystems[1501]: resize2fs 1.47.2 (1-Jan-2025) May 13 23:43:20.603665 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 13 23:43:20.610131 jq[1477]: true May 13 23:43:20.615083 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks May 13 23:43:20.616833 systemd[1]: motdgen.service: Deactivated successfully. May 13 23:43:20.617046 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
May 13 23:43:20.643728 (ntainerd)[1504]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 13 23:43:20.649801 tar[1484]: linux-arm64/LICENSE May 13 23:43:20.649801 tar[1484]: linux-arm64/helm May 13 23:43:20.676182 jq[1506]: true May 13 23:43:20.691034 update_engine[1475]: I20250513 23:43:20.690793 1475 main.cc:92] Flatcar Update Engine starting May 13 23:43:20.705618 systemd[1]: Started update-engine.service - Update Engine. May 13 23:43:20.713358 update_engine[1475]: I20250513 23:43:20.710328 1475 update_check_scheduler.cc:74] Next update check in 6m47s May 13 23:43:20.726099 kernel: EXT4-fs (sda9): resized filesystem to 9393147 May 13 23:43:20.740288 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 13 23:43:20.751721 extend-filesystems[1501]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required May 13 23:43:20.751721 extend-filesystems[1501]: old_desc_blocks = 1, new_desc_blocks = 5 May 13 23:43:20.751721 extend-filesystems[1501]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. May 13 23:43:20.762584 extend-filesystems[1465]: Resized filesystem in /dev/sda9 May 13 23:43:20.762584 extend-filesystems[1465]: Found sr0 May 13 23:43:20.755364 systemd[1]: extend-filesystems.service: Deactivated successfully. May 13 23:43:20.755605 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 13 23:43:20.799369 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1409) May 13 23:43:20.839011 bash[1534]: Updated "/home/core/.ssh/authorized_keys" May 13 23:43:20.856607 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 13 23:43:20.865345 systemd-logind[1474]: New seat seat0. May 13 23:43:20.868842 systemd-logind[1474]: Watching system buttons on /dev/input/event0 (Power Button) May 13 23:43:20.868863 systemd-logind[1474]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) May 13 23:43:20.869242 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 13 23:43:20.871442 systemd[1]: Started systemd-logind.service - User Login Management. May 13 23:43:20.881868 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 13 23:43:20.887376 systemd[1]: Starting sshkeys.service... May 13 23:43:20.926468 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 13 23:43:20.931708 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 13 23:43:20.992377 coreos-metadata[1542]: May 13 23:43:20.992 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 May 13 23:43:20.998944 coreos-metadata[1542]: May 13 23:43:20.997 INFO Fetch successful May 13 23:43:20.999471 unknown[1542]: wrote ssh authorized keys file for user: core May 13 23:43:21.052781 update-ssh-keys[1546]: Updated "/home/core/.ssh/authorized_keys" May 13 23:43:21.054126 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 13 23:43:21.058497 systemd[1]: Finished sshkeys.service. 
May 13 23:43:21.090687 containerd[1504]: time="2025-05-13T23:43:21Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 13 23:43:21.096241 containerd[1504]: time="2025-05-13T23:43:21.096195680Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 13 23:43:21.129112 containerd[1504]: time="2025-05-13T23:43:21.129042800Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.16µs" May 13 23:43:21.129112 containerd[1504]: time="2025-05-13T23:43:21.129100120Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 13 23:43:21.129112 containerd[1504]: time="2025-05-13T23:43:21.129123080Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 13 23:43:21.129373 containerd[1504]: time="2025-05-13T23:43:21.129344280Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 13 23:43:21.129409 containerd[1504]: time="2025-05-13T23:43:21.129373560Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 13 23:43:21.129409 containerd[1504]: time="2025-05-13T23:43:21.129402280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 23:43:21.129483 containerd[1504]: time="2025-05-13T23:43:21.129462440Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 23:43:21.129483 containerd[1504]: time="2025-05-13T23:43:21.129478000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 23:43:21.129822 containerd[1504]: time="2025-05-13T23:43:21.129795240Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 23:43:21.129822 containerd[1504]: time="2025-05-13T23:43:21.129816360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 23:43:21.129876 containerd[1504]: time="2025-05-13T23:43:21.129828200Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 23:43:21.129876 containerd[1504]: time="2025-05-13T23:43:21.129837360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 13 23:43:21.129955 containerd[1504]: time="2025-05-13T23:43:21.129906640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 13 23:43:21.132169 containerd[1504]: time="2025-05-13T23:43:21.132137840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 23:43:21.132248 containerd[1504]: time="2025-05-13T23:43:21.132185840Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 May 13 23:43:21.132248 containerd[1504]: time="2025-05-13T23:43:21.132197520Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 13 23:43:21.132248 containerd[1504]: time="2025-05-13T23:43:21.132236600Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 13 23:43:21.133413 containerd[1504]: time="2025-05-13T23:43:21.132598680Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 13 23:43:21.133413 containerd[1504]: time="2025-05-13T23:43:21.132681200Z" level=info msg="metadata content store policy set" policy=shared May 13 23:43:21.139750 containerd[1504]: time="2025-05-13T23:43:21.139694920Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 13 23:43:21.139750 containerd[1504]: time="2025-05-13T23:43:21.139766240Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 13 23:43:21.139750 containerd[1504]: time="2025-05-13T23:43:21.139780960Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 13 23:43:21.139750 containerd[1504]: time="2025-05-13T23:43:21.139793520Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 13 23:43:21.139750 containerd[1504]: time="2025-05-13T23:43:21.139807720Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 13 23:43:21.139998 containerd[1504]: time="2025-05-13T23:43:21.139821280Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 13 23:43:21.139998 containerd[1504]: time="2025-05-13T23:43:21.139832960Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 13 23:43:21.139998 containerd[1504]: time="2025-05-13T23:43:21.139844480Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 13 23:43:21.139998 containerd[1504]: time="2025-05-13T23:43:21.139856160Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 13 23:43:21.139998 containerd[1504]: time="2025-05-13T23:43:21.139866800Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 13 23:43:21.139998 containerd[1504]: time="2025-05-13T23:43:21.139875760Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 13 23:43:21.139998 containerd[1504]: time="2025-05-13T23:43:21.139886960Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 13 23:43:21.140151 containerd[1504]: time="2025-05-13T23:43:21.140030480Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 13 23:43:21.140151 containerd[1504]: time="2025-05-13T23:43:21.140052480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 13 23:43:21.140151 containerd[1504]: time="2025-05-13T23:43:21.140088440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 13 23:43:21.140151 containerd[1504]: time="2025-05-13T23:43:21.140116080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff 
type=io.containerd.grpc.v1 May 13 23:43:21.140151 containerd[1504]: time="2025-05-13T23:43:21.140127240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 13 23:43:21.140151 containerd[1504]: time="2025-05-13T23:43:21.140137440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 13 23:43:21.140151 containerd[1504]: time="2025-05-13T23:43:21.140148320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 13 23:43:21.140263 containerd[1504]: time="2025-05-13T23:43:21.140159040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 13 23:43:21.140263 containerd[1504]: time="2025-05-13T23:43:21.140171000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 13 23:43:21.140263 containerd[1504]: time="2025-05-13T23:43:21.140182880Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 13 23:43:21.140263 containerd[1504]: time="2025-05-13T23:43:21.140194720Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 13 23:43:21.141001 containerd[1504]: time="2025-05-13T23:43:21.140497720Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 13 23:43:21.141001 containerd[1504]: time="2025-05-13T23:43:21.140523520Z" level=info msg="Start snapshots syncer" May 13 23:43:21.141001 containerd[1504]: time="2025-05-13T23:43:21.140550680Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.140773080Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.140840040Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.140913080Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141021600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141044120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141055280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141106360Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141155160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141170480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141183360Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141213760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: 
time="2025-05-13T23:43:21.141228280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141238400Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141275640Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141326960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141339760Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141351000Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141359440Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141369040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141380080Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141459280Z" level=info msg="runtime interface created" May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141464880Z" level=info msg="created NRI interface" May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141473760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141488480Z" level=info msg="Connect containerd service" May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.141520000Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 13 23:43:21.142941 containerd[1504]: time="2025-05-13T23:43:21.142751720Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 23:43:21.193972 locksmithd[1515]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375496400Z" level=info msg="Start subscribing containerd event" May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375576080Z" level=info msg="Start recovering state" May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375679440Z" level=info msg="Start event monitor" May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375696400Z" level=info msg="Start cni network conf syncer for default" May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375704960Z" level=info msg="Start streaming server" May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375715800Z" 
level=info msg="Registered namespace \"k8s.io\" with NRI" May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375723160Z" level=info msg="runtime interface starting up..." May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375728600Z" level=info msg="starting plugins..." May 13 23:43:21.376396 containerd[1504]: time="2025-05-13T23:43:21.375743560Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 13 23:43:21.376626 containerd[1504]: time="2025-05-13T23:43:21.376579320Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 13 23:43:21.376646 containerd[1504]: time="2025-05-13T23:43:21.376633080Z" level=info msg=serving... address=/run/containerd/containerd.sock May 13 23:43:21.376798 systemd[1]: Started containerd.service - containerd container runtime. May 13 23:43:21.381661 containerd[1504]: time="2025-05-13T23:43:21.381625840Z" level=info msg="containerd successfully booted in 0.291575s" May 13 23:43:21.429318 tar[1484]: linux-arm64/README.md May 13 23:43:21.447115 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 13 23:43:21.722310 systemd-networkd[1392]: eth0: Gained IPv6LL May 13 23:43:21.723057 systemd-timesyncd[1394]: Network configuration changed, trying to establish connection. May 13 23:43:21.727208 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 13 23:43:21.729013 systemd[1]: Reached target network-online.target - Network is Online. May 13 23:43:21.733246 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:43:21.736281 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 13 23:43:21.779767 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 13 23:43:21.979462 systemd-networkd[1392]: eth1: Gained IPv6LL May 13 23:43:21.980489 systemd-timesyncd[1394]: Network configuration changed, trying to establish connection. May 13 23:43:22.194075 sshd_keygen[1497]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 13 23:43:22.216986 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 13 23:43:22.223206 systemd[1]: Starting issuegen.service - Generate /run/issue... May 13 23:43:22.243133 systemd[1]: issuegen.service: Deactivated successfully. May 13 23:43:22.243521 systemd[1]: Finished issuegen.service - Generate /run/issue. May 13 23:43:22.250217 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 13 23:43:22.273909 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 13 23:43:22.278709 systemd[1]: Started getty@tty1.service - Getty on tty1. May 13 23:43:22.284516 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. May 13 23:43:22.286362 systemd[1]: Reached target getty.target - Login Prompts. May 13 23:43:22.551508 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:43:22.553636 systemd[1]: Reached target multi-user.target - Multi-User System. May 13 23:43:22.558158 systemd[1]: Startup finished in 865ms (kernel) + 6.892s (initrd) + 5.014s (userspace) = 12.772s. 
May 13 23:43:22.564337 (kubelet)[1605]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:43:23.070037 kubelet[1605]: E0513 23:43:23.069981 1605 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:43:23.074144 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:43:23.074411 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:43:23.074840 systemd[1]: kubelet.service: Consumed 863ms CPU time, 248M memory peak. May 13 23:43:33.326223 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 13 23:43:33.329461 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:43:33.486318 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:43:33.495684 (kubelet)[1624]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:43:33.548519 kubelet[1624]: E0513 23:43:33.548456 1624 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:43:33.552534 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:43:33.552821 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:43:33.554453 systemd[1]: kubelet.service: Consumed 172ms CPU time, 102.9M memory peak. May 13 23:43:35.068409 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 13 23:43:35.069968 systemd[1]: Started sshd@0-138.199.236.81:22-1.214.197.163:50688.service - OpenSSH per-connection server daemon (1.214.197.163:50688). May 13 23:43:36.370160 sshd[1632]: Invalid user openstack from 1.214.197.163 port 50688 May 13 23:43:36.616362 sshd[1632]: Received disconnect from 1.214.197.163 port 50688:11: Bye Bye [preauth] May 13 23:43:36.616362 sshd[1632]: Disconnected from invalid user openstack 1.214.197.163 port 50688 [preauth] May 13 23:43:36.620717 systemd[1]: sshd@0-138.199.236.81:22-1.214.197.163:50688.service: Deactivated successfully. May 13 23:43:43.803388 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 13 23:43:43.806208 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:43:43.956470 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 23:43:43.967765 (kubelet)[1644]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:43:44.020883 kubelet[1644]: E0513 23:43:44.020820 1644 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:43:44.023828 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:43:44.024007 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:43:44.024628 systemd[1]: kubelet.service: Consumed 169ms CPU time, 101.8M memory peak. May 13 23:43:52.039270 systemd-timesyncd[1394]: Contacted time server 49.12.125.53:123 (2.flatcar.pool.ntp.org). May 13 23:43:52.039386 systemd-timesyncd[1394]: Initial clock synchronization to Tue 2025-05-13 23:43:51.941816 UTC. May 13 23:43:54.239211 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 13 23:43:54.241884 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:43:54.393217 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:43:54.404781 (kubelet)[1659]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:43:54.457248 kubelet[1659]: E0513 23:43:54.457186 1659 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:43:54.459780 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:43:54.459949 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:43:54.460320 systemd[1]: kubelet.service: Consumed 170ms CPU time, 104.1M memory peak. May 13 23:44:04.489358 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 13 23:44:04.492503 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:44:04.639967 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:44:04.651702 (kubelet)[1674]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:44:04.698656 kubelet[1674]: E0513 23:44:04.698522 1674 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:44:04.700858 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:44:04.701083 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:44:04.701627 systemd[1]: kubelet.service: Consumed 174ms CPU time, 101.5M memory peak. May 13 23:44:05.788133 update_engine[1475]: I20250513 23:44:05.787142 1475 update_attempter.cc:509] Updating boot flags... 
May 13 23:44:05.826116 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1690) May 13 23:44:05.914354 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1685) May 13 23:44:14.738723 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. May 13 23:44:14.740858 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:44:14.897356 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:44:14.909973 (kubelet)[1707]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:44:14.961512 kubelet[1707]: E0513 23:44:14.961398 1707 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:44:14.963879 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:44:14.964111 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:44:14.966181 systemd[1]: kubelet.service: Consumed 172ms CPU time, 102.6M memory peak. May 13 23:44:24.989520 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. May 13 23:44:24.992578 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:44:25.139819 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:44:25.149162 (kubelet)[1723]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:44:25.199434 kubelet[1723]: E0513 23:44:25.199319 1723 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:44:25.201888 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:44:25.202062 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:44:25.202734 systemd[1]: kubelet.service: Consumed 169ms CPU time, 99.7M memory peak. May 13 23:44:35.239389 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. May 13 23:44:35.242117 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:44:35.390727 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 23:44:35.405774 (kubelet)[1738]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:44:35.447134 kubelet[1738]: E0513 23:44:35.446595 1738 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:44:35.449454 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:44:35.449592 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:44:35.449973 systemd[1]: kubelet.service: Consumed 159ms CPU time, 101.8M memory peak. May 13 23:44:45.489568 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. May 13 23:44:45.492727 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:44:45.641937 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:44:45.652660 (kubelet)[1752]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:44:45.706016 kubelet[1752]: E0513 23:44:45.705849 1752 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:44:45.709928 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:44:45.710455 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:44:45.711103 systemd[1]: kubelet.service: Consumed 170ms CPU time, 101.6M memory peak. May 13 23:44:55.739271 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. May 13 23:44:55.742010 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:44:55.893598 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:44:55.901389 (kubelet)[1767]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:44:55.948155 kubelet[1767]: E0513 23:44:55.948048 1767 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:44:55.950772 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:44:55.951050 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:44:55.951672 systemd[1]: kubelet.service: Consumed 165ms CPU time, 104.1M memory peak. May 13 23:44:58.271673 systemd[1]: Started sshd@1-138.199.236.81:22-139.178.89.65:57644.service - OpenSSH per-connection server daemon (139.178.89.65:57644). 
May 13 23:44:59.289050 sshd[1775]: Accepted publickey for core from 139.178.89.65 port 57644 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:44:59.292447 sshd-session[1775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:44:59.304228 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 13 23:44:59.306693 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 13 23:44:59.315247 systemd-logind[1474]: New session 1 of user core. May 13 23:44:59.339380 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 13 23:44:59.344236 systemd[1]: Starting user@500.service - User Manager for UID 500... May 13 23:44:59.358048 (systemd)[1779]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 13 23:44:59.361654 systemd-logind[1474]: New session c1 of user core. May 13 23:44:59.518430 systemd[1779]: Queued start job for default target default.target. May 13 23:44:59.528098 systemd[1779]: Created slice app.slice - User Application Slice. May 13 23:44:59.528324 systemd[1779]: Reached target paths.target - Paths. May 13 23:44:59.528562 systemd[1779]: Reached target timers.target - Timers. May 13 23:44:59.530437 systemd[1779]: Starting dbus.socket - D-Bus User Message Bus Socket... May 13 23:44:59.545608 systemd[1779]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 13 23:44:59.545736 systemd[1779]: Reached target sockets.target - Sockets. May 13 23:44:59.545789 systemd[1779]: Reached target basic.target - Basic System. May 13 23:44:59.545818 systemd[1779]: Reached target default.target - Main User Target. May 13 23:44:59.545842 systemd[1779]: Startup finished in 174ms. May 13 23:44:59.546028 systemd[1]: Started user@500.service - User Manager for UID 500. May 13 23:44:59.557518 systemd[1]: Started session-1.scope - Session 1 of User core. May 13 23:45:00.269305 systemd[1]: Started sshd@2-138.199.236.81:22-139.178.89.65:57654.service - OpenSSH per-connection server daemon (139.178.89.65:57654). May 13 23:45:01.265969 sshd[1790]: Accepted publickey for core from 139.178.89.65 port 57654 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:45:01.269368 sshd-session[1790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:45:01.277238 systemd-logind[1474]: New session 2 of user core. May 13 23:45:01.287526 systemd[1]: Started session-2.scope - Session 2 of User core. May 13 23:45:01.950668 sshd[1792]: Connection closed by 139.178.89.65 port 57654 May 13 23:45:01.951552 sshd-session[1790]: pam_unix(sshd:session): session closed for user core May 13 23:45:01.956390 systemd[1]: sshd@2-138.199.236.81:22-139.178.89.65:57654.service: Deactivated successfully. May 13 23:45:01.958709 systemd[1]: session-2.scope: Deactivated successfully. May 13 23:45:01.960735 systemd-logind[1474]: Session 2 logged out. Waiting for processes to exit. May 13 23:45:01.961976 systemd-logind[1474]: Removed session 2. May 13 23:45:02.128283 systemd[1]: Started sshd@3-138.199.236.81:22-139.178.89.65:57656.service - OpenSSH per-connection server daemon (139.178.89.65:57656). 
May 13 23:45:03.154758 sshd[1798]: Accepted publickey for core from 139.178.89.65 port 57656 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:45:03.156876 sshd-session[1798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:45:03.163707 systemd-logind[1474]: New session 3 of user core.
May 13 23:45:03.172453 systemd[1]: Started session-3.scope - Session 3 of User core.
May 13 23:45:03.850508 sshd[1800]: Connection closed by 139.178.89.65 port 57656
May 13 23:45:03.851539 sshd-session[1798]: pam_unix(sshd:session): session closed for user core
May 13 23:45:03.857356 systemd[1]: sshd@3-138.199.236.81:22-139.178.89.65:57656.service: Deactivated successfully.
May 13 23:45:03.859882 systemd[1]: session-3.scope: Deactivated successfully.
May 13 23:45:03.860599 systemd-logind[1474]: Session 3 logged out. Waiting for processes to exit.
May 13 23:45:03.861626 systemd-logind[1474]: Removed session 3.
May 13 23:45:04.030794 systemd[1]: Started sshd@4-138.199.236.81:22-139.178.89.65:57664.service - OpenSSH per-connection server daemon (139.178.89.65:57664).
May 13 23:45:05.056390 sshd[1806]: Accepted publickey for core from 139.178.89.65 port 57664 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:45:05.058469 sshd-session[1806]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:45:05.064263 systemd-logind[1474]: New session 4 of user core.
May 13 23:45:05.075377 systemd[1]: Started session-4.scope - Session 4 of User core.
May 13 23:45:05.755171 sshd[1808]: Connection closed by 139.178.89.65 port 57664
May 13 23:45:05.755957 sshd-session[1806]: pam_unix(sshd:session): session closed for user core
May 13 23:45:05.761649 systemd[1]: sshd@4-138.199.236.81:22-139.178.89.65:57664.service: Deactivated successfully.
May 13 23:45:05.764979 systemd[1]: session-4.scope: Deactivated successfully.
May 13 23:45:05.765901 systemd-logind[1474]: Session 4 logged out. Waiting for processes to exit.
May 13 23:45:05.767129 systemd-logind[1474]: Removed session 4.
May 13 23:45:05.925055 systemd[1]: Started sshd@5-138.199.236.81:22-139.178.89.65:57678.service - OpenSSH per-connection server daemon (139.178.89.65:57678).
May 13 23:45:05.988786 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10.
May 13 23:45:05.991536 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:45:06.135829 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:45:06.148658 (kubelet)[1824]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:45:06.196578 kubelet[1824]: E0513 23:45:06.196455 1824 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:45:06.199775 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:45:06.199919 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:45:06.200434 systemd[1]: kubelet.service: Consumed 173ms CPU time, 104.2M memory peak.
May 13 23:45:06.925801 sshd[1814]: Accepted publickey for core from 139.178.89.65 port 57678 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:45:06.927872 sshd-session[1814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:45:06.934824 systemd-logind[1474]: New session 5 of user core.
May 13 23:45:06.941459 systemd[1]: Started session-5.scope - Session 5 of User core.
May 13 23:45:07.470056 sudo[1832]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 13 23:45:07.470396 sudo[1832]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:45:07.486908 sudo[1832]: pam_unix(sudo:session): session closed for user root
May 13 23:45:07.648796 sshd[1831]: Connection closed by 139.178.89.65 port 57678
May 13 23:45:07.648051 sshd-session[1814]: pam_unix(sshd:session): session closed for user core
May 13 23:45:07.652882 systemd[1]: sshd@5-138.199.236.81:22-139.178.89.65:57678.service: Deactivated successfully.
May 13 23:45:07.654521 systemd[1]: session-5.scope: Deactivated successfully.
May 13 23:45:07.655593 systemd-logind[1474]: Session 5 logged out. Waiting for processes to exit.
May 13 23:45:07.656976 systemd-logind[1474]: Removed session 5.
May 13 23:45:07.827647 systemd[1]: Started sshd@6-138.199.236.81:22-139.178.89.65:35312.service - OpenSSH per-connection server daemon (139.178.89.65:35312).
May 13 23:45:08.838638 sshd[1838]: Accepted publickey for core from 139.178.89.65 port 35312 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:45:08.840384 sshd-session[1838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:45:08.847311 systemd-logind[1474]: New session 6 of user core.
May 13 23:45:08.857421 systemd[1]: Started session-6.scope - Session 6 of User core.
May 13 23:45:09.376964 sudo[1842]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 13 23:45:09.377325 sudo[1842]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:45:09.382288 sudo[1842]: pam_unix(sudo:session): session closed for user root
May 13 23:45:09.388910 sudo[1841]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 13 23:45:09.389251 sudo[1841]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:45:09.403800 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 13 23:45:09.447781 augenrules[1864]: No rules
May 13 23:45:09.449589 systemd[1]: audit-rules.service: Deactivated successfully.
May 13 23:45:09.449931 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 13 23:45:09.453317 sudo[1841]: pam_unix(sudo:session): session closed for user root
May 13 23:45:09.617926 sshd[1840]: Connection closed by 139.178.89.65 port 35312
May 13 23:45:09.617315 sshd-session[1838]: pam_unix(sshd:session): session closed for user core
May 13 23:45:09.621459 systemd[1]: sshd@6-138.199.236.81:22-139.178.89.65:35312.service: Deactivated successfully.
May 13 23:45:09.623610 systemd[1]: session-6.scope: Deactivated successfully.
May 13 23:45:09.625850 systemd-logind[1474]: Session 6 logged out. Waiting for processes to exit.
May 13 23:45:09.627477 systemd-logind[1474]: Removed session 6.
May 13 23:45:09.791490 systemd[1]: Started sshd@7-138.199.236.81:22-139.178.89.65:35318.service - OpenSSH per-connection server daemon (139.178.89.65:35318).
May 13 23:45:10.815534 sshd[1873]: Accepted publickey for core from 139.178.89.65 port 35318 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:45:10.817399 sshd-session[1873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:45:10.823547 systemd-logind[1474]: New session 7 of user core.
May 13 23:45:10.830400 systemd[1]: Started session-7.scope - Session 7 of User core.
May 13 23:45:11.344638 sudo[1876]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 13 23:45:11.344911 sudo[1876]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:45:11.686262 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 13 23:45:11.696711 (dockerd)[1893]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 13 23:45:11.943619 dockerd[1893]: time="2025-05-13T23:45:11.942672569Z" level=info msg="Starting up"
May 13 23:45:11.946490 dockerd[1893]: time="2025-05-13T23:45:11.946159007Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 13 23:45:12.005719 dockerd[1893]: time="2025-05-13T23:45:12.005646170Z" level=info msg="Loading containers: start."
May 13 23:45:12.172174 kernel: Initializing XFRM netlink socket
May 13 23:45:12.254297 systemd-networkd[1392]: docker0: Link UP
May 13 23:45:12.317350 dockerd[1893]: time="2025-05-13T23:45:12.317276633Z" level=info msg="Loading containers: done."
May 13 23:45:12.336152 dockerd[1893]: time="2025-05-13T23:45:12.335883052Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 13 23:45:12.336152 dockerd[1893]: time="2025-05-13T23:45:12.335980251Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
May 13 23:45:12.336441 dockerd[1893]: time="2025-05-13T23:45:12.336224848Z" level=info msg="Daemon has completed initialization"
May 13 23:45:12.374402 dockerd[1893]: time="2025-05-13T23:45:12.373808523Z" level=info msg="API listen on /run/docker.sock"
May 13 23:45:12.373905 systemd[1]: Started docker.service - Docker Application Container Engine.
May 13 23:45:13.423771 containerd[1504]: time="2025-05-13T23:45:13.423710715Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\""
May 13 23:45:14.166459 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount130323428.mount: Deactivated successfully.
May 13 23:45:15.990823 containerd[1504]: time="2025-05-13T23:45:15.990691678Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:15.992090 containerd[1504]: time="2025-05-13T23:45:15.991850745Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.4: active requests=0, bytes read=26233210"
May 13 23:45:15.993428 containerd[1504]: time="2025-05-13T23:45:15.993350568Z" level=info msg="ImageCreate event name:\"sha256:ab579d62aa850c7d0eca948aad11fcf813743e3b6c9742241c32cb4f1638968b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:15.996291 containerd[1504]: time="2025-05-13T23:45:15.996218776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:15.997627 containerd[1504]: time="2025-05-13T23:45:15.997451682Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.4\" with image id \"sha256:ab579d62aa850c7d0eca948aad11fcf813743e3b6c9742241c32cb4f1638968b\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\", size \"26229918\" in 2.573695887s"
May 13 23:45:15.997627 containerd[1504]: time="2025-05-13T23:45:15.997494961Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\" returns image reference \"sha256:ab579d62aa850c7d0eca948aad11fcf813743e3b6c9742241c32cb4f1638968b\""
May 13 23:45:15.998353 containerd[1504]: time="2025-05-13T23:45:15.998323272Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\""
May 13 23:45:16.239334 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11.
May 13 23:45:16.241922 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:45:16.393298 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:45:16.404818 (kubelet)[2151]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:45:16.460640 kubelet[2151]: E0513 23:45:16.460594 2151 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:45:16.463276 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:45:16.463547 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:45:16.464046 systemd[1]: kubelet.service: Consumed 177ms CPU time, 99.5M memory peak.
May 13 23:45:18.314137 containerd[1504]: time="2025-05-13T23:45:18.313335525Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:18.315248 containerd[1504]: time="2025-05-13T23:45:18.315173946Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.4: active requests=0, bytes read=22529591"
May 13 23:45:18.316413 containerd[1504]: time="2025-05-13T23:45:18.316337293Z" level=info msg="ImageCreate event name:\"sha256:79534fade29d07745acc698bbf598b0604a9ea1fd7917822c816a74fc0b55965\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:18.319430 containerd[1504]: time="2025-05-13T23:45:18.319378100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:18.321142 containerd[1504]: time="2025-05-13T23:45:18.320886324Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.4\" with image id \"sha256:79534fade29d07745acc698bbf598b0604a9ea1fd7917822c816a74fc0b55965\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\", size \"23971132\" in 2.322518412s"
May 13 23:45:18.321142 containerd[1504]: time="2025-05-13T23:45:18.320940043Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\" returns image reference \"sha256:79534fade29d07745acc698bbf598b0604a9ea1fd7917822c816a74fc0b55965\""
May 13 23:45:18.321648 containerd[1504]: time="2025-05-13T23:45:18.321610756Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\""
May 13 23:45:20.097159 containerd[1504]: time="2025-05-13T23:45:20.097028704Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:20.098621 containerd[1504]: time="2025-05-13T23:45:20.098554608Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.4: active requests=0, bytes read=17482193"
May 13 23:45:20.099720 containerd[1504]: time="2025-05-13T23:45:20.099624636Z" level=info msg="ImageCreate event name:\"sha256:730fbc2590716b8202fcdd928a813b847575ebf03911a059979257cd6cbb8245\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:20.102863 containerd[1504]: time="2025-05-13T23:45:20.102789123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:20.105110 containerd[1504]: time="2025-05-13T23:45:20.104939980Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.4\" with image id \"sha256:730fbc2590716b8202fcdd928a813b847575ebf03911a059979257cd6cbb8245\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\", size \"18923752\" in 1.783281944s"
May 13 23:45:20.105110 containerd[1504]: time="2025-05-13T23:45:20.104994060Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\" returns image reference \"sha256:730fbc2590716b8202fcdd928a813b847575ebf03911a059979257cd6cbb8245\""
May 13 23:45:20.105851 containerd[1504]: time="2025-05-13T23:45:20.105670093Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\""
May 13 23:45:21.704439 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount47636467.mount: Deactivated successfully.
May 13 23:45:22.278621 containerd[1504]: time="2025-05-13T23:45:22.278568279Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:22.279860 containerd[1504]: time="2025-05-13T23:45:22.279785426Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.4: active requests=0, bytes read=27370377"
May 13 23:45:22.280690 containerd[1504]: time="2025-05-13T23:45:22.280618018Z" level=info msg="ImageCreate event name:\"sha256:62c496efa595c8eb7d098e43430b2b94ad66812214759a7ea9daaaa1ed901fc7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:22.282681 containerd[1504]: time="2025-05-13T23:45:22.282633477Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:22.284148 containerd[1504]: time="2025-05-13T23:45:22.283984183Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.4\" with image id \"sha256:62c496efa595c8eb7d098e43430b2b94ad66812214759a7ea9daaaa1ed901fc7\", repo tag \"registry.k8s.io/kube-proxy:v1.32.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\", size \"27369370\" in 2.178268931s"
May 13 23:45:22.284148 containerd[1504]: time="2025-05-13T23:45:22.284020383Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\" returns image reference \"sha256:62c496efa595c8eb7d098e43430b2b94ad66812214759a7ea9daaaa1ed901fc7\""
May 13 23:45:22.284689 containerd[1504]: time="2025-05-13T23:45:22.284661696Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
May 13 23:45:23.010907 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount173728441.mount: Deactivated successfully.
May 13 23:45:24.045044 containerd[1504]: time="2025-05-13T23:45:24.044951935Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:24.046682 containerd[1504]: time="2025-05-13T23:45:24.046612518Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714"
May 13 23:45:24.047444 containerd[1504]: time="2025-05-13T23:45:24.046818756Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:24.051098 containerd[1504]: time="2025-05-13T23:45:24.051028394Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:24.052373 containerd[1504]: time="2025-05-13T23:45:24.052330181Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.767628565s"
May 13 23:45:24.052373 containerd[1504]: time="2025-05-13T23:45:24.052369900Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
May 13 23:45:24.053006 containerd[1504]: time="2025-05-13T23:45:24.052969294Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 13 23:45:24.651578 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount470120422.mount: Deactivated successfully.
May 13 23:45:24.657762 containerd[1504]: time="2025-05-13T23:45:24.657674437Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:45:24.659287 containerd[1504]: time="2025-05-13T23:45:24.659231381Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
May 13 23:45:24.660384 containerd[1504]: time="2025-05-13T23:45:24.660337450Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:45:24.662474 containerd[1504]: time="2025-05-13T23:45:24.662444429Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:45:24.663760 containerd[1504]: time="2025-05-13T23:45:24.663727896Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 610.545324ms"
May 13 23:45:24.663816 containerd[1504]: time="2025-05-13T23:45:24.663762496Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
May 13 23:45:24.664245 containerd[1504]: time="2025-05-13T23:45:24.664217851Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
May 13 23:45:25.382956 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2815073096.mount: Deactivated successfully.
May 13 23:45:26.489234 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12.
May 13 23:45:26.491912 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:45:26.640134 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:45:26.650712 (kubelet)[2288]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:45:26.693633 kubelet[2288]: E0513 23:45:26.693558 2288 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:45:26.696760 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:45:26.696955 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:45:26.697645 systemd[1]: kubelet.service: Consumed 162ms CPU time, 102.1M memory peak.
May 13 23:45:28.914546 containerd[1504]: time="2025-05-13T23:45:28.914480062Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:28.916588 containerd[1504]: time="2025-05-13T23:45:28.916296331Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67812537"
May 13 23:45:28.920102 containerd[1504]: time="2025-05-13T23:45:28.918288158Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:28.921769 containerd[1504]: time="2025-05-13T23:45:28.921726657Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:28.922935 containerd[1504]: time="2025-05-13T23:45:28.922894250Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 4.258643119s"
May 13 23:45:28.922935 containerd[1504]: time="2025-05-13T23:45:28.922932929Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\""
May 13 23:45:33.077886 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:45:33.078044 systemd[1]: kubelet.service: Consumed 162ms CPU time, 102.1M memory peak.
May 13 23:45:33.081426 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:45:33.116552 systemd[1]: Reload requested from client PID 2328 ('systemctl') (unit session-7.scope)...
May 13 23:45:33.116762 systemd[1]: Reloading...
May 13 23:45:33.258110 zram_generator::config[2373]: No configuration found.
May 13 23:45:33.344952 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:45:33.437845 systemd[1]: Reloading finished in 320 ms.
May 13 23:45:33.492656 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 13 23:45:33.492920 systemd[1]: kubelet.service: Failed with result 'signal'.
May 13 23:45:33.493323 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:45:33.493468 systemd[1]: kubelet.service: Consumed 108ms CPU time, 90.2M memory peak.
May 13 23:45:33.495603 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:45:33.650514 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:45:33.661641 (kubelet)[2421]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 13 23:45:33.708281 kubelet[2421]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:45:33.708281 kubelet[2421]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 13 23:45:33.708281 kubelet[2421]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:45:33.708818 kubelet[2421]: I0513 23:45:33.708377 2421 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 13 23:45:35.035588 kubelet[2421]: I0513 23:45:35.035523 2421 server.go:520] "Kubelet version" kubeletVersion="v1.32.0"
May 13 23:45:35.035588 kubelet[2421]: I0513 23:45:35.035571 2421 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 13 23:45:35.037776 kubelet[2421]: I0513 23:45:35.035949 2421 server.go:954] "Client rotation is on, will bootstrap in background"
May 13 23:45:35.066966 kubelet[2421]: E0513 23:45:35.066895 2421 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://138.199.236.81:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 138.199.236.81:6443: connect: connection refused" logger="UnhandledError"
May 13 23:45:35.069394 kubelet[2421]: I0513 23:45:35.069260 2421 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 13 23:45:35.080666 kubelet[2421]: I0513 23:45:35.080566 2421 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 13 23:45:35.083631 kubelet[2421]: I0513 23:45:35.083604 2421 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 13 23:45:35.084630 kubelet[2421]: I0513 23:45:35.084570 2421 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 13 23:45:35.084821 kubelet[2421]: I0513 23:45:35.084636 2421 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-732e99817a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 13 23:45:35.084938 kubelet[2421]: I0513 23:45:35.084892 2421 topology_manager.go:138] "Creating topology manager with none policy"
May 13 23:45:35.084938 kubelet[2421]: I0513 23:45:35.084900 2421 container_manager_linux.go:304] "Creating device plugin manager"
May 13 23:45:35.085182 kubelet[2421]: I0513 23:45:35.085161 2421 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:45:35.088732 kubelet[2421]: I0513 23:45:35.088596 2421 kubelet.go:446] "Attempting to sync node with API server"
May 13 23:45:35.088732 kubelet[2421]: I0513 23:45:35.088630 2421 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
May 13 23:45:35.088732 kubelet[2421]: I0513 23:45:35.088661 2421 kubelet.go:352] "Adding apiserver pod source"
May 13 23:45:35.088732 kubelet[2421]: I0513 23:45:35.088676 2421 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 13 23:45:35.097106 kubelet[2421]: W0513 23:45:35.095935 2421 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://138.199.236.81:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-732e99817a&limit=500&resourceVersion=0": dial tcp 138.199.236.81:6443: connect: connection refused
May 13 23:45:35.097106 kubelet[2421]: E0513 23:45:35.096008 2421 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://138.199.236.81:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-732e99817a&limit=500&resourceVersion=0\": dial tcp 138.199.236.81:6443: connect: connection refused" logger="UnhandledError"
May 13 23:45:35.097106 kubelet[2421]: W0513 23:45:35.096422 2421 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://138.199.236.81:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 138.199.236.81:6443: connect: connection refused
May 13 23:45:35.097106 kubelet[2421]: E0513 23:45:35.096459 2421 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://138.199.236.81:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 138.199.236.81:6443: connect: connection refused" logger="UnhandledError"
May 13 23:45:35.097106 kubelet[2421]: I0513 23:45:35.096555 2421 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
May 13 23:45:35.097504 kubelet[2421]: I0513 23:45:35.097488 2421 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 13 23:45:35.097675 kubelet[2421]: W0513 23:45:35.097664 2421 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
May 13 23:45:35.099906 kubelet[2421]: I0513 23:45:35.099878 2421 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 13 23:45:35.100060 kubelet[2421]: I0513 23:45:35.100049 2421 server.go:1287] "Started kubelet"
May 13 23:45:35.101682 kubelet[2421]: I0513 23:45:35.101542 2421 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
May 13 23:45:35.102980 kubelet[2421]: I0513 23:45:35.102944 2421 server.go:490] "Adding debug handlers to kubelet server"
May 13 23:45:35.104494 kubelet[2421]: I0513 23:45:35.104429 2421 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 13 23:45:35.105462 kubelet[2421]: I0513 23:45:35.104841 2421 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 13 23:45:35.106208 kubelet[2421]: E0513 23:45:35.105930 2421 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://138.199.236.81:6443/api/v1/namespaces/default/events\": dial tcp 138.199.236.81:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-n-732e99817a.183f3ae4de53b2ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-732e99817a,UID:ci-4284-0-0-n-732e99817a,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-732e99817a,},FirstTimestamp:2025-05-13 23:45:35.100023535 +0000 UTC m=+1.432804117,LastTimestamp:2025-05-13 23:45:35.100023535 +0000 UTC m=+1.432804117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-732e99817a,}"
May 13 23:45:35.106946 kubelet[2421]: I0513 23:45:35.106558 2421 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 13 23:45:35.107501 kubelet[2421]: I0513 23:45:35.107471 2421 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 13 23:45:35.112353 kubelet[2421]: E0513 23:45:35.111046 2421 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found"
May 13 23:45:35.112353 kubelet[2421]: I0513 23:45:35.111138 2421 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 13 23:45:35.112353 kubelet[2421]: I0513 23:45:35.111317 2421 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
May 13 23:45:35.112353 kubelet[2421]: I0513 23:45:35.111367 2421 reconciler.go:26] "Reconciler: start to sync state"
May 13 23:45:35.112353 kubelet[2421]: W0513 23:45:35.111720 2421 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://138.199.236.81:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 138.199.236.81:6443: connect: connection refused
May 13 23:45:35.112353 kubelet[2421]: E0513 23:45:35.111760 2421 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://138.199.236.81:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 138.199.236.81:6443: connect: connection refused" logger="UnhandledError"
May 13 23:45:35.114170 kubelet[2421]: E0513 23:45:35.114090 2421 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.236.81:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-732e99817a?timeout=10s\": dial tcp 138.199.236.81:6443: connect: connection refused" interval="200ms"
May 13 23:45:35.115373 kubelet[2421]: I0513 23:45:35.115343 2421 factory.go:221] Registration of the systemd container factory successfully
May 13 23:45:35.115579 kubelet[2421]: I0513 23:45:35.115558 2421 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 13 23:45:35.117483 kubelet[2421]: I0513 23:45:35.117462 2421 factory.go:221] Registration of the containerd container factory successfully
May 13 23:45:35.124901 kubelet[2421]: E0513 23:45:35.124869 2421 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 13 23:45:35.129817 kubelet[2421]: I0513 23:45:35.129774 2421 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 13 23:45:35.134282 kubelet[2421]: I0513 23:45:35.134238 2421 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 13 23:45:35.134282 kubelet[2421]: I0513 23:45:35.134274 2421 status_manager.go:227] "Starting to sync pod status with apiserver"
May 13 23:45:35.134435 kubelet[2421]: I0513 23:45:35.134298 2421 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 13 23:45:35.134435 kubelet[2421]: I0513 23:45:35.134305 2421 kubelet.go:2388] "Starting kubelet main sync loop"
May 13 23:45:35.134435 kubelet[2421]: E0513 23:45:35.134350 2421 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 13 23:45:35.140563 kubelet[2421]: I0513 23:45:35.140537 2421 cpu_manager.go:221] "Starting CPU manager" policy="none"
May 13 23:45:35.140707 kubelet[2421]: I0513 23:45:35.140696 2421 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
May 13 23:45:35.140763 kubelet[2421]: I0513 23:45:35.140756 2421 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:45:35.143207 kubelet[2421]: I0513 23:45:35.143181 2421 policy_none.go:49] "None policy: Start"
May 13 23:45:35.143349 kubelet[2421]: I0513 23:45:35.143338 2421 memory_manager.go:186] "Starting memorymanager" policy="None"
May 13 23:45:35.143493 kubelet[2421]: I0513 23:45:35.143482 2421 state_mem.go:35] "Initializing new in-memory state store"
May 13 23:45:35.145133 kubelet[2421]: W0513 23:45:35.145002 2421 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://138.199.236.81:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 138.199.236.81:6443: connect: connection refused
May 13 23:45:35.145287 kubelet[2421]: E0513 23:45:35.145141 2421 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://138.199.236.81:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 138.199.236.81:6443: connect: connection refused" logger="UnhandledError"
May 13 23:45:35.151360 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 13 23:45:35.170627 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
May 13 23:45:35.175983 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
May 13 23:45:35.188705 kubelet[2421]: I0513 23:45:35.187642 2421 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 13 23:45:35.188705 kubelet[2421]: I0513 23:45:35.187948 2421 eviction_manager.go:189] "Eviction manager: starting control loop"
May 13 23:45:35.188705 kubelet[2421]: I0513 23:45:35.187966 2421 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 13 23:45:35.188968 kubelet[2421]: I0513 23:45:35.188764 2421 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 13 23:45:35.191433 kubelet[2421]: E0513 23:45:35.191404 2421 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
May 13 23:45:35.191558 kubelet[2421]: E0513 23:45:35.191460 2421 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-n-732e99817a\" not found"
May 13 23:45:35.252738 systemd[1]: Created slice kubepods-burstable-podf63763a596870305a0780083caca1262.slice - libcontainer container kubepods-burstable-podf63763a596870305a0780083caca1262.slice.
May 13 23:45:35.271236 kubelet[2421]: E0513 23:45:35.270921 2421 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a"
May 13 23:45:35.274532 systemd[1]: Created slice kubepods-burstable-pod9ce869bf71179ce9ccb9f6662ffc6652.slice - libcontainer container kubepods-burstable-pod9ce869bf71179ce9ccb9f6662ffc6652.slice.
May 13 23:45:35.277535 kubelet[2421]: E0513 23:45:35.277507 2421 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a"
May 13 23:45:35.291140 systemd[1]: Created slice kubepods-burstable-pod3d8a964e51f55a69682f8cdcaca3e273.slice - libcontainer container kubepods-burstable-pod3d8a964e51f55a69682f8cdcaca3e273.slice.
May 13 23:45:35.294462 kubelet[2421]: I0513 23:45:35.293579 2421 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-732e99817a"
May 13 23:45:35.294462 kubelet[2421]: E0513 23:45:35.293998 2421 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://138.199.236.81:6443/api/v1/nodes\": dial tcp 138.199.236.81:6443: connect: connection refused" node="ci-4284-0-0-n-732e99817a"
May 13 23:45:35.295765 kubelet[2421]: E0513 23:45:35.295743 2421 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312423 kubelet[2421]: I0513 23:45:35.312359 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9ce869bf71179ce9ccb9f6662ffc6652-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-732e99817a\" (UID: \"9ce869bf71179ce9ccb9f6662ffc6652\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312590 kubelet[2421]: I0513 23:45:35.312437 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312590 kubelet[2421]: I0513 23:45:35.312474 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312590 kubelet[2421]: I0513 23:45:35.312501 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312590 kubelet[2421]: I0513 23:45:35.312529 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312737 kubelet[2421]: I0513 23:45:35.312584 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f63763a596870305a0780083caca1262-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-732e99817a\" (UID: \"f63763a596870305a0780083caca1262\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312737 kubelet[2421]: I0513 23:45:35.312622 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9ce869bf71179ce9ccb9f6662ffc6652-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-732e99817a\" (UID: \"9ce869bf71179ce9ccb9f6662ffc6652\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312737 kubelet[2421]: I0513 23:45:35.312647 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9ce869bf71179ce9ccb9f6662ffc6652-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-732e99817a\" (UID: \"9ce869bf71179ce9ccb9f6662ffc6652\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.312737 kubelet[2421]: I0513 23:45:35.312675 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a"
May 13 23:45:35.315583 kubelet[2421]: E0513 23:45:35.315521 2421 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.236.81:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-732e99817a?timeout=10s\": dial tcp 138.199.236.81:6443: connect: connection refused" interval="400ms"
May 13 23:45:35.507187 kubelet[2421]: I0513 23:45:35.496838 2421 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-732e99817a"
May 13 23:45:35.507939 kubelet[2421]: E0513 23:45:35.507842 2421 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://138.199.236.81:6443/api/v1/nodes\": dial tcp 138.199.236.81:6443: connect: connection refused" node="ci-4284-0-0-n-732e99817a"
May 13 23:45:35.574843 containerd[1504]: time="2025-05-13T23:45:35.574146068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-732e99817a,Uid:f63763a596870305a0780083caca1262,Namespace:kube-system,Attempt:0,}"
May 13 23:45:35.582473 containerd[1504]: time="2025-05-13T23:45:35.582206733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-732e99817a,Uid:9ce869bf71179ce9ccb9f6662ffc6652,Namespace:kube-system,Attempt:0,}"
May 13 23:45:35.598516 containerd[1504]: time="2025-05-13T23:45:35.598282944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-732e99817a,Uid:3d8a964e51f55a69682f8cdcaca3e273,Namespace:kube-system,Attempt:0,}"
May 13 23:45:35.619808 containerd[1504]: time="2025-05-13T23:45:35.619627130Z" level=info msg="connecting to shim db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c" address="unix:///run/containerd/s/93612302044af4e350d606a86221cc76236bdbf5e8fbc314ae7dac098529ca52" namespace=k8s.io protocol=ttrpc version=3
May 13 23:45:35.647406 containerd[1504]: time="2025-05-13T23:45:35.646830942Z" level=info msg="connecting to shim 64210b927d7725f252706030e2b5f1e64c6ce39a8b910aefd8c6c7e881abf001" address="unix:///run/containerd/s/15e5ed63b034a7b158b61dd7854c90cb4d6203c4414d472cfce2a7f5df8f5c85" namespace=k8s.io protocol=ttrpc version=3
May 13 23:45:35.674968 containerd[1504]: time="2025-05-13T23:45:35.674927290Z" level=info msg="connecting to shim d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51" address="unix:///run/containerd/s/2c03b64b8fd65efaa8d716073da9eb54b5a1000a6652c713832344d38e8704c1" namespace=k8s.io protocol=ttrpc version=3
May 13 23:45:35.677499 systemd[1]: Started cri-containerd-db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c.scope - libcontainer container db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c.
May 13 23:45:35.703302 systemd[1]: Started cri-containerd-64210b927d7725f252706030e2b5f1e64c6ce39a8b910aefd8c6c7e881abf001.scope - libcontainer container 64210b927d7725f252706030e2b5f1e64c6ce39a8b910aefd8c6c7e881abf001.
May 13 23:45:35.711829 systemd[1]: Started cri-containerd-d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51.scope - libcontainer container d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51.
May 13 23:45:35.718449 kubelet[2421]: E0513 23:45:35.718173 2421 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.236.81:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-732e99817a?timeout=10s\": dial tcp 138.199.236.81:6443: connect: connection refused" interval="800ms"
May 13 23:45:35.765969 containerd[1504]: time="2025-05-13T23:45:35.765471607Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-732e99817a,Uid:f63763a596870305a0780083caca1262,Namespace:kube-system,Attempt:0,} returns sandbox id \"db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c\""
May 13 23:45:35.777632 containerd[1504]: time="2025-05-13T23:45:35.777565145Z" level=info msg="CreateContainer within sandbox \"db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
May 13 23:45:35.792246 containerd[1504]: time="2025-05-13T23:45:35.792009046Z" level=info msg="Container 152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55: CDI devices from CRI Config.CDIDevices: []"
May 13 23:45:35.794948 containerd[1504]: time="2025-05-13T23:45:35.794820497Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-732e99817a,Uid:9ce869bf71179ce9ccb9f6662ffc6652,Namespace:kube-system,Attempt:0,} returns sandbox id \"64210b927d7725f252706030e2b5f1e64c6ce39a8b910aefd8c6c7e881abf001\""
May 13 23:45:35.798144 containerd[1504]: time="2025-05-13T23:45:35.797632988Z" level=info msg="CreateContainer within sandbox \"64210b927d7725f252706030e2b5f1e64c6ce39a8b910aefd8c6c7e881abf001\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
May 13 23:45:35.807418 containerd[1504]: time="2025-05-13T23:45:35.807354124Z" level=info msg="Container 54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb: CDI devices from CRI Config.CDIDevices: []"
May 13 23:45:35.808911 containerd[1504]: time="2025-05-13T23:45:35.808868791Z" level=info msg="CreateContainer within sandbox \"db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55\""
May 13 23:45:35.809447 containerd[1504]: time="2025-05-13T23:45:35.809416801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-732e99817a,Uid:3d8a964e51f55a69682f8cdcaca3e273,Namespace:kube-system,Attempt:0,} returns sandbox id \"d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51\""
May 13 23:45:35.820911 containerd[1504]: time="2025-05-13T23:45:35.819093176Z" level=info msg="StartContainer for \"152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55\""
May 13 23:45:35.820911 containerd[1504]: time="2025-05-13T23:45:35.819795589Z" level=info msg="CreateContainer within sandbox \"64210b927d7725f252706030e2b5f1e64c6ce39a8b910aefd8c6c7e881abf001\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb\""
May 13 23:45:35.820911 containerd[1504]: time="2025-05-13T23:45:35.820842728Z" level=info msg="StartContainer for \"54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb\""
May 13 23:45:35.821722 containerd[1504]: time="2025-05-13T23:45:35.821666463Z" level=info msg="connecting to shim 152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55" address="unix:///run/containerd/s/93612302044af4e350d606a86221cc76236bdbf5e8fbc314ae7dac098529ca52" protocol=ttrpc version=3
May 13 23:45:35.822325 containerd[1504]: time="2025-05-13T23:45:35.822268274Z" level=info msg="connecting to shim 54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb" address="unix:///run/containerd/s/15e5ed63b034a7b158b61dd7854c90cb4d6203c4414d472cfce2a7f5df8f5c85" protocol=ttrpc version=3
May 13 23:45:35.825339 containerd[1504]: time="2025-05-13T23:45:35.824581075Z" level=info msg="CreateContainer within sandbox \"d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
May 13 23:45:35.840492 containerd[1504]: time="2025-05-13T23:45:35.840430042Z" level=info msg="Container 33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9: CDI devices from CRI Config.CDIDevices: []"
May 13 23:45:35.852300 systemd[1]: Started cri-containerd-54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb.scope - libcontainer container 54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb.
May 13 23:45:35.861136 containerd[1504]: time="2025-05-13T23:45:35.860574326Z" level=info msg="CreateContainer within sandbox \"d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9\""
May 13 23:45:35.861293 systemd[1]: Started cri-containerd-152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55.scope - libcontainer container 152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55.
May 13 23:45:35.862871 containerd[1504]: time="2025-05-13T23:45:35.862732765Z" level=info msg="StartContainer for \"33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9\"" May 13 23:45:35.864473 containerd[1504]: time="2025-05-13T23:45:35.864436636Z" level=info msg="connecting to shim 33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9" address="unix:///run/containerd/s/2c03b64b8fd65efaa8d716073da9eb54b5a1000a6652c713832344d38e8704c1" protocol=ttrpc version=3 May 13 23:45:35.895615 systemd[1]: Started cri-containerd-33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9.scope - libcontainer container 33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9. May 13 23:45:35.912392 kubelet[2421]: I0513 23:45:35.912159 2421 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-732e99817a" May 13 23:45:35.912812 kubelet[2421]: E0513 23:45:35.912629 2421 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://138.199.236.81:6443/api/v1/nodes\": dial tcp 138.199.236.81:6443: connect: connection refused" node="ci-4284-0-0-n-732e99817a" May 13 23:45:35.920147 kubelet[2421]: W0513 23:45:35.919750 2421 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://138.199.236.81:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 138.199.236.81:6443: connect: connection refused May 13 23:45:35.920147 kubelet[2421]: E0513 23:45:35.919817 2421 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://138.199.236.81:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 138.199.236.81:6443: connect: connection refused" logger="UnhandledError" May 13 23:45:35.931152 containerd[1504]: time="2025-05-13T23:45:35.930446589Z" level=info msg="StartContainer for \"54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb\" returns successfully" May 13 23:45:35.958319 containerd[1504]: time="2025-05-13T23:45:35.958278613Z" level=info msg="StartContainer for \"152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55\" returns successfully" May 13 23:45:35.975163 kubelet[2421]: W0513 23:45:35.975030 2421 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://138.199.236.81:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-732e99817a&limit=500&resourceVersion=0": dial tcp 138.199.236.81:6443: connect: connection refused May 13 23:45:35.975163 kubelet[2421]: E0513 23:45:35.975156 2421 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://138.199.236.81:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-732e99817a&limit=500&resourceVersion=0\": dial tcp 138.199.236.81:6443: connect: connection refused" logger="UnhandledError" May 13 23:45:35.988689 containerd[1504]: time="2025-05-13T23:45:35.988543440Z" level=info msg="StartContainer for \"33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9\" returns successfully" May 13 23:45:36.156476 kubelet[2421]: E0513 23:45:36.156361 2421 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a" May 13 23:45:36.159098 kubelet[2421]: E0513 
23:45:36.159045 2421 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a" May 13 23:45:36.164008 kubelet[2421]: E0513 23:45:36.163977 2421 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a" May 13 23:45:36.718193 kubelet[2421]: I0513 23:45:36.718156 2421 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-732e99817a" May 13 23:45:37.170855 kubelet[2421]: E0513 23:45:37.170492 2421 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a" May 13 23:45:37.170855 kubelet[2421]: E0513 23:45:37.170734 2421 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a" May 13 23:45:37.962688 kubelet[2421]: E0513 23:45:37.962644 2421 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-n-732e99817a\" not found" node="ci-4284-0-0-n-732e99817a" May 13 23:45:38.145370 kubelet[2421]: I0513 23:45:38.145298 2421 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284-0-0-n-732e99817a" May 13 23:45:38.145370 kubelet[2421]: E0513 23:45:38.145357 2421 kubelet_node_status.go:549] "Error updating node status, will retry" err="error getting node \"ci-4284-0-0-n-732e99817a\": node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:38.151933 kubelet[2421]: E0513 23:45:38.151896 2421 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:38.252874 kubelet[2421]: E0513 23:45:38.252817 2421 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:38.353755 kubelet[2421]: E0513 23:45:38.353702 2421 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:38.454623 kubelet[2421]: E0513 23:45:38.454561 2421 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:38.555436 kubelet[2421]: E0513 23:45:38.555298 2421 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:38.656399 kubelet[2421]: E0513 23:45:38.656335 2421 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:38.757040 kubelet[2421]: E0513 23:45:38.756977 2421 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:38.904643 kubelet[2421]: I0513 23:45:38.904182 2421 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-732e99817a" May 13 23:45:38.914432 kubelet[2421]: I0513 23:45:38.912488 2421 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-732e99817a" May 13 23:45:38.921764 kubelet[2421]: I0513 23:45:38.921711 2421 kubelet.go:3200] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" May 13 23:45:38.924285 kubelet[2421]: E0513 23:45:38.924235 2421 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284-0-0-n-732e99817a\" already exists" pod="kube-system/kube-scheduler-ci-4284-0-0-n-732e99817a" May 13 23:45:38.929489 kubelet[2421]: I0513 23:45:38.929451 2421 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:39.099950 kubelet[2421]: I0513 23:45:39.099738 2421 apiserver.go:52] "Watching apiserver" May 13 23:45:39.113417 kubelet[2421]: I0513 23:45:39.113360 2421 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 13 23:45:40.618475 systemd[1]: Reload requested from client PID 2690 ('systemctl') (unit session-7.scope)... May 13 23:45:40.618509 systemd[1]: Reloading... May 13 23:45:40.739377 zram_generator::config[2744]: No configuration found. May 13 23:45:40.826783 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:45:40.939258 systemd[1]: Reloading finished in 320 ms. May 13 23:45:40.965798 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:45:40.981357 systemd[1]: kubelet.service: Deactivated successfully. May 13 23:45:40.981822 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:45:40.981921 systemd[1]: kubelet.service: Consumed 1.887s CPU time, 125.5M memory peak. May 13 23:45:40.985990 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:45:41.161933 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:45:41.173678 (kubelet)[2780]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 23:45:41.234806 kubelet[2780]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:45:41.234806 kubelet[2780]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 13 23:45:41.234806 kubelet[2780]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:45:41.234806 kubelet[2780]: I0513 23:45:41.233954 2780 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 23:45:41.241250 kubelet[2780]: I0513 23:45:41.241054 2780 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" May 13 23:45:41.241250 kubelet[2780]: I0513 23:45:41.241243 2780 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 23:45:41.241674 kubelet[2780]: I0513 23:45:41.241531 2780 server.go:954] "Client rotation is on, will bootstrap in background" May 13 23:45:41.243220 kubelet[2780]: I0513 23:45:41.243162 2780 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
May 13 23:45:41.246024 kubelet[2780]: I0513 23:45:41.245747 2780 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 23:45:41.255319 kubelet[2780]: I0513 23:45:41.255289 2780 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 23:45:41.259061 kubelet[2780]: I0513 23:45:41.259014 2780 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 13 23:45:41.259554 kubelet[2780]: I0513 23:45:41.259422 2780 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 23:45:41.259783 kubelet[2780]: I0513 23:45:41.259460 2780 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-732e99817a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 13 23:45:41.259783 kubelet[2780]: I0513 23:45:41.259733 2780 topology_manager.go:138] "Creating topology manager with none policy" May 13 23:45:41.259783 kubelet[2780]: I0513 23:45:41.259742 2780 container_manager_linux.go:304] "Creating device plugin manager" May 13 23:45:41.259926 kubelet[2780]: I0513 23:45:41.259794 2780 state_mem.go:36] "Initialized new in-memory state store" May 13 23:45:41.259949 kubelet[2780]: I0513 23:45:41.259934 2780 kubelet.go:446] "Attempting to sync node with API server" May 13 23:45:41.259949 kubelet[2780]: I0513 23:45:41.259945 2780 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 23:45:41.259986 kubelet[2780]: I0513 23:45:41.259970 2780 kubelet.go:352] "Adding apiserver pod source" May 13 23:45:41.259986 kubelet[2780]: I0513 23:45:41.259982 2780 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 23:45:41.268051 kubelet[2780]: I0513 23:45:41.265663 2780 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 13 23:45:41.268950 kubelet[2780]: I0513 23:45:41.268634 2780 kubelet.go:890] "Not starting 
ClusterTrustBundle informer because we are in static kubelet mode" May 13 23:45:41.271081 kubelet[2780]: I0513 23:45:41.269956 2780 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 13 23:45:41.271310 kubelet[2780]: I0513 23:45:41.271293 2780 server.go:1287] "Started kubelet" May 13 23:45:41.280945 kubelet[2780]: I0513 23:45:41.280898 2780 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 23:45:41.281723 kubelet[2780]: I0513 23:45:41.281679 2780 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 13 23:45:41.284900 kubelet[2780]: I0513 23:45:41.284775 2780 server.go:490] "Adding debug handlers to kubelet server" May 13 23:45:41.288687 kubelet[2780]: I0513 23:45:41.287258 2780 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 23:45:41.289395 kubelet[2780]: I0513 23:45:41.289015 2780 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 23:45:41.289602 kubelet[2780]: I0513 23:45:41.288716 2780 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 23:45:41.290589 kubelet[2780]: I0513 23:45:41.290559 2780 volume_manager.go:297] "Starting Kubelet Volume Manager" May 13 23:45:41.291016 kubelet[2780]: E0513 23:45:41.290833 2780 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-732e99817a\" not found" May 13 23:45:41.293075 kubelet[2780]: I0513 23:45:41.293039 2780 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 13 23:45:41.294748 kubelet[2780]: I0513 23:45:41.293199 2780 reconciler.go:26] "Reconciler: start to sync state" May 13 23:45:41.295284 kubelet[2780]: I0513 23:45:41.295171 2780 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 23:45:41.296150 kubelet[2780]: I0513 23:45:41.296132 2780 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 13 23:45:41.296201 kubelet[2780]: I0513 23:45:41.296155 2780 status_manager.go:227] "Starting to sync pod status with apiserver" May 13 23:45:41.296201 kubelet[2780]: I0513 23:45:41.296174 2780 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 13 23:45:41.296201 kubelet[2780]: I0513 23:45:41.296180 2780 kubelet.go:2388] "Starting kubelet main sync loop" May 13 23:45:41.296377 kubelet[2780]: E0513 23:45:41.296269 2780 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 23:45:41.311078 kubelet[2780]: I0513 23:45:41.309022 2780 factory.go:221] Registration of the systemd container factory successfully May 13 23:45:41.311078 kubelet[2780]: I0513 23:45:41.309180 2780 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 23:45:41.313412 kubelet[2780]: I0513 23:45:41.313148 2780 factory.go:221] Registration of the containerd container factory successfully May 13 23:45:41.315982 kubelet[2780]: E0513 23:45:41.315959 2780 kubelet.go:1561] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 23:45:41.396464 kubelet[2780]: E0513 23:45:41.396406 2780 kubelet.go:2412] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 23:45:41.401534 kubelet[2780]: I0513 23:45:41.401282 2780 cpu_manager.go:221] "Starting CPU manager" policy="none" May 13 23:45:41.401534 kubelet[2780]: I0513 23:45:41.401303 2780 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 13 23:45:41.401534 kubelet[2780]: I0513 23:45:41.401323 2780 state_mem.go:36] "Initialized new in-memory state store" May 13 23:45:41.402026 kubelet[2780]: I0513 23:45:41.402011 2780 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 13 23:45:41.402246 kubelet[2780]: I0513 23:45:41.402126 2780 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 13 23:45:41.402246 kubelet[2780]: I0513 23:45:41.402179 2780 policy_none.go:49] "None policy: Start" May 13 23:45:41.402246 kubelet[2780]: I0513 23:45:41.402208 2780 memory_manager.go:186] "Starting memorymanager" policy="None" May 13 23:45:41.402246 kubelet[2780]: I0513 23:45:41.402222 2780 state_mem.go:35] "Initializing new in-memory state store" May 13 23:45:41.402615 kubelet[2780]: I0513 23:45:41.402522 2780 state_mem.go:75] "Updated machine memory state" May 13 23:45:41.409339 kubelet[2780]: I0513 23:45:41.408535 2780 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 23:45:41.409339 kubelet[2780]: I0513 23:45:41.408721 2780 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 23:45:41.409339 kubelet[2780]: I0513 23:45:41.408732 2780 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 23:45:41.409339 kubelet[2780]: I0513 23:45:41.408945 2780 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 23:45:41.417202 kubelet[2780]: E0513 23:45:41.417152 2780 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 13 23:45:41.530321 kubelet[2780]: I0513 23:45:41.530280 2780 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-732e99817a" May 13 23:45:41.546679 kubelet[2780]: I0513 23:45:41.546636 2780 kubelet_node_status.go:125] "Node was previously registered" node="ci-4284-0-0-n-732e99817a" May 13 23:45:41.546886 kubelet[2780]: I0513 23:45:41.546739 2780 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284-0-0-n-732e99817a" May 13 23:45:41.598506 kubelet[2780]: I0513 23:45:41.597881 2780 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" May 13 23:45:41.598506 kubelet[2780]: I0513 23:45:41.597907 2780 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:41.599044 kubelet[2780]: I0513 23:45:41.599018 2780 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-732e99817a" May 13 23:45:41.611630 kubelet[2780]: E0513 23:45:41.611256 2780 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284-0-0-n-732e99817a\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" May 13 23:45:41.611630 kubelet[2780]: E0513 23:45:41.611408 2780 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284-0-0-n-732e99817a\" already exists" pod="kube-system/kube-scheduler-ci-4284-0-0-n-732e99817a" May 13 23:45:41.611630 kubelet[2780]: E0513 23:45:41.611463 2780 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" already exists" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697080 kubelet[2780]: I0513 23:45:41.695538 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697080 kubelet[2780]: I0513 23:45:41.695584 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697080 kubelet[2780]: I0513 23:45:41.695603 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697080 kubelet[2780]: I0513 23:45:41.695621 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " 
pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697080 kubelet[2780]: I0513 23:45:41.695648 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9ce869bf71179ce9ccb9f6662ffc6652-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-732e99817a\" (UID: \"9ce869bf71179ce9ccb9f6662ffc6652\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697321 kubelet[2780]: I0513 23:45:41.695666 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3d8a964e51f55a69682f8cdcaca3e273-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" (UID: \"3d8a964e51f55a69682f8cdcaca3e273\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697321 kubelet[2780]: I0513 23:45:41.697158 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f63763a596870305a0780083caca1262-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-732e99817a\" (UID: \"f63763a596870305a0780083caca1262\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697321 kubelet[2780]: I0513 23:45:41.697250 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9ce869bf71179ce9ccb9f6662ffc6652-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-732e99817a\" (UID: \"9ce869bf71179ce9ccb9f6662ffc6652\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" May 13 23:45:41.697321 kubelet[2780]: I0513 23:45:41.697278 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9ce869bf71179ce9ccb9f6662ffc6652-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-732e99817a\" (UID: \"9ce869bf71179ce9ccb9f6662ffc6652\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" May 13 23:45:42.267149 kubelet[2780]: I0513 23:45:42.265480 2780 apiserver.go:52] "Watching apiserver" May 13 23:45:42.294133 kubelet[2780]: I0513 23:45:42.293171 2780 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 13 23:45:42.362471 kubelet[2780]: I0513 23:45:42.361432 2780 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:42.362471 kubelet[2780]: I0513 23:45:42.361971 2780 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" May 13 23:45:42.389344 kubelet[2780]: E0513 23:45:42.388820 2780 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284-0-0-n-732e99817a\" already exists" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" May 13 23:45:42.390581 kubelet[2780]: E0513 23:45:42.390555 2780 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284-0-0-n-732e99817a\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" May 13 23:45:42.550298 kubelet[2780]: I0513 23:45:42.550015 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-n-732e99817a" podStartSLOduration=4.549993598 
podStartE2EDuration="4.549993598s" podCreationTimestamp="2025-05-13 23:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:45:42.498345823 +0000 UTC m=+1.316611543" watchObservedRunningTime="2025-05-13 23:45:42.549993598 +0000 UTC m=+1.368259278" May 13 23:45:42.551036 kubelet[2780]: I0513 23:45:42.550955 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-732e99817a" podStartSLOduration=4.550941371 podStartE2EDuration="4.550941371s" podCreationTimestamp="2025-05-13 23:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:45:42.550573886 +0000 UTC m=+1.368839646" watchObservedRunningTime="2025-05-13 23:45:42.550941371 +0000 UTC m=+1.369207051" May 13 23:45:42.620897 kubelet[2780]: I0513 23:45:42.620830 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-n-732e99817a" podStartSLOduration=4.619055528 podStartE2EDuration="4.619055528s" podCreationTimestamp="2025-05-13 23:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:45:42.595900976 +0000 UTC m=+1.414166656" watchObservedRunningTime="2025-05-13 23:45:42.619055528 +0000 UTC m=+1.437321208" May 13 23:45:45.952981 kubelet[2780]: I0513 23:45:45.952786 2780 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 13 23:45:45.953838 containerd[1504]: time="2025-05-13T23:45:45.953499957Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 13 23:45:45.954445 kubelet[2780]: I0513 23:45:45.954371 2780 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 13 23:45:46.714787 systemd[1]: Created slice kubepods-besteffort-pod17a56264_c96a_421b_95ae_d518c1dc7f16.slice - libcontainer container kubepods-besteffort-pod17a56264_c96a_421b_95ae_d518c1dc7f16.slice. 
May 13 23:45:46.737125 kubelet[2780]: I0513 23:45:46.736872 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/17a56264-c96a-421b-95ae-d518c1dc7f16-xtables-lock\") pod \"kube-proxy-6bp5g\" (UID: \"17a56264-c96a-421b-95ae-d518c1dc7f16\") " pod="kube-system/kube-proxy-6bp5g" May 13 23:45:46.737125 kubelet[2780]: I0513 23:45:46.736929 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzmz\" (UniqueName: \"kubernetes.io/projected/17a56264-c96a-421b-95ae-d518c1dc7f16-kube-api-access-6fzmz\") pod \"kube-proxy-6bp5g\" (UID: \"17a56264-c96a-421b-95ae-d518c1dc7f16\") " pod="kube-system/kube-proxy-6bp5g" May 13 23:45:46.737125 kubelet[2780]: I0513 23:45:46.736969 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/17a56264-c96a-421b-95ae-d518c1dc7f16-kube-proxy\") pod \"kube-proxy-6bp5g\" (UID: \"17a56264-c96a-421b-95ae-d518c1dc7f16\") " pod="kube-system/kube-proxy-6bp5g" May 13 23:45:46.737125 kubelet[2780]: I0513 23:45:46.736993 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17a56264-c96a-421b-95ae-d518c1dc7f16-lib-modules\") pod \"kube-proxy-6bp5g\" (UID: \"17a56264-c96a-421b-95ae-d518c1dc7f16\") " pod="kube-system/kube-proxy-6bp5g" May 13 23:45:46.947520 sudo[1876]: pam_unix(sudo:session): session closed for user root May 13 23:45:47.025484 containerd[1504]: time="2025-05-13T23:45:47.025441758Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6bp5g,Uid:17a56264-c96a-421b-95ae-d518c1dc7f16,Namespace:kube-system,Attempt:0,}" May 13 23:45:47.059466 containerd[1504]: time="2025-05-13T23:45:47.057872747Z" level=info msg="connecting to shim e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1" address="unix:///run/containerd/s/cbe9ce1b5ab7a5799bc2750c7be835519391d3ef4ed822c54c50607d14c6989c" namespace=k8s.io protocol=ttrpc version=3 May 13 23:45:47.091332 systemd[1]: Started cri-containerd-e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1.scope - libcontainer container e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1. May 13 23:45:47.103169 systemd[1]: Created slice kubepods-besteffort-pod458eda0b_c132_4a02_9169_ebb10765d15d.slice - libcontainer container kubepods-besteffort-pod458eda0b_c132_4a02_9169_ebb10765d15d.slice. May 13 23:45:47.107971 sshd[1875]: Connection closed by 139.178.89.65 port 35318 May 13 23:45:47.112280 sshd-session[1873]: pam_unix(sshd:session): session closed for user core May 13 23:45:47.117047 systemd[1]: sshd@7-138.199.236.81:22-139.178.89.65:35318.service: Deactivated successfully. May 13 23:45:47.122696 systemd[1]: session-7.scope: Deactivated successfully. May 13 23:45:47.124241 systemd[1]: session-7.scope: Consumed 5.856s CPU time, 224.7M memory peak. May 13 23:45:47.127603 systemd-logind[1474]: Session 7 logged out. Waiting for processes to exit. May 13 23:45:47.129541 systemd-logind[1474]: Removed session 7. 
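
The kubelet entries throughout this capture carry the klog header format: severity letter, MMDD date, wall-clock time, PID, source file:line, then the message. A regexp sketch for splitting that header out when post-processing a log like this one; the pattern and field names are assumptions of this sketch, not anything kubelet ships:

package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches lines like
// I0513 23:45:47.140790 2780 reconciler_common.go:251] "..."
var klogHeader = regexp.MustCompile(
	`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+:\d+)\] (.*)$`)

func main() {
	line := `I0513 23:45:47.140790 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume ..."`
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog line")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmsg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}
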
May 13 23:45:47.140629 containerd[1504]: time="2025-05-13T23:45:47.140551555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6bp5g,Uid:17a56264-c96a-421b-95ae-d518c1dc7f16,Namespace:kube-system,Attempt:0,} returns sandbox id \"e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1\"" May 13 23:45:47.140941 kubelet[2780]: I0513 23:45:47.140790 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/458eda0b-c132-4a02-9169-ebb10765d15d-var-lib-calico\") pod \"tigera-operator-789496d6f5-w8hcx\" (UID: \"458eda0b-c132-4a02-9169-ebb10765d15d\") " pod="tigera-operator/tigera-operator-789496d6f5-w8hcx" May 13 23:45:47.140941 kubelet[2780]: I0513 23:45:47.140848 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hbnx\" (UniqueName: \"kubernetes.io/projected/458eda0b-c132-4a02-9169-ebb10765d15d-kube-api-access-5hbnx\") pod \"tigera-operator-789496d6f5-w8hcx\" (UID: \"458eda0b-c132-4a02-9169-ebb10765d15d\") " pod="tigera-operator/tigera-operator-789496d6f5-w8hcx" May 13 23:45:47.150124 containerd[1504]: time="2025-05-13T23:45:47.150004377Z" level=info msg="CreateContainer within sandbox \"e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 13 23:45:47.163099 containerd[1504]: time="2025-05-13T23:45:47.161428820Z" level=info msg="Container 45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907: CDI devices from CRI Config.CDIDevices: []" May 13 23:45:47.174048 containerd[1504]: time="2025-05-13T23:45:47.173990395Z" level=info msg="CreateContainer within sandbox \"e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907\"" May 13 23:45:47.176821 containerd[1504]: time="2025-05-13T23:45:47.175356529Z" level=info msg="StartContainer for \"45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907\"" May 13 23:45:47.177916 containerd[1504]: time="2025-05-13T23:45:47.177884796Z" level=info msg="connecting to shim 45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907" address="unix:///run/containerd/s/cbe9ce1b5ab7a5799bc2750c7be835519391d3ef4ed822c54c50607d14c6989c" protocol=ttrpc version=3 May 13 23:45:47.201300 systemd[1]: Started cri-containerd-45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907.scope - libcontainer container 45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907. 
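
Worth noting above: the kube-proxy container 45c165aa… connects to the same shim socket (…/s/cbe9ce1b…) that was used for its sandbox e634c64f…, i.e. every container in a pod is routed through that pod's one shim. A sketch that recovers this grouping from "connecting to shim" messages; the two sample lines are trimmed copies of entries in this log:

package main

import (
	"fmt"
	"regexp"
)

var shimMsg = regexp.MustCompile(`connecting to shim (\w+)" address="unix://([^"]+)"`)

func main() {
	lines := []string{
		`level=info msg="connecting to shim e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1" address="unix:///run/containerd/s/cbe9ce1b5ab7a5799bc2750c7be835519391d3ef4ed822c54c50607d14c6989c" namespace=k8s.io protocol=ttrpc version=3`,
		`level=info msg="connecting to shim 45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907" address="unix:///run/containerd/s/cbe9ce1b5ab7a5799bc2750c7be835519391d3ef4ed822c54c50607d14c6989c" protocol=ttrpc version=3`,
	}
	// Group shim clients (sandboxes and containers) by socket path:
	// ids sharing a socket belong to the same pod-level shim.
	byAddr := map[string][]string{}
	for _, l := range lines {
		if m := shimMsg.FindStringSubmatch(l); m != nil {
			byAddr[m[2]] = append(byAddr[m[2]], m[1])
		}
	}
	for addr, ids := range byAddr {
		fmt.Println(addr, "->", ids)
	}
}
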
May 13 23:45:47.247989 containerd[1504]: time="2025-05-13T23:45:47.247941909Z" level=info msg="StartContainer for \"45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907\" returns successfully" May 13 23:45:47.392298 kubelet[2780]: I0513 23:45:47.390922 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-6bp5g" podStartSLOduration=1.390902526 podStartE2EDuration="1.390902526s" podCreationTimestamp="2025-05-13 23:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:45:47.390867846 +0000 UTC m=+6.209133566" watchObservedRunningTime="2025-05-13 23:45:47.390902526 +0000 UTC m=+6.209168206" May 13 23:45:47.408891 containerd[1504]: time="2025-05-13T23:45:47.408847239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-w8hcx,Uid:458eda0b-c132-4a02-9169-ebb10765d15d,Namespace:tigera-operator,Attempt:0,}" May 13 23:45:47.446031 containerd[1504]: time="2025-05-13T23:45:47.445620834Z" level=info msg="connecting to shim 5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4" address="unix:///run/containerd/s/cd343983996a2457f24e26f8ba7bb8abc41075cd91cae3510f15ae5fd8fc4ea4" namespace=k8s.io protocol=ttrpc version=3 May 13 23:45:47.473397 systemd[1]: Started cri-containerd-5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4.scope - libcontainer container 5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4. May 13 23:45:47.517831 containerd[1504]: time="2025-05-13T23:45:47.517775570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-w8hcx,Uid:458eda0b-c132-4a02-9169-ebb10765d15d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4\"" May 13 23:45:47.521044 containerd[1504]: time="2025-05-13T23:45:47.520982524Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 13 23:45:49.387837 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1125420621.mount: Deactivated successfully. 
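
The operator image above is pulled by reference quay.io/tigera/operator:v1.36.7. A small sketch splitting such a reference into registry, repository, and tag; this is a deliberate simplification (it ignores digests and the implicit docker.io/latest defaults that real reference parsers handle), and splitRef is my own helper:

package main

import (
	"fmt"
	"strings"
)

// splitRef naively splits "registry/repo/path:tag". It assumes the first
// path segment is the registry and a trailing ":tag" is present, which
// holds for the reference in this log but not for every valid reference.
func splitRef(ref string) (registry, repo, tag string) {
	repo = ref
	if i := strings.LastIndex(ref, ":"); i > strings.LastIndex(ref, "/") {
		repo, tag = ref[:i], ref[i+1:]
	}
	if i := strings.Index(repo, "/"); i >= 0 {
		registry, repo = repo[:i], repo[i+1:]
	}
	return
}

func main() {
	reg, repo, tag := splitRef("quay.io/tigera/operator:v1.36.7")
	fmt.Printf("registry=%s repository=%s tag=%s\n", reg, repo, tag)
	// registry=quay.io repository=tigera/operator tag=v1.36.7
}
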
May 13 23:45:49.808709 containerd[1504]: time="2025-05-13T23:45:49.808649234Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:45:49.809891 containerd[1504]: time="2025-05-13T23:45:49.809647163Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=19323084" May 13 23:45:49.811227 containerd[1504]: time="2025-05-13T23:45:49.811178858Z" level=info msg="ImageCreate event name:\"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:45:49.813988 containerd[1504]: time="2025-05-13T23:45:49.813702683Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:45:49.814472 containerd[1504]: time="2025-05-13T23:45:49.814438450Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"19319079\" in 2.293414166s" May 13 23:45:49.814472 containerd[1504]: time="2025-05-13T23:45:49.814471291Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\"" May 13 23:45:49.818163 containerd[1504]: time="2025-05-13T23:45:49.818123086Z" level=info msg="CreateContainer within sandbox \"5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 13 23:45:49.828326 containerd[1504]: time="2025-05-13T23:45:49.827619539Z" level=info msg="Container c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878: CDI devices from CRI Config.CDIDevices: []" May 13 23:45:49.832236 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount768863207.mount: Deactivated successfully. May 13 23:45:49.838217 containerd[1504]: time="2025-05-13T23:45:49.838166242Z" level=info msg="CreateContainer within sandbox \"5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\"" May 13 23:45:49.841838 containerd[1504]: time="2025-05-13T23:45:49.840433864Z" level=info msg="StartContainer for \"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\"" May 13 23:45:49.841838 containerd[1504]: time="2025-05-13T23:45:49.841398474Z" level=info msg="connecting to shim c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878" address="unix:///run/containerd/s/cd343983996a2457f24e26f8ba7bb8abc41075cd91cae3510f15ae5fd8fc4ea4" protocol=ttrpc version=3 May 13 23:45:49.868382 systemd[1]: Started cri-containerd-c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878.scope - libcontainer container c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878. 
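
containerd above reports 19323084 bytes read for the operator image and a pull completed "in 2.293414166s" (the stored image is 19319079 bytes). Dividing the two gives the average pull rate, roughly 8 MiB/s here; a one-file check using only values copied from the log:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the PullImage entries above.
	const bytesRead = 19323084.0
	d, err := time.ParseDuration("2.293414166s")
	if err != nil {
		panic(err)
	}
	rate := bytesRead / d.Seconds()
	fmt.Printf("average pull rate: %.0f B/s (%.2f MiB/s)\n", rate, rate/(1<<20))
}
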
May 13 23:45:49.911373 containerd[1504]: time="2025-05-13T23:45:49.911249077Z" level=info msg="StartContainer for \"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\" returns successfully" May 13 23:45:52.544542 kubelet[2780]: I0513 23:45:52.544465 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-789496d6f5-w8hcx" podStartSLOduration=4.247921633 podStartE2EDuration="6.544446751s" podCreationTimestamp="2025-05-13 23:45:46 +0000 UTC" firstStartedPulling="2025-05-13 23:45:47.519150584 +0000 UTC m=+6.337416264" lastFinishedPulling="2025-05-13 23:45:49.815675742 +0000 UTC m=+8.633941382" observedRunningTime="2025-05-13 23:45:50.412754549 +0000 UTC m=+9.231020269" watchObservedRunningTime="2025-05-13 23:45:52.544446751 +0000 UTC m=+11.362712431" May 13 23:45:54.101822 systemd[1]: Created slice kubepods-besteffort-podd078a62c_ac98_4918_b0ab_53fa4ca1a484.slice - libcontainer container kubepods-besteffort-podd078a62c_ac98_4918_b0ab_53fa4ca1a484.slice. May 13 23:45:54.192287 kubelet[2780]: I0513 23:45:54.192244 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dtrj\" (UniqueName: \"kubernetes.io/projected/d078a62c-ac98-4918-b0ab-53fa4ca1a484-kube-api-access-8dtrj\") pod \"calico-typha-5d5564856f-2jt98\" (UID: \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\") " pod="calico-system/calico-typha-5d5564856f-2jt98" May 13 23:45:54.192697 kubelet[2780]: I0513 23:45:54.192288 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d078a62c-ac98-4918-b0ab-53fa4ca1a484-typha-certs\") pod \"calico-typha-5d5564856f-2jt98\" (UID: \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\") " pod="calico-system/calico-typha-5d5564856f-2jt98" May 13 23:45:54.192697 kubelet[2780]: I0513 23:45:54.192324 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d078a62c-ac98-4918-b0ab-53fa4ca1a484-tigera-ca-bundle\") pod \"calico-typha-5d5564856f-2jt98\" (UID: \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\") " pod="calico-system/calico-typha-5d5564856f-2jt98" May 13 23:45:54.241240 systemd[1]: Created slice kubepods-besteffort-pod61ba2088_39e0_4223_b95b_586fe99f906e.slice - libcontainer container kubepods-besteffort-pod61ba2088_39e0_4223_b95b_586fe99f906e.slice. 
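
The tigera-operator startup entry above separates two durations: podStartE2EDuration (6.544446751s, pod creation to observed running) and podStartSLOduration (4.247921633, the same interval minus the image-pull window between firstStartedPulling and lastFinishedPulling). A check of that relation using the wall-clock stamps from the entry; the result agrees with the logged SLO value to within tens of nanoseconds (the tracker works off the monotonic m=+ offsets, which this sketch does not):

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Stamps copied from the pod_startup_latency_tracker entry above.
	created := parse("2025-05-13 23:45:46 +0000 UTC")
	pullStart := parse("2025-05-13 23:45:47.519150584 +0000 UTC")
	pullEnd := parse("2025-05-13 23:45:49.815675742 +0000 UTC")
	running := parse("2025-05-13 23:45:52.544446751 +0000 UTC")

	e2e := running.Sub(created)
	slo := e2e - pullEnd.Sub(pullStart)
	fmt.Printf("e2e=%v slo=%v\n", e2e, slo) // e2e=6.544446751s slo≈4.247921593s
}
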
May 13 23:45:54.294661 kubelet[2780]: I0513 23:45:54.293579 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-policysync\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294661 kubelet[2780]: I0513 23:45:54.293635 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-flexvol-driver-host\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294661 kubelet[2780]: I0513 23:45:54.293657 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/61ba2088-39e0-4223-b95b-586fe99f906e-node-certs\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294661 kubelet[2780]: I0513 23:45:54.293673 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-log-dir\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294661 kubelet[2780]: I0513 23:45:54.293688 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-net-dir\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294886 kubelet[2780]: I0513 23:45:54.293705 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-lib-modules\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294886 kubelet[2780]: I0513 23:45:54.293721 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-var-run-calico\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294886 kubelet[2780]: I0513 23:45:54.293736 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-bin-dir\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294886 kubelet[2780]: I0513 23:45:54.293751 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clm64\" (UniqueName: \"kubernetes.io/projected/61ba2088-39e0-4223-b95b-586fe99f906e-kube-api-access-clm64\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294886 kubelet[2780]: I0513 23:45:54.293769 2780 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-xtables-lock\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294991 kubelet[2780]: I0513 23:45:54.293784 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ba2088-39e0-4223-b95b-586fe99f906e-tigera-ca-bundle\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.294991 kubelet[2780]: I0513 23:45:54.293810 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-var-lib-calico\") pod \"calico-node-z6rzk\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " pod="calico-system/calico-node-z6rzk" May 13 23:45:54.386623 kubelet[2780]: E0513 23:45:54.386488 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kxjnf" podUID="d52acfd2-8155-4dfe-acd7-8d0bba5d8c44" May 13 23:45:54.397424 kubelet[2780]: E0513 23:45:54.397248 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.397424 kubelet[2780]: W0513 23:45:54.397272 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.397424 kubelet[2780]: E0513 23:45:54.397298 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:54.398346 kubelet[2780]: E0513 23:45:54.398138 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.398346 kubelet[2780]: W0513 23:45:54.398155 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.398346 kubelet[2780]: E0513 23:45:54.398171 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:54.399266 kubelet[2780]: E0513 23:45:54.398400 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.399266 kubelet[2780]: W0513 23:45:54.398610 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.399266 kubelet[2780]: E0513 23:45:54.398624 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:54.399266 kubelet[2780]: E0513 23:45:54.398925 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.399266 kubelet[2780]: W0513 23:45:54.398936 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.399266 kubelet[2780]: E0513 23:45:54.398946 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:54.402411 kubelet[2780]: E0513 23:45:54.402362 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.402411 kubelet[2780]: W0513 23:45:54.402384 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.402411 kubelet[2780]: E0513 23:45:54.402401 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:54.411504 containerd[1504]: time="2025-05-13T23:45:54.411450802Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d5564856f-2jt98,Uid:d078a62c-ac98-4918-b0ab-53fa4ca1a484,Namespace:calico-system,Attempt:0,}" May 13 23:45:54.426872 kubelet[2780]: E0513 23:45:54.426782 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.426872 kubelet[2780]: W0513 23:45:54.426805 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.426872 kubelet[2780]: E0513 23:45:54.426824 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:54.447083 containerd[1504]: time="2025-05-13T23:45:54.447015593Z" level=info msg="connecting to shim 496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653" address="unix:///run/containerd/s/8abba5706ed0185cec35e4dd6433e57ee3432d39117d528289d47005d2493141" namespace=k8s.io protocol=ttrpc version=3 May 13 23:45:54.475024 kubelet[2780]: E0513 23:45:54.474986 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.475154 kubelet[2780]: W0513 23:45:54.475038 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.475154 kubelet[2780]: E0513 23:45:54.475061 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:54.476368 kubelet[2780]: E0513 23:45:54.475221 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.476368 kubelet[2780]: W0513 23:45:54.475237 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.476368 kubelet[2780]: E0513 23:45:54.475278 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:54.477891 kubelet[2780]: E0513 23:45:54.477865 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.477891 kubelet[2780]: W0513 23:45:54.477885 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.478017 kubelet[2780]: E0513 23:45:54.477904 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:54.478800 kubelet[2780]: E0513 23:45:54.478779 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.478800 kubelet[2780]: W0513 23:45:54.478795 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.478933 kubelet[2780]: E0513 23:45:54.478808 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:54.479295 systemd[1]: Started cri-containerd-496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653.scope - libcontainer container 496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653. May 13 23:45:54.479740 kubelet[2780]: E0513 23:45:54.479719 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:54.479740 kubelet[2780]: W0513 23:45:54.479736 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:54.479843 kubelet[2780]: E0513 23:45:54.479748 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 13 23:45:54.481442 kubelet[2780]: E0513 23:45:54.481289 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 23:45:54.481442 kubelet[2780]: W0513 23:45:54.481322 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 23:45:54.481442 kubelet[2780]: E0513 23:45:54.481336 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[this three-line FlexVolume probe failure repeats essentially unchanged through 23:45:58; later occurrences are omitted]
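For context on where the error text comes from: Go's encoding/json returns exactly "unexpected end of JSON input" when asked to decode an empty byte slice, which is what kubelet sees here because the driver binary is missing from $PATH and therefore produces no stdout at all. A minimal sketch of that stdlib behavior (not kubelet's actual code; the struct shape is illustrative):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// driverStatus stands in for the JSON object kubelet expects a FlexVolume
// driver to print; the exact fields do not matter for this demonstration.
type driverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

func main() {
	var st driverStatus
	// An absent driver produces no output, so kubelet unmarshals "".
	err := json.Unmarshal([]byte(""), &st)
	fmt.Println(err) // prints: unexpected end of JSON input
}
```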
May 13 23:45:54.495674 kubelet[2780]: I0513 23:45:54.495479 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpcqs\" (UniqueName: \"kubernetes.io/projected/d52acfd2-8155-4dfe-acd7-8d0bba5d8c44-kube-api-access-jpcqs\") pod \"csi-node-driver-kxjnf\" (UID: \"d52acfd2-8155-4dfe-acd7-8d0bba5d8c44\") " pod="calico-system/csi-node-driver-kxjnf"
May 13 23:45:54.496378 kubelet[2780]: I0513 23:45:54.496237 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d52acfd2-8155-4dfe-acd7-8d0bba5d8c44-varrun\") pod \"csi-node-driver-kxjnf\" (UID: \"d52acfd2-8155-4dfe-acd7-8d0bba5d8c44\") " pod="calico-system/csi-node-driver-kxjnf"
May 13 23:45:54.497530 kubelet[2780]: I0513 23:45:54.497486 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d52acfd2-8155-4dfe-acd7-8d0bba5d8c44-socket-dir\") pod \"csi-node-driver-kxjnf\" (UID: \"d52acfd2-8155-4dfe-acd7-8d0bba5d8c44\") " pod="calico-system/csi-node-driver-kxjnf"
May 13 23:45:54.499738 kubelet[2780]: I0513 23:45:54.499510 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d52acfd2-8155-4dfe-acd7-8d0bba5d8c44-registration-dir\") pod \"csi-node-driver-kxjnf\" (UID: \"d52acfd2-8155-4dfe-acd7-8d0bba5d8c44\") " pod="calico-system/csi-node-driver-kxjnf"
May 13 23:45:54.503997 kubelet[2780]: I0513 23:45:54.503833 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d52acfd2-8155-4dfe-acd7-8d0bba5d8c44-kubelet-dir\") pod \"csi-node-driver-kxjnf\" (UID: \"d52acfd2-8155-4dfe-acd7-8d0bba5d8c44\") " pod="calico-system/csi-node-driver-kxjnf"
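The five VerifyControllerAttachedVolume entries above map to one projected ServiceAccount token plus four hostPath volumes on the CSI node driver pod. A sketch of what those stanzas look like using the k8s.io/api/core/v1 types; the host paths are typical CSI node plugin defaults and are assumed here, since the log names only the volumes, not their paths:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// hostPath builds a hostPath-backed volume stanza.
func hostPath(name, path string) corev1.Volume {
	return corev1.Volume{
		Name: name,
		VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: path},
		},
	}
}

func main() {
	vols := []corev1.Volume{
		// Paths below are assumptions, chosen as common CSI-driver defaults.
		hostPath("varrun", "/var/run"),
		hostPath("socket-dir", "/var/lib/kubelet/plugins/csi.tigera.io"),
		hostPath("registration-dir", "/var/lib/kubelet/plugins_registry"),
		hostPath("kubelet-dir", "/var/lib/kubelet"),
		// "kube-api-access-jpcqs" is the kubelet-injected projected
		// ServiceAccount token volume; it is not declared in the pod spec.
	}
	for _, v := range vols {
		fmt.Println(v.Name, "->", v.VolumeSource.HostPath.Path)
	}
}
```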
May 13 23:45:54.546519 containerd[1504]: time="2025-05-13T23:45:54.546470508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z6rzk,Uid:61ba2088-39e0-4223-b95b-586fe99f906e,Namespace:calico-system,Attempt:0,}"
May 13 23:45:54.557932 containerd[1504]: time="2025-05-13T23:45:54.557880075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d5564856f-2jt98,Uid:d078a62c-ac98-4918-b0ab-53fa4ca1a484,Namespace:calico-system,Attempt:0,} returns sandbox id \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\""
May 13 23:45:54.560799 containerd[1504]: time="2025-05-13T23:45:54.560732217Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\""
May 13 23:45:54.576786 containerd[1504]: time="2025-05-13T23:45:54.576436736Z" level=info msg="connecting to shim 1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca" address="unix:///run/containerd/s/85560a59758806642fe175fe595fce0f3e8145e74e0483d9672fc82a15766694" namespace=k8s.io protocol=ttrpc version=3
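The RunPodSandbox and PullImage entries are CRI requests that kubelet issues to containerd. The same pull can be reproduced directly with containerd's Go client in the "k8s.io" namespace that the CRI plugin uses; a sketch, assuming the default socket path:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images live in the "k8s.io" containerd namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/typha:v3.29.3",
		containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled:", img.Name())
}
```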
May 13 23:45:54.610843 systemd[1]: Started cri-containerd-1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca.scope - libcontainer container 1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca.
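For comparison, this is roughly what the missing nodeagent~uds binary would have to do to satisfy the probe. The sketch follows the publicly documented FlexVolume call convention (kubelet execs the driver with a subcommand and parses one JSON object from stdout); the struct shape follows the public FlexVolume examples, not this cluster's actual driver:

```go
// A minimal FlexVolume driver entrypoint. Installing a binary like this at
// /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds
// would stop the probe errors above.
package main

import (
	"encoding/json"
	"os"
)

type capabilities struct {
	Attach bool `json:"attach"`
}

type driverStatus struct {
	Status       string        `json:"status"`
	Message      string        `json:"message,omitempty"`
	Capabilities *capabilities `json:"capabilities,omitempty"`
}

func main() {
	out := json.NewEncoder(os.Stdout)
	if len(os.Args) > 1 && os.Args[1] == "init" {
		// init must report success and whether the driver implements attach.
		out.Encode(driverStatus{Status: "Success", Capabilities: &capabilities{Attach: false}})
		return
	}
	// Unimplemented calls should say so rather than emit nothing.
	out.Encode(driverStatus{Status: "Not supported"})
	os.Exit(1)
}
```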
May 13 23:45:54.666954 containerd[1504]: time="2025-05-13T23:45:54.666896743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z6rzk,Uid:61ba2088-39e0-4223-b95b-586fe99f906e,Namespace:calico-system,Attempt:0,} returns sandbox id \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\""
May 13 23:45:56.298977 kubelet[2780]: E0513 23:45:56.297371 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kxjnf" podUID="d52acfd2-8155-4dfe-acd7-8d0bba5d8c44"
May 13 23:45:56.932117 containerd[1504]: time="2025-05-13T23:45:56.931858391Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:56.933575 containerd[1504]: time="2025-05-13T23:45:56.933472522Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=28370571"
May 13 23:45:56.934271 containerd[1504]: time="2025-05-13T23:45:56.934244247Z" level=info msg="ImageCreate event name:\"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:56.938690 containerd[1504]: time="2025-05-13T23:45:56.937690910Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:45:56.938690 containerd[1504]: time="2025-05-13T23:45:56.938386395Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"29739745\" in 2.377609698s"
May 13 23:45:56.938690 containerd[1504]: time="2025-05-13T23:45:56.938413195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\""
May 13 23:45:56.940396 containerd[1504]: time="2025-05-13T23:45:56.940370529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\""
May 13 23:45:56.959021 containerd[1504]: time="2025-05-13T23:45:56.958610293Z" level=info msg="CreateContainer within sandbox \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
May 13 23:45:56.976362 containerd[1504]: time="2025-05-13T23:45:56.976285853Z" level=info msg="Container b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41: CDI devices from CRI Config.CDIDevices: []"
May 13 23:45:56.985575 containerd[1504]: time="2025-05-13T23:45:56.985484516Z" level=info msg="CreateContainer within sandbox \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\""
May 13 23:45:56.987723 containerd[1504]: time="2025-05-13T23:45:56.986286762Z" level=info msg="StartContainer for \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\""
May 13 23:45:56.987723 containerd[1504]: time="2025-05-13T23:45:56.987505610Z" level=info msg="connecting to shim b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41" address="unix:///run/containerd/s/8abba5706ed0185cec35e4dd6433e57ee3432d39117d528289d47005d2493141" protocol=ttrpc version=3
May 13 23:45:57.010846 systemd[1]: Started cri-containerd-b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41.scope - libcontainer container b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41.
May 13 23:45:57.083257 containerd[1504]: time="2025-05-13T23:45:57.083213592Z" level=info msg="StartContainer for \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" returns successfully"
May 13 23:45:57.442971 kubelet[2780]: I0513 23:45:57.442124 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5d5564856f-2jt98" podStartSLOduration=1.0628459160000001 podStartE2EDuration="3.442100266s" podCreationTimestamp="2025-05-13 23:45:54 +0000 UTC" firstStartedPulling="2025-05-13 23:45:54.560445214 +0000 UTC m=+13.378710894" lastFinishedPulling="2025-05-13 23:45:56.939699564 +0000 UTC m=+15.757965244" observedRunningTime="2025-05-13 23:45:57.438416322 +0000 UTC m=+16.256682042" watchObservedRunningTime="2025-05-13 23:45:57.442100266 +0000 UTC m=+16.260365946"
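The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check of the arithmetic with the timestamps from the log:

```go
package main

import (
	"fmt"
	"time"
)

// mustParse reads timestamps in the format kubelet logs them.
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-05-13 23:45:54 +0000 UTC")
	firstPull := mustParse("2025-05-13 23:45:54.560445214 +0000 UTC")
	lastPull := mustParse("2025-05-13 23:45:56.939699564 +0000 UTC")
	running := mustParse("2025-05-13 23:45:57.442100266 +0000 UTC")

	e2e := running.Sub(created)          // 3.442100266s, matching podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 1.062845916s, matching podStartSLOduration
	fmt.Println(e2e, slo)
}
```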
Error: unexpected end of JSON input" May 13 23:45:58.298745 kubelet[2780]: E0513 23:45:58.296864 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kxjnf" podUID="d52acfd2-8155-4dfe-acd7-8d0bba5d8c44" May 13 23:45:58.420304 kubelet[2780]: I0513 23:45:58.420269 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:45:58.426783 kubelet[2780]: E0513 23:45:58.426755 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.427170 kubelet[2780]: W0513 23:45:58.426964 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.427170 kubelet[2780]: E0513 23:45:58.426993 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.427530 kubelet[2780]: E0513 23:45:58.427416 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.427530 kubelet[2780]: W0513 23:45:58.427432 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.427530 kubelet[2780]: E0513 23:45:58.427446 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.427819 kubelet[2780]: E0513 23:45:58.427804 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.427981 kubelet[2780]: W0513 23:45:58.427872 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.427981 kubelet[2780]: E0513 23:45:58.427898 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.428441 kubelet[2780]: E0513 23:45:58.428254 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.428441 kubelet[2780]: W0513 23:45:58.428268 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.428441 kubelet[2780]: E0513 23:45:58.428280 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:58.428660 kubelet[2780]: E0513 23:45:58.428646 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.428733 kubelet[2780]: W0513 23:45:58.428721 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.428789 kubelet[2780]: E0513 23:45:58.428778 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.429028 kubelet[2780]: E0513 23:45:58.429016 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.429216 kubelet[2780]: W0513 23:45:58.429143 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.429401 kubelet[2780]: E0513 23:45:58.429280 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.429584 kubelet[2780]: E0513 23:45:58.429571 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.429745 kubelet[2780]: W0513 23:45:58.429649 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.429745 kubelet[2780]: E0513 23:45:58.429666 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.430030 kubelet[2780]: E0513 23:45:58.429927 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.430030 kubelet[2780]: W0513 23:45:58.429938 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.430030 kubelet[2780]: E0513 23:45:58.429947 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.430407 kubelet[2780]: E0513 23:45:58.430380 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.430475 kubelet[2780]: W0513 23:45:58.430463 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.430524 kubelet[2780]: E0513 23:45:58.430513 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:58.430845 kubelet[2780]: E0513 23:45:58.430764 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.430845 kubelet[2780]: W0513 23:45:58.430775 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.430845 kubelet[2780]: E0513 23:45:58.430785 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.431057 kubelet[2780]: E0513 23:45:58.431046 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.431142 kubelet[2780]: W0513 23:45:58.431130 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.431269 kubelet[2780]: E0513 23:45:58.431183 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.431482 kubelet[2780]: E0513 23:45:58.431470 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.431685 kubelet[2780]: W0513 23:45:58.431595 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.431685 kubelet[2780]: E0513 23:45:58.431617 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.432019 kubelet[2780]: E0513 23:45:58.431927 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.432019 kubelet[2780]: W0513 23:45:58.431939 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.432019 kubelet[2780]: E0513 23:45:58.431948 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.432398 kubelet[2780]: E0513 23:45:58.432279 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.432398 kubelet[2780]: W0513 23:45:58.432292 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.432398 kubelet[2780]: E0513 23:45:58.432306 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:58.432633 kubelet[2780]: E0513 23:45:58.432619 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.432708 kubelet[2780]: W0513 23:45:58.432688 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.432817 kubelet[2780]: E0513 23:45:58.432747 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.449800 kubelet[2780]: E0513 23:45:58.449672 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.449800 kubelet[2780]: W0513 23:45:58.449793 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.451394 kubelet[2780]: E0513 23:45:58.449819 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.452514 kubelet[2780]: E0513 23:45:58.451929 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.452514 kubelet[2780]: W0513 23:45:58.451950 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.452514 kubelet[2780]: E0513 23:45:58.451976 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.453529 kubelet[2780]: E0513 23:45:58.453393 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.453529 kubelet[2780]: W0513 23:45:58.453457 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.453815 kubelet[2780]: E0513 23:45:58.453620 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.454302 kubelet[2780]: E0513 23:45:58.454200 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.454302 kubelet[2780]: W0513 23:45:58.454220 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.454302 kubelet[2780]: E0513 23:45:58.454252 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:58.455177 kubelet[2780]: E0513 23:45:58.454633 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.455177 kubelet[2780]: W0513 23:45:58.454652 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.455177 kubelet[2780]: E0513 23:45:58.454947 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.455177 kubelet[2780]: E0513 23:45:58.455017 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.455177 kubelet[2780]: W0513 23:45:58.455028 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.455177 kubelet[2780]: E0513 23:45:58.455050 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.455428 kubelet[2780]: E0513 23:45:58.455370 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.455428 kubelet[2780]: W0513 23:45:58.455388 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.455428 kubelet[2780]: E0513 23:45:58.455407 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.455802 kubelet[2780]: E0513 23:45:58.455782 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.455802 kubelet[2780]: W0513 23:45:58.455800 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.456031 kubelet[2780]: E0513 23:45:58.455816 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.456453 kubelet[2780]: E0513 23:45:58.456277 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.456453 kubelet[2780]: W0513 23:45:58.456295 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.456453 kubelet[2780]: E0513 23:45:58.456316 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:58.457442 kubelet[2780]: E0513 23:45:58.457243 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.457442 kubelet[2780]: W0513 23:45:58.457262 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.457538 kubelet[2780]: E0513 23:45:58.457473 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.457888 kubelet[2780]: E0513 23:45:58.457611 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.457888 kubelet[2780]: W0513 23:45:58.457704 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.458017 kubelet[2780]: E0513 23:45:58.457963 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.458526 kubelet[2780]: E0513 23:45:58.458263 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.458526 kubelet[2780]: W0513 23:45:58.458380 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.458526 kubelet[2780]: E0513 23:45:58.458483 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.459034 kubelet[2780]: E0513 23:45:58.458931 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.459034 kubelet[2780]: W0513 23:45:58.458947 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.459034 kubelet[2780]: E0513 23:45:58.458968 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.459797 kubelet[2780]: E0513 23:45:58.459638 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.459797 kubelet[2780]: W0513 23:45:58.459658 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.459797 kubelet[2780]: E0513 23:45:58.459673 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:58.459906 kubelet[2780]: E0513 23:45:58.459845 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.459906 kubelet[2780]: W0513 23:45:58.459852 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.459906 kubelet[2780]: E0513 23:45:58.459860 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.460737 kubelet[2780]: E0513 23:45:58.460390 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.460737 kubelet[2780]: W0513 23:45:58.460408 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.460737 kubelet[2780]: E0513 23:45:58.460432 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.461028 kubelet[2780]: E0513 23:45:58.461006 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.461444 kubelet[2780]: W0513 23:45:58.461213 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.461444 kubelet[2780]: E0513 23:45:58.461240 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:45:58.462437 kubelet[2780]: E0513 23:45:58.462291 2780 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:45:58.462859 kubelet[2780]: W0513 23:45:58.462828 2780 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:45:58.462903 kubelet[2780]: E0513 23:45:58.462868 2780 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:45:58.519433 containerd[1504]: time="2025-05-13T23:45:58.519372945Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:45:58.522041 containerd[1504]: time="2025-05-13T23:45:58.521941841Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5122903" May 13 23:45:58.526226 containerd[1504]: time="2025-05-13T23:45:58.526187866Z" level=info msg="ImageCreate event name:\"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:45:58.528548 containerd[1504]: time="2025-05-13T23:45:58.528474000Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6492045\" in 1.58793647s" May 13 23:45:58.528548 containerd[1504]: time="2025-05-13T23:45:58.528519681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\"" May 13 23:45:58.529284 containerd[1504]: time="2025-05-13T23:45:58.528934243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:45:58.534124 containerd[1504]: time="2025-05-13T23:45:58.533183029Z" level=info msg="CreateContainer within sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 23:45:58.545667 containerd[1504]: time="2025-05-13T23:45:58.545400863Z" level=info msg="Container a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2: CDI devices from CRI Config.CDIDevices: []" May 13 23:45:58.550543 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2071349916.mount: Deactivated successfully. May 13 23:45:58.564836 containerd[1504]: time="2025-05-13T23:45:58.564778261Z" level=info msg="CreateContainer within sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\"" May 13 23:45:58.566092 containerd[1504]: time="2025-05-13T23:45:58.565842948Z" level=info msg="StartContainer for \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\"" May 13 23:45:58.569040 containerd[1504]: time="2025-05-13T23:45:58.568808366Z" level=info msg="connecting to shim a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2" address="unix:///run/containerd/s/85560a59758806642fe175fe595fce0f3e8145e74e0483d9672fc82a15766694" protocol=ttrpc version=3 May 13 23:45:58.596267 systemd[1]: Started cri-containerd-a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2.scope - libcontainer container a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2. 
May 13 23:45:58.665432 containerd[1504]: time="2025-05-13T23:45:58.665386914Z" level=info msg="StartContainer for \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\" returns successfully" May 13 23:45:58.680676 systemd[1]: cri-containerd-a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2.scope: Deactivated successfully. May 13 23:45:58.686053 containerd[1504]: time="2025-05-13T23:45:58.686008959Z" level=info msg="received exit event container_id:\"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\" id:\"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\" pid:3448 exited_at:{seconds:1747179958 nanos:685112434}" May 13 23:45:58.686879 containerd[1504]: time="2025-05-13T23:45:58.686469642Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\" id:\"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\" pid:3448 exited_at:{seconds:1747179958 nanos:685112434}" May 13 23:45:58.714627 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2-rootfs.mount: Deactivated successfully. May 13 23:45:59.430147 containerd[1504]: time="2025-05-13T23:45:59.430002259Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 13 23:46:00.296581 kubelet[2780]: E0513 23:46:00.296514 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kxjnf" podUID="d52acfd2-8155-4dfe-acd7-8d0bba5d8c44" May 13 23:46:02.297534 kubelet[2780]: E0513 23:46:02.297458 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kxjnf" podUID="d52acfd2-8155-4dfe-acd7-8d0bba5d8c44" May 13 23:46:03.698090 containerd[1504]: time="2025-05-13T23:46:03.697314313Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:03.700120 containerd[1504]: time="2025-05-13T23:46:03.700032885Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=91256270" May 13 23:46:03.702225 containerd[1504]: time="2025-05-13T23:46:03.702180334Z" level=info msg="ImageCreate event name:\"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:03.708407 containerd[1504]: time="2025-05-13T23:46:03.708357122Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:03.709720 containerd[1504]: time="2025-05-13T23:46:03.709655527Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"92625452\" in 4.279384187s" May 13 23:46:03.709720 containerd[1504]: 
time="2025-05-13T23:46:03.709712888Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\"" May 13 23:46:03.714299 containerd[1504]: time="2025-05-13T23:46:03.714260108Z" level=info msg="CreateContainer within sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 23:46:03.724795 containerd[1504]: time="2025-05-13T23:46:03.724122952Z" level=info msg="Container bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:03.741167 containerd[1504]: time="2025-05-13T23:46:03.740933547Z" level=info msg="CreateContainer within sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\"" May 13 23:46:03.741971 containerd[1504]: time="2025-05-13T23:46:03.741783590Z" level=info msg="StartContainer for \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\"" May 13 23:46:03.744168 containerd[1504]: time="2025-05-13T23:46:03.743959400Z" level=info msg="connecting to shim bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00" address="unix:///run/containerd/s/85560a59758806642fe175fe595fce0f3e8145e74e0483d9672fc82a15766694" protocol=ttrpc version=3 May 13 23:46:03.772302 systemd[1]: Started cri-containerd-bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00.scope - libcontainer container bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00. May 13 23:46:03.827994 containerd[1504]: time="2025-05-13T23:46:03.827944614Z" level=info msg="StartContainer for \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\" returns successfully" May 13 23:46:04.297016 kubelet[2780]: E0513 23:46:04.296954 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kxjnf" podUID="d52acfd2-8155-4dfe-acd7-8d0bba5d8c44" May 13 23:46:04.316211 containerd[1504]: time="2025-05-13T23:46:04.316148373Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 23:46:04.320886 systemd[1]: cri-containerd-bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00.scope: Deactivated successfully. May 13 23:46:04.323198 systemd[1]: cri-containerd-bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00.scope: Consumed 472ms CPU time, 169.3M memory peak, 150.3M written to disk. 
May 13 23:46:04.323714 containerd[1504]: time="2025-05-13T23:46:04.323165002Z" level=info msg="received exit event container_id:\"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\" id:\"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\" pid:3507 exited_at:{seconds:1747179964 nanos:322576480}" May 13 23:46:04.323714 containerd[1504]: time="2025-05-13T23:46:04.323470644Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\" id:\"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\" pid:3507 exited_at:{seconds:1747179964 nanos:322576480}" May 13 23:46:04.357325 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00-rootfs.mount: Deactivated successfully. May 13 23:46:04.373651 kubelet[2780]: I0513 23:46:04.373613 2780 kubelet_node_status.go:502] "Fast updating node status as it just became ready" May 13 23:46:04.438313 systemd[1]: Created slice kubepods-burstable-podf581e8fe_5269_4fdd_84ed_e13bbf0a7b6f.slice - libcontainer container kubepods-burstable-podf581e8fe_5269_4fdd_84ed_e13bbf0a7b6f.slice. May 13 23:46:04.448665 kubelet[2780]: I0513 23:46:04.448587 2780 status_manager.go:890] "Failed to get status for pod" podUID="f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f" pod="kube-system/coredns-668d6bf9bc-zzzfq" err="pods \"coredns-668d6bf9bc-zzzfq\" is forbidden: User \"system:node:ci-4284-0-0-n-732e99817a\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4284-0-0-n-732e99817a' and this object" May 13 23:46:04.449122 kubelet[2780]: W0513 23:46:04.448749 2780 reflector.go:569] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-4284-0-0-n-732e99817a" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4284-0-0-n-732e99817a' and this object May 13 23:46:04.449122 kubelet[2780]: E0513 23:46:04.448799 2780 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ci-4284-0-0-n-732e99817a\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4284-0-0-n-732e99817a' and this object" logger="UnhandledError" May 13 23:46:04.464601 systemd[1]: Created slice kubepods-besteffort-pod647137d8_a1af_489d_9320_1a34f6baa684.slice - libcontainer container kubepods-besteffort-pod647137d8_a1af_489d_9320_1a34f6baa684.slice. May 13 23:46:04.479202 systemd[1]: Created slice kubepods-besteffort-podbb68557b_1129_460d_9d3e_e3f0bf7e8587.slice - libcontainer container kubepods-besteffort-podbb68557b_1129_460d_9d3e_e3f0bf7e8587.slice. 
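The "is forbidden ... no relationship found between node 'ci-4284-0-0-n-732e99817a' and this object" lines are not an RBAC misconfiguration but the node authorizer at work: a kubelet may only read pods, configmaps, and secrets reachable from pods bound to it, and the just-created coredns pod's binding has not yet propagated into the authorizer's graph. A toy version of that check (the graph layout and names here are invented for illustration):

```go
// Toy node-authorizer check: an object is readable by a node only if
// an edge derived from a pod binding connects them. Before the binding
// is observed, the lookup fails with the "no relationship found" error
// quoted in the log.
package main

import "fmt"

// edges: object -> set of nodes allowed to read it (from pod bindings).
var edges = map[string]map[string]bool{
	"kube-system/configmaps/coredns": {}, // binding not yet observed
}

func authorize(node, object string) error {
	if edges[object][node] {
		return nil
	}
	return fmt.Errorf("forbidden: no relationship found between node '%s' and this object", node)
}

func main() {
	fmt.Println(authorize("ci-4284-0-0-n-732e99817a", "kube-system/configmaps/coredns"))
}
```

This window typically closes on its own once the authorizer observes the binding, which matches these errors appearing only right after the pods are created.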
May 13 23:46:04.495842 kubelet[2780]: I0513 23:46:04.495548 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngb9x\" (UniqueName: \"kubernetes.io/projected/ac7eac49-3976-4e76-988d-1b85acd57174-kube-api-access-ngb9x\") pod \"calico-apiserver-9b8b8f55f-js8v9\" (UID: \"ac7eac49-3976-4e76-988d-1b85acd57174\") " pod="calico-apiserver/calico-apiserver-9b8b8f55f-js8v9" May 13 23:46:04.495842 kubelet[2780]: I0513 23:46:04.495600 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb68557b-1129-460d-9d3e-e3f0bf7e8587-tigera-ca-bundle\") pod \"calico-kube-controllers-7dd84bf879-dbp7p\" (UID: \"bb68557b-1129-460d-9d3e-e3f0bf7e8587\") " pod="calico-system/calico-kube-controllers-7dd84bf879-dbp7p" May 13 23:46:04.496245 kubelet[2780]: I0513 23:46:04.496151 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sz8c\" (UniqueName: \"kubernetes.io/projected/bb68557b-1129-460d-9d3e-e3f0bf7e8587-kube-api-access-8sz8c\") pod \"calico-kube-controllers-7dd84bf879-dbp7p\" (UID: \"bb68557b-1129-460d-9d3e-e3f0bf7e8587\") " pod="calico-system/calico-kube-controllers-7dd84bf879-dbp7p" May 13 23:46:04.496910 kubelet[2780]: I0513 23:46:04.496598 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdx55\" (UniqueName: \"kubernetes.io/projected/f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f-kube-api-access-kdx55\") pod \"coredns-668d6bf9bc-zzzfq\" (UID: \"f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f\") " pod="kube-system/coredns-668d6bf9bc-zzzfq" May 13 23:46:04.497489 kubelet[2780]: I0513 23:46:04.497252 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8jvp\" (UniqueName: \"kubernetes.io/projected/93330b1b-7534-4bf8-9e94-6cc9683a3bbb-kube-api-access-g8jvp\") pod \"coredns-668d6bf9bc-bqlxj\" (UID: \"93330b1b-7534-4bf8-9e94-6cc9683a3bbb\") " pod="kube-system/coredns-668d6bf9bc-bqlxj" May 13 23:46:04.499110 kubelet[2780]: I0513 23:46:04.499077 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93330b1b-7534-4bf8-9e94-6cc9683a3bbb-config-volume\") pod \"coredns-668d6bf9bc-bqlxj\" (UID: \"93330b1b-7534-4bf8-9e94-6cc9683a3bbb\") " pod="kube-system/coredns-668d6bf9bc-bqlxj" May 13 23:46:04.500582 kubelet[2780]: I0513 23:46:04.499249 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcsrr\" (UniqueName: \"kubernetes.io/projected/647137d8-a1af-489d-9320-1a34f6baa684-kube-api-access-mcsrr\") pod \"calico-apiserver-85dd65f9fd-fvtcs\" (UID: \"647137d8-a1af-489d-9320-1a34f6baa684\") " pod="calico-apiserver/calico-apiserver-85dd65f9fd-fvtcs" May 13 23:46:04.500582 kubelet[2780]: I0513 23:46:04.499275 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0aba1ede-c9a0-4e1e-accf-2cb417eef657-calico-apiserver-certs\") pod \"calico-apiserver-9b8b8f55f-twx7r\" (UID: \"0aba1ede-c9a0-4e1e-accf-2cb417eef657\") " pod="calico-apiserver/calico-apiserver-9b8b8f55f-twx7r" May 13 23:46:04.500582 kubelet[2780]: I0513 23:46:04.499296 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-84mq7\" (UniqueName: \"kubernetes.io/projected/0aba1ede-c9a0-4e1e-accf-2cb417eef657-kube-api-access-84mq7\") pod \"calico-apiserver-9b8b8f55f-twx7r\" (UID: \"0aba1ede-c9a0-4e1e-accf-2cb417eef657\") " pod="calico-apiserver/calico-apiserver-9b8b8f55f-twx7r" May 13 23:46:04.500582 kubelet[2780]: I0513 23:46:04.499315 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ac7eac49-3976-4e76-988d-1b85acd57174-calico-apiserver-certs\") pod \"calico-apiserver-9b8b8f55f-js8v9\" (UID: \"ac7eac49-3976-4e76-988d-1b85acd57174\") " pod="calico-apiserver/calico-apiserver-9b8b8f55f-js8v9" May 13 23:46:04.500582 kubelet[2780]: I0513 23:46:04.499335 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/647137d8-a1af-489d-9320-1a34f6baa684-calico-apiserver-certs\") pod \"calico-apiserver-85dd65f9fd-fvtcs\" (UID: \"647137d8-a1af-489d-9320-1a34f6baa684\") " pod="calico-apiserver/calico-apiserver-85dd65f9fd-fvtcs" May 13 23:46:04.500760 kubelet[2780]: I0513 23:46:04.499369 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f-config-volume\") pod \"coredns-668d6bf9bc-zzzfq\" (UID: \"f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f\") " pod="kube-system/coredns-668d6bf9bc-zzzfq" May 13 23:46:04.503423 systemd[1]: Created slice kubepods-besteffort-pod0aba1ede_c9a0_4e1e_accf_2cb417eef657.slice - libcontainer container kubepods-besteffort-pod0aba1ede_c9a0_4e1e_accf_2cb417eef657.slice. May 13 23:46:04.513964 systemd[1]: Created slice kubepods-burstable-pod93330b1b_7534_4bf8_9e94_6cc9683a3bbb.slice - libcontainer container kubepods-burstable-pod93330b1b_7534_4bf8_9e94_6cc9683a3bbb.slice. May 13 23:46:04.522143 systemd[1]: Created slice kubepods-besteffort-podac7eac49_3976_4e76_988d_1b85acd57174.slice - libcontainer container kubepods-besteffort-podac7eac49_3976_4e76_988d_1b85acd57174.slice. 
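The RunPodSandbox attempts that follow all fail the same way: the Calico CNI plugin stats /var/lib/calico/nodename, a file the calico/node container writes once it is running, and the error text itself says as much ("check that the calico/node container is running and has mounted /var/lib/calico/"). A small sketch of that gate as a debugging aid; the path is from the log, but the polling loop is illustrative, not Calico's code:

```go
// Poll for /var/lib/calico/nodename, the readiness marker whose absence
// causes every sandbox setup failure below. Once calico/node has written
// it, CNI ADD calls can resolve the node and sandbox creation succeeds.
package main

import (
	"fmt"
	"os"
	"time"
)

func main() {
	const nodenameFile = "/var/lib/calico/nodename"
	for {
		if b, err := os.ReadFile(nodenameFile); err == nil {
			fmt.Printf("calico/node is up, nodename=%s\n", b)
			return
		} else if !os.IsNotExist(err) {
			fmt.Println("unexpected error:", err)
			return
		}
		fmt.Println("waiting: stat", nodenameFile, "-> no such file or directory")
		time.Sleep(2 * time.Second)
	}
}
```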
May 13 23:46:04.772945 containerd[1504]: time="2025-05-13T23:46:04.772827030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85dd65f9fd-fvtcs,Uid:647137d8-a1af-489d-9320-1a34f6baa684,Namespace:calico-apiserver,Attempt:0,}" May 13 23:46:04.798581 containerd[1504]: time="2025-05-13T23:46:04.798511977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7dd84bf879-dbp7p,Uid:bb68557b-1129-460d-9d3e-e3f0bf7e8587,Namespace:calico-system,Attempt:0,}" May 13 23:46:04.812243 containerd[1504]: time="2025-05-13T23:46:04.811922953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9b8b8f55f-twx7r,Uid:0aba1ede-c9a0-4e1e-accf-2cb417eef657,Namespace:calico-apiserver,Attempt:0,}" May 13 23:46:04.828919 containerd[1504]: time="2025-05-13T23:46:04.828495781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9b8b8f55f-js8v9,Uid:ac7eac49-3976-4e76-988d-1b85acd57174,Namespace:calico-apiserver,Attempt:0,}" May 13 23:46:04.937629 containerd[1504]: time="2025-05-13T23:46:04.937576834Z" level=error msg="Failed to destroy network for sandbox \"cb240e712cab0d8332248de863f98f86502a9474ddf3698497660f3c56f9edf5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.939440 containerd[1504]: time="2025-05-13T23:46:04.939386002Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7dd84bf879-dbp7p,Uid:bb68557b-1129-460d-9d3e-e3f0bf7e8587,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb240e712cab0d8332248de863f98f86502a9474ddf3698497660f3c56f9edf5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.940122 kubelet[2780]: E0513 23:46:04.939813 2780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb240e712cab0d8332248de863f98f86502a9474ddf3698497660f3c56f9edf5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.940122 kubelet[2780]: E0513 23:46:04.939894 2780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb240e712cab0d8332248de863f98f86502a9474ddf3698497660f3c56f9edf5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7dd84bf879-dbp7p" May 13 23:46:04.940122 kubelet[2780]: E0513 23:46:04.939913 2780 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb240e712cab0d8332248de863f98f86502a9474ddf3698497660f3c56f9edf5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7dd84bf879-dbp7p" May 13 23:46:04.940271 kubelet[2780]: E0513 23:46:04.939953 2780 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7dd84bf879-dbp7p_calico-system(bb68557b-1129-460d-9d3e-e3f0bf7e8587)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7dd84bf879-dbp7p_calico-system(bb68557b-1129-460d-9d3e-e3f0bf7e8587)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cb240e712cab0d8332248de863f98f86502a9474ddf3698497660f3c56f9edf5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7dd84bf879-dbp7p" podUID="bb68557b-1129-460d-9d3e-e3f0bf7e8587" May 13 23:46:04.955119 containerd[1504]: time="2025-05-13T23:46:04.954788226Z" level=error msg="Failed to destroy network for sandbox \"3197a4c1838103cd2366a996461094988d1ebdff8f67e919f05b02f37b76da85\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.955429 containerd[1504]: time="2025-05-13T23:46:04.955390748Z" level=error msg="Failed to destroy network for sandbox \"31bd8de04a3a133a4548d5d295f1d074bc90da7cdda95730a41bb82b346eb1e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.956317 containerd[1504]: time="2025-05-13T23:46:04.956212592Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85dd65f9fd-fvtcs,Uid:647137d8-a1af-489d-9320-1a34f6baa684,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3197a4c1838103cd2366a996461094988d1ebdff8f67e919f05b02f37b76da85\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.956504 kubelet[2780]: E0513 23:46:04.956454 2780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3197a4c1838103cd2366a996461094988d1ebdff8f67e919f05b02f37b76da85\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.956551 kubelet[2780]: E0513 23:46:04.956529 2780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3197a4c1838103cd2366a996461094988d1ebdff8f67e919f05b02f37b76da85\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85dd65f9fd-fvtcs" May 13 23:46:04.956577 kubelet[2780]: E0513 23:46:04.956552 2780 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3197a4c1838103cd2366a996461094988d1ebdff8f67e919f05b02f37b76da85\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-85dd65f9fd-fvtcs" May 13 23:46:04.956650 kubelet[2780]: E0513 23:46:04.956618 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85dd65f9fd-fvtcs_calico-apiserver(647137d8-a1af-489d-9320-1a34f6baa684)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85dd65f9fd-fvtcs_calico-apiserver(647137d8-a1af-489d-9320-1a34f6baa684)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3197a4c1838103cd2366a996461094988d1ebdff8f67e919f05b02f37b76da85\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85dd65f9fd-fvtcs" podUID="647137d8-a1af-489d-9320-1a34f6baa684" May 13 23:46:04.958246 containerd[1504]: time="2025-05-13T23:46:04.958056960Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9b8b8f55f-js8v9,Uid:ac7eac49-3976-4e76-988d-1b85acd57174,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"31bd8de04a3a133a4548d5d295f1d074bc90da7cdda95730a41bb82b346eb1e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.959334 kubelet[2780]: E0513 23:46:04.958941 2780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31bd8de04a3a133a4548d5d295f1d074bc90da7cdda95730a41bb82b346eb1e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.959334 kubelet[2780]: E0513 23:46:04.959001 2780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31bd8de04a3a133a4548d5d295f1d074bc90da7cdda95730a41bb82b346eb1e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9b8b8f55f-js8v9" May 13 23:46:04.959334 kubelet[2780]: E0513 23:46:04.959019 2780 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31bd8de04a3a133a4548d5d295f1d074bc90da7cdda95730a41bb82b346eb1e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9b8b8f55f-js8v9" May 13 23:46:04.959815 kubelet[2780]: E0513 23:46:04.959676 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9b8b8f55f-js8v9_calico-apiserver(ac7eac49-3976-4e76-988d-1b85acd57174)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9b8b8f55f-js8v9_calico-apiserver(ac7eac49-3976-4e76-988d-1b85acd57174)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"31bd8de04a3a133a4548d5d295f1d074bc90da7cdda95730a41bb82b346eb1e8\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9b8b8f55f-js8v9" podUID="ac7eac49-3976-4e76-988d-1b85acd57174" May 13 23:46:04.960495 containerd[1504]: time="2025-05-13T23:46:04.960152528Z" level=error msg="Failed to destroy network for sandbox \"9fa9977596c5c02ef7063ba87b623b02f782a15cc84d1962aea5b4e5cb52ef1d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.961916 containerd[1504]: time="2025-05-13T23:46:04.961816375Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9b8b8f55f-twx7r,Uid:0aba1ede-c9a0-4e1e-accf-2cb417eef657,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fa9977596c5c02ef7063ba87b623b02f782a15cc84d1962aea5b4e5cb52ef1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.962151 kubelet[2780]: E0513 23:46:04.962121 2780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fa9977596c5c02ef7063ba87b623b02f782a15cc84d1962aea5b4e5cb52ef1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:04.962269 kubelet[2780]: E0513 23:46:04.962174 2780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fa9977596c5c02ef7063ba87b623b02f782a15cc84d1962aea5b4e5cb52ef1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9b8b8f55f-twx7r" May 13 23:46:04.962269 kubelet[2780]: E0513 23:46:04.962193 2780 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fa9977596c5c02ef7063ba87b623b02f782a15cc84d1962aea5b4e5cb52ef1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9b8b8f55f-twx7r" May 13 23:46:04.962329 kubelet[2780]: E0513 23:46:04.962281 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9b8b8f55f-twx7r_calico-apiserver(0aba1ede-c9a0-4e1e-accf-2cb417eef657)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9b8b8f55f-twx7r_calico-apiserver(0aba1ede-c9a0-4e1e-accf-2cb417eef657)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9fa9977596c5c02ef7063ba87b623b02f782a15cc84d1962aea5b4e5cb52ef1d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9b8b8f55f-twx7r" podUID="0aba1ede-c9a0-4e1e-accf-2cb417eef657" May 13 23:46:05.461218 containerd[1504]: 
time="2025-05-13T23:46:05.461048676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 23:46:05.646868 containerd[1504]: time="2025-05-13T23:46:05.646798634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zzzfq,Uid:f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f,Namespace:kube-system,Attempt:0,}" May 13 23:46:05.709150 containerd[1504]: time="2025-05-13T23:46:05.709009235Z" level=error msg="Failed to destroy network for sandbox \"879fcca6504bd1a5b97c8b179601d9cc0309afdbcd88884b346c5b08d0a1e042\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:05.714875 containerd[1504]: time="2025-05-13T23:46:05.713874694Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zzzfq,Uid:f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"879fcca6504bd1a5b97c8b179601d9cc0309afdbcd88884b346c5b08d0a1e042\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:05.715017 kubelet[2780]: E0513 23:46:05.714188 2780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"879fcca6504bd1a5b97c8b179601d9cc0309afdbcd88884b346c5b08d0a1e042\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:05.715017 kubelet[2780]: E0513 23:46:05.714250 2780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"879fcca6504bd1a5b97c8b179601d9cc0309afdbcd88884b346c5b08d0a1e042\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-zzzfq" May 13 23:46:05.715017 kubelet[2780]: E0513 23:46:05.714272 2780 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"879fcca6504bd1a5b97c8b179601d9cc0309afdbcd88884b346c5b08d0a1e042\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-zzzfq" May 13 23:46:05.715440 kubelet[2780]: E0513 23:46:05.714315 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-zzzfq_kube-system(f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-zzzfq_kube-system(f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"879fcca6504bd1a5b97c8b179601d9cc0309afdbcd88884b346c5b08d0a1e042\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-zzzfq" podUID="f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f" May 13 
23:46:05.721955 containerd[1504]: time="2025-05-13T23:46:05.721636164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bqlxj,Uid:93330b1b-7534-4bf8-9e94-6cc9683a3bbb,Namespace:kube-system,Attempt:0,}" May 13 23:46:05.726922 systemd[1]: run-netns-cni\x2d46401486\x2d623c\x2d9e3f\x2d727e\x2d06043e89ecd2.mount: Deactivated successfully. May 13 23:46:05.727018 systemd[1]: run-netns-cni\x2d1e30bdb7\x2d34b7\x2d8dcd\x2dcb11\x2de5fb91af7e79.mount: Deactivated successfully. May 13 23:46:05.727093 systemd[1]: run-netns-cni\x2d1ae807b7\x2d4d2d\x2df4a3\x2da192\x2d7ca6665e8c9f.mount: Deactivated successfully. May 13 23:46:05.727146 systemd[1]: run-netns-cni\x2d1d424e3d\x2d39c6\x2d534f\x2dc996\x2d81fa276043b3.mount: Deactivated successfully. May 13 23:46:05.778640 containerd[1504]: time="2025-05-13T23:46:05.778567624Z" level=error msg="Failed to destroy network for sandbox \"aae933d3258343b3c5549db8d1d6a6481a977208199c9ccc67aa7b8995984884\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:05.780780 systemd[1]: run-netns-cni\x2d5647518a\x2d477b\x2d635c\x2dc058\x2d3aa5a42c8caa.mount: Deactivated successfully. May 13 23:46:05.782707 containerd[1504]: time="2025-05-13T23:46:05.782542079Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bqlxj,Uid:93330b1b-7534-4bf8-9e94-6cc9683a3bbb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aae933d3258343b3c5549db8d1d6a6481a977208199c9ccc67aa7b8995984884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:05.782938 kubelet[2780]: E0513 23:46:05.782870 2780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aae933d3258343b3c5549db8d1d6a6481a977208199c9ccc67aa7b8995984884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:05.783000 kubelet[2780]: E0513 23:46:05.782943 2780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aae933d3258343b3c5549db8d1d6a6481a977208199c9ccc67aa7b8995984884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bqlxj" May 13 23:46:05.783000 kubelet[2780]: E0513 23:46:05.782963 2780 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aae933d3258343b3c5549db8d1d6a6481a977208199c9ccc67aa7b8995984884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bqlxj" May 13 23:46:05.783048 kubelet[2780]: E0513 23:46:05.782999 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-bqlxj_kube-system(93330b1b-7534-4bf8-9e94-6cc9683a3bbb)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-bqlxj_kube-system(93330b1b-7534-4bf8-9e94-6cc9683a3bbb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aae933d3258343b3c5549db8d1d6a6481a977208199c9ccc67aa7b8995984884\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-bqlxj" podUID="93330b1b-7534-4bf8-9e94-6cc9683a3bbb" May 13 23:46:06.308453 systemd[1]: Created slice kubepods-besteffort-podd52acfd2_8155_4dfe_acd7_8d0bba5d8c44.slice - libcontainer container kubepods-besteffort-podd52acfd2_8155_4dfe_acd7_8d0bba5d8c44.slice. May 13 23:46:06.312734 containerd[1504]: time="2025-05-13T23:46:06.312673522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kxjnf,Uid:d52acfd2-8155-4dfe-acd7-8d0bba5d8c44,Namespace:calico-system,Attempt:0,}" May 13 23:46:06.377981 containerd[1504]: time="2025-05-13T23:46:06.377762355Z" level=error msg="Failed to destroy network for sandbox \"7fb172ad3848e5630ffeafeede645d17d1a1fc89f526657ed6f5ad8ce0fe8f1c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:06.380256 systemd[1]: run-netns-cni\x2d75d72320\x2d4caf\x2deac3\x2d2c90\x2d824061e5c2e4.mount: Deactivated successfully. May 13 23:46:06.382180 containerd[1504]: time="2025-05-13T23:46:06.381732769Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kxjnf,Uid:d52acfd2-8155-4dfe-acd7-8d0bba5d8c44,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fb172ad3848e5630ffeafeede645d17d1a1fc89f526657ed6f5ad8ce0fe8f1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:06.383661 kubelet[2780]: E0513 23:46:06.383613 2780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fb172ad3848e5630ffeafeede645d17d1a1fc89f526657ed6f5ad8ce0fe8f1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:46:06.383776 kubelet[2780]: E0513 23:46:06.383686 2780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fb172ad3848e5630ffeafeede645d17d1a1fc89f526657ed6f5ad8ce0fe8f1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kxjnf" May 13 23:46:06.383776 kubelet[2780]: E0513 23:46:06.383709 2780 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fb172ad3848e5630ffeafeede645d17d1a1fc89f526657ed6f5ad8ce0fe8f1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kxjnf" May 13 23:46:06.383826 
kubelet[2780]: E0513 23:46:06.383765 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kxjnf_calico-system(d52acfd2-8155-4dfe-acd7-8d0bba5d8c44)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kxjnf_calico-system(d52acfd2-8155-4dfe-acd7-8d0bba5d8c44)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7fb172ad3848e5630ffeafeede645d17d1a1fc89f526657ed6f5ad8ce0fe8f1c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kxjnf" podUID="d52acfd2-8155-4dfe-acd7-8d0bba5d8c44" May 13 23:46:12.026415 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2005435042.mount: Deactivated successfully. May 13 23:46:12.062457 containerd[1504]: time="2025-05-13T23:46:12.062339762Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:12.064083 containerd[1504]: time="2025-05-13T23:46:12.063608445Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=138981893" May 13 23:46:12.065121 containerd[1504]: time="2025-05-13T23:46:12.065060248Z" level=info msg="ImageCreate event name:\"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:12.068304 containerd[1504]: time="2025-05-13T23:46:12.068253734Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:12.069202 containerd[1504]: time="2025-05-13T23:46:12.068927496Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"138981755\" in 6.607703219s" May 13 23:46:12.069202 containerd[1504]: time="2025-05-13T23:46:12.068961616Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\"" May 13 23:46:12.086875 containerd[1504]: time="2025-05-13T23:46:12.086825173Z" level=info msg="CreateContainer within sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 23:46:12.099094 containerd[1504]: time="2025-05-13T23:46:12.098910678Z" level=info msg="Container e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:12.111883 containerd[1504]: time="2025-05-13T23:46:12.111760465Z" level=info msg="CreateContainer within sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\"" May 13 23:46:12.113235 containerd[1504]: time="2025-05-13T23:46:12.112999428Z" level=info msg="StartContainer for \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\"" May 13 23:46:12.122280 
containerd[1504]: time="2025-05-13T23:46:12.121637206Z" level=info msg="connecting to shim e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3" address="unix:///run/containerd/s/85560a59758806642fe175fe595fce0f3e8145e74e0483d9672fc82a15766694" protocol=ttrpc version=3 May 13 23:46:12.143278 systemd[1]: Started cri-containerd-e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3.scope - libcontainer container e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3. May 13 23:46:12.189412 containerd[1504]: time="2025-05-13T23:46:12.189107027Z" level=info msg="StartContainer for \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" returns successfully" May 13 23:46:12.293333 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 13 23:46:12.293579 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 13 23:46:12.518499 kubelet[2780]: I0513 23:46:12.518416 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-z6rzk" podStartSLOduration=1.118265101 podStartE2EDuration="18.518398794s" podCreationTimestamp="2025-05-13 23:45:54 +0000 UTC" firstStartedPulling="2025-05-13 23:45:54.669716205 +0000 UTC m=+13.487981885" lastFinishedPulling="2025-05-13 23:46:12.069849898 +0000 UTC m=+30.888115578" observedRunningTime="2025-05-13 23:46:12.518116913 +0000 UTC m=+31.336382593" watchObservedRunningTime="2025-05-13 23:46:12.518398794 +0000 UTC m=+31.336664474" May 13 23:46:13.490391 kubelet[2780]: I0513 23:46:13.490313 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:46:15.123457 kubelet[2780]: I0513 23:46:15.121645 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:46:15.228623 containerd[1504]: time="2025-05-13T23:46:15.228586037Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" id:\"422e945d00c909e5a8060449614ffcd72f25b879a4eebc427bbbf6a79e539c61\" pid:3948 exit_status:1 exited_at:{seconds:1747179975 nanos:227845076}" May 13 23:46:15.314983 containerd[1504]: time="2025-05-13T23:46:15.314675361Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" id:\"ad9884cd98aa3b196be6256831ac151dcff5802289429146a5eca8673c903234\" pid:3972 exit_status:1 exited_at:{seconds:1747179975 nanos:311567717}" May 13 23:46:16.299164 containerd[1504]: time="2025-05-13T23:46:16.298113034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9b8b8f55f-js8v9,Uid:ac7eac49-3976-4e76-988d-1b85acd57174,Namespace:calico-apiserver,Attempt:0,}" May 13 23:46:16.519208 systemd-networkd[1392]: calib02b243a5f2: Link UP May 13 23:46:16.519554 systemd-networkd[1392]: calib02b243a5f2: Gained carrier May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.325 [INFO][4006] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.365 [INFO][4006] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0 calico-apiserver-9b8b8f55f- calico-apiserver ac7eac49-3976-4e76-988d-1b85acd57174 721 0 2025-05-13 23:45:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9b8b8f55f 
projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a calico-apiserver-9b8b8f55f-js8v9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib02b243a5f2 [] []}} ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-js8v9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.365 [INFO][4006] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-js8v9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.420 [INFO][4017] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.445 [INFO][4017] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000316d60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-732e99817a", "pod":"calico-apiserver-9b8b8f55f-js8v9", "timestamp":"2025-05-13 23:46:16.420034264 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.445 [INFO][4017] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.446 [INFO][4017] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.446 [INFO][4017] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a' May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.453 [INFO][4017] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.463 [INFO][4017] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.473 [INFO][4017] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.478 [INFO][4017] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.482 [INFO][4017] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.482 [INFO][4017] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.486 [INFO][4017] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.493 [INFO][4017] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.501 [INFO][4017] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.65/26] block=192.168.52.64/26 handle="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.501 [INFO][4017] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.65/26] handle="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" host="ci-4284-0-0-n-732e99817a" May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.501 [INFO][4017] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:46:16.554368 containerd[1504]: 2025-05-13 23:46:16.501 [INFO][4017] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.65/26] IPv6=[] ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:46:16.555214 containerd[1504]: 2025-05-13 23:46:16.508 [INFO][4006] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-js8v9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0", GenerateName:"calico-apiserver-9b8b8f55f-", Namespace:"calico-apiserver", SelfLink:"", UID:"ac7eac49-3976-4e76-988d-1b85acd57174", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9b8b8f55f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"calico-apiserver-9b8b8f55f-js8v9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.52.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib02b243a5f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:16.555214 containerd[1504]: 2025-05-13 23:46:16.508 [INFO][4006] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.65/32] ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-js8v9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:46:16.555214 containerd[1504]: 2025-05-13 23:46:16.509 [INFO][4006] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib02b243a5f2 ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-js8v9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:46:16.555214 containerd[1504]: 2025-05-13 23:46:16.521 [INFO][4006] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-js8v9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:46:16.555214 containerd[1504]: 2025-05-13 23:46:16.522 [INFO][4006] cni-plugin/k8s.go 414: Added Mac, 
interface name, and active container ID to endpoint ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-js8v9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0", GenerateName:"calico-apiserver-9b8b8f55f-", Namespace:"calico-apiserver", SelfLink:"", UID:"ac7eac49-3976-4e76-988d-1b85acd57174", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9b8b8f55f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e", Pod:"calico-apiserver-9b8b8f55f-js8v9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.52.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib02b243a5f2", MAC:"a2:7a:a7:22:f3:da", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:16.555214 containerd[1504]: 2025-05-13 23:46:16.550 [INFO][4006] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-js8v9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:46:16.585131 containerd[1504]: time="2025-05-13T23:46:16.585054868Z" level=info msg="connecting to shim 2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" address="unix:///run/containerd/s/769bb359fbe225e7997cfd73f1c2f9447cebb940401e2e0cf504de178eaaa1cc" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:16.608284 systemd[1]: Started cri-containerd-2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e.scope - libcontainer container 2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e. 
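
The cascade of CreatePodSandboxError failures above shares a single root cause: on every ADD or DEL the Calico CNI plugin stats /var/lib/calico/nodename, and that file only exists once the calico/node container is running with /var/lib/calico/ mounted — exactly what the repeated error message says. The failures stop as soon as ghcr.io/flatcar/calico/node:v3.29.3 finishes pulling and calico-node (container e193bbed…) starts; the first CNI ADD after that, for calico-apiserver-9b8b8f55f-js8v9, gets past the stat and only notes that the optional /var/lib/calico/mtu file is absent. Below is a minimal sketch of that readiness gate, assuming only the path and wording quoted in the error message — an illustration, not Calico's actual source:

```go
// Minimal sketch (not Calico's implementation) of the readiness gate implied
// by the repeated error above: the CNI plugin stats /var/lib/calico/nodename,
// which calico/node writes once it is running with /var/lib/calico/ mounted;
// until that file exists, every CNI ADD/DEL for a pod sandbox fails.
package main

import (
	"fmt"
	"os"
	"time"
)

// waitForNodename polls the nodename file until it appears or the timeout
// elapses, mirroring what a caller would have to do while calico-node starts.
func waitForNodename(path string, timeout time.Duration) (string, error) {
	deadline := time.Now().Add(timeout)
	for {
		data, err := os.ReadFile(path)
		if err == nil {
			return string(data), nil // e.g. "ci-4284-0-0-n-732e99817a"
		}
		if !os.IsNotExist(err) {
			return "", err // unexpected error: surface it as-is
		}
		if time.Now().After(deadline) {
			return "", fmt.Errorf("stat %s: no such file or directory: "+
				"check that the calico/node container is running and has mounted /var/lib/calico/", path)
		}
		time.Sleep(500 * time.Millisecond)
	}
}

func main() {
	name, err := waitForNodename("/var/lib/calico/nodename", 30*time.Second)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("calico node name:", name)
}
```

Which is also why no manual intervention appears in the log: the kubelet keeps retrying the failed sandboxes, and the retries succeed on their own once the file exists.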
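
The calico-node startup metrics printed by pod_startup_latency_tracker above are internally consistent, and the arithmetic shows what each figure means: podStartE2EDuration = observedRunningTime − podCreationTimestamp = 23:46:12.518398794 − 23:45:54.000000000 = 18.518398794s, while the image-pull window is lastFinishedPulling − firstStartedPulling = 23:46:12.069849898 − 23:45:54.669716205 = 17.400133693s, so podStartSLOduration = 18.518398794 − 17.400133693 = 1.118265101s. In other words, the SLO figure is evidently the end-to-end startup time with the image-pull window subtracted, which is why it is so much smaller than the E2E duration here.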
May 13 23:46:16.699692 containerd[1504]: time="2025-05-13T23:46:16.699598129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9b8b8f55f-js8v9,Uid:ac7eac49-3976-4e76-988d-1b85acd57174,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\"" May 13 23:46:16.702204 containerd[1504]: time="2025-05-13T23:46:16.702138892Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 23:46:17.300610 containerd[1504]: time="2025-05-13T23:46:17.300473372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bqlxj,Uid:93330b1b-7534-4bf8-9e94-6cc9683a3bbb,Namespace:kube-system,Attempt:0,}" May 13 23:46:17.302555 containerd[1504]: time="2025-05-13T23:46:17.301750093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kxjnf,Uid:d52acfd2-8155-4dfe-acd7-8d0bba5d8c44,Namespace:calico-system,Attempt:0,}" May 13 23:46:17.302555 containerd[1504]: time="2025-05-13T23:46:17.301888293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7dd84bf879-dbp7p,Uid:bb68557b-1129-460d-9d3e-e3f0bf7e8587,Namespace:calico-system,Attempt:0,}" May 13 23:46:17.302881 containerd[1504]: time="2025-05-13T23:46:17.302782974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85dd65f9fd-fvtcs,Uid:647137d8-a1af-489d-9320-1a34f6baa684,Namespace:calico-apiserver,Attempt:0,}" May 13 23:46:17.590986 systemd-networkd[1392]: cali80f936cec26: Link UP May 13 23:46:17.592289 systemd-networkd[1392]: cali80f936cec26: Gained carrier May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.369 [INFO][4118] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.411 [INFO][4118] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0 calico-kube-controllers-7dd84bf879- calico-system bb68557b-1129-460d-9d3e-e3f0bf7e8587 711 0 2025-05-13 23:45:54 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7dd84bf879 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a calico-kube-controllers-7dd84bf879-dbp7p eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali80f936cec26 [] []}} ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Namespace="calico-system" Pod="calico-kube-controllers-7dd84bf879-dbp7p" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.412 [INFO][4118] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Namespace="calico-system" Pod="calico-kube-controllers-7dd84bf879-dbp7p" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.479 [INFO][4159] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" 
HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.527 [INFO][4159] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000318b30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-732e99817a", "pod":"calico-kube-controllers-7dd84bf879-dbp7p", "timestamp":"2025-05-13 23:46:17.478034476 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.527 [INFO][4159] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.527 [INFO][4159] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.527 [INFO][4159] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a' May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.534 [INFO][4159] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.547 [INFO][4159] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.558 [INFO][4159] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.561 [INFO][4159] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.565 [INFO][4159] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.565 [INFO][4159] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.568 [INFO][4159] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9 May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.574 [INFO][4159] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.584 [INFO][4159] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.66/26] block=192.168.52.64/26 handle="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" host="ci-4284-0-0-n-732e99817a" May 13 
23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.584 [INFO][4159] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.66/26] handle="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.611935 containerd[1504]: 2025-05-13 23:46:17.584 [INFO][4159] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:46:17.612556 containerd[1504]: 2025-05-13 23:46:17.584 [INFO][4159] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.66/26] IPv6=[] ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:17.612556 containerd[1504]: 2025-05-13 23:46:17.586 [INFO][4118] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Namespace="calico-system" Pod="calico-kube-controllers-7dd84bf879-dbp7p" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0", GenerateName:"calico-kube-controllers-7dd84bf879-", Namespace:"calico-system", SelfLink:"", UID:"bb68557b-1129-460d-9d3e-e3f0bf7e8587", ResourceVersion:"711", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7dd84bf879", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"calico-kube-controllers-7dd84bf879-dbp7p", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.52.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali80f936cec26", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:17.612556 containerd[1504]: 2025-05-13 23:46:17.586 [INFO][4118] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.66/32] ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Namespace="calico-system" Pod="calico-kube-controllers-7dd84bf879-dbp7p" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:17.612556 containerd[1504]: 2025-05-13 23:46:17.586 [INFO][4118] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali80f936cec26 ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Namespace="calico-system" Pod="calico-kube-controllers-7dd84bf879-dbp7p" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 
23:46:17.612556 containerd[1504]: 2025-05-13 23:46:17.589 [INFO][4118] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Namespace="calico-system" Pod="calico-kube-controllers-7dd84bf879-dbp7p" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:17.612556 containerd[1504]: 2025-05-13 23:46:17.589 [INFO][4118] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Namespace="calico-system" Pod="calico-kube-controllers-7dd84bf879-dbp7p" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0", GenerateName:"calico-kube-controllers-7dd84bf879-", Namespace:"calico-system", SelfLink:"", UID:"bb68557b-1129-460d-9d3e-e3f0bf7e8587", ResourceVersion:"711", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7dd84bf879", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9", Pod:"calico-kube-controllers-7dd84bf879-dbp7p", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.52.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali80f936cec26", MAC:"ea:fd:d5:50:68:97", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:17.612739 containerd[1504]: 2025-05-13 23:46:17.607 [INFO][4118] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Namespace="calico-system" Pod="calico-kube-controllers-7dd84bf879-dbp7p" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:17.704385 systemd-networkd[1392]: cali1a43184599b: Link UP May 13 23:46:17.704600 systemd-networkd[1392]: cali1a43184599b: Gained carrier May 13 23:46:17.707381 containerd[1504]: time="2025-05-13T23:46:17.706904153Z" level=info msg="connecting to shim e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" address="unix:///run/containerd/s/cd69b12b2e92133eb339b9296dee1b458c5ff44e1fbe969998cad00b51622c0e" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.389 [INFO][4138] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.426 [INFO][4138] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0 calico-apiserver-85dd65f9fd- calico-apiserver 647137d8-a1af-489d-9320-1a34f6baa684 719 0 2025-05-13 23:45:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85dd65f9fd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a calico-apiserver-85dd65f9fd-fvtcs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1a43184599b [] []}} ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-fvtcs" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.426 [INFO][4138] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-fvtcs" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.502 [INFO][4175] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" HandleID="k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.541 [INFO][4175] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" HandleID="k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003ba5b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-732e99817a", "pod":"calico-apiserver-85dd65f9fd-fvtcs", "timestamp":"2025-05-13 23:46:17.502553181 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.541 [INFO][4175] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.585 [INFO][4175] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.585 [INFO][4175] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a' May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.635 [INFO][4175] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.648 [INFO][4175] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.660 [INFO][4175] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.667 [INFO][4175] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.675 [INFO][4175] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.675 [INFO][4175] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.680 [INFO][4175] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035 May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.688 [INFO][4175] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.698 [INFO][4175] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.67/26] block=192.168.52.64/26 handle="k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.698 [INFO][4175] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.67/26] handle="k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.698 [INFO][4175] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:46:17.725988 containerd[1504]: 2025-05-13 23:46:17.698 [INFO][4175] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.67/26] IPv6=[] ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" HandleID="k8s-pod-network.572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" May 13 23:46:17.726630 containerd[1504]: 2025-05-13 23:46:17.701 [INFO][4138] cni-plugin/k8s.go 386: Populated endpoint ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-fvtcs" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0", GenerateName:"calico-apiserver-85dd65f9fd-", Namespace:"calico-apiserver", SelfLink:"", UID:"647137d8-a1af-489d-9320-1a34f6baa684", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85dd65f9fd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"calico-apiserver-85dd65f9fd-fvtcs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.52.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1a43184599b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:17.726630 containerd[1504]: 2025-05-13 23:46:17.701 [INFO][4138] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.67/32] ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-fvtcs" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" May 13 23:46:17.726630 containerd[1504]: 2025-05-13 23:46:17.701 [INFO][4138] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1a43184599b ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-fvtcs" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" May 13 23:46:17.726630 containerd[1504]: 2025-05-13 23:46:17.704 [INFO][4138] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-fvtcs" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" May 13 23:46:17.726630 containerd[1504]: 2025-05-13 23:46:17.706 [INFO][4138] cni-plugin/k8s.go 414: 
Added Mac, interface name, and active container ID to endpoint ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-fvtcs" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0", GenerateName:"calico-apiserver-85dd65f9fd-", Namespace:"calico-apiserver", SelfLink:"", UID:"647137d8-a1af-489d-9320-1a34f6baa684", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85dd65f9fd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035", Pod:"calico-apiserver-85dd65f9fd-fvtcs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.52.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1a43184599b", MAC:"ea:ae:c6:05:cf:31", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:17.726630 containerd[1504]: 2025-05-13 23:46:17.722 [INFO][4138] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-fvtcs" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--fvtcs-eth0" May 13 23:46:17.760322 containerd[1504]: time="2025-05-13T23:46:17.760265289Z" level=info msg="connecting to shim 572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035" address="unix:///run/containerd/s/b23c1d752ca67457fa0a7e22977bf120658871f9353c7643c864862a64bc681a" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:17.788317 systemd[1]: Started cri-containerd-e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9.scope - libcontainer container e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9. May 13 23:46:17.806401 systemd[1]: Started cri-containerd-572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035.scope - libcontainer container 572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035. 
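
The IPAM traces above all follow the same shape: acquire the host-wide IPAM lock, confirm the host's block affinity for 192.168.52.64/26, load the block, claim the first free address, write the block back, release the lock. That is why successive pods on this node receive 192.168.52.65, .66, .67 in order (and, just below, .68 for coredns-668d6bf9bc-bqlxj). The following toy sketch reproduces that allocation pattern under stated assumptions — a deliberately simplified model, not Calico's IPAM code (it ignores handles, datastore writes, and reserved addresses):

```go
// Toy model of the per-host allocation pattern visible in the log: serialize
// on a host-wide lock, then hand out the first free address in the host's
// affine block (192.168.52.64/26 here). Not Calico's implementation.
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

type block struct {
	mu   sync.Mutex // stands in for the "host-wide IPAM lock" in the log
	cidr netip.Prefix
	used map[netip.Addr]bool
}

// assign claims the first free address after the network address of the block.
func (b *block) assign() (netip.Addr, error) {
	b.mu.Lock()
	defer b.mu.Unlock()
	for a := b.cidr.Addr().Next(); b.cidr.Contains(a); a = a.Next() {
		if !b.used[a] {
			b.used[a] = true
			return a, nil
		}
	}
	return netip.Addr{}, fmt.Errorf("block %s exhausted", b.cidr)
}

func main() {
	b := &block{
		cidr: netip.MustParsePrefix("192.168.52.64/26"),
		used: map[netip.Addr]bool{},
	}
	// Same order as the CNI ADDs in this log; prints .65, .66, .67, .68.
	for _, pod := range []string{
		"calico-apiserver-9b8b8f55f-js8v9",
		"calico-kube-controllers-7dd84bf879-dbp7p",
		"calico-apiserver-85dd65f9fd-fvtcs",
		"coredns-668d6bf9bc-bqlxj",
	} {
		ip, err := b.assign()
		if err != nil {
			panic(err)
		}
		fmt.Printf("%s -> %s/26\n", pod, ip)
	}
}
```

The host-wide lock acquisitions and releases bracketing each "Attempting to assign 1 addresses from block" line are what keep the four concurrent CNI ADDs from claiming the same address.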
May 13 23:46:17.828656 systemd-networkd[1392]: cali263f3599740: Link UP May 13 23:46:17.830384 systemd-networkd[1392]: cali263f3599740: Gained carrier May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.379 [INFO][4105] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.421 [INFO][4105] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0 coredns-668d6bf9bc- kube-system 93330b1b-7534-4bf8-9e94-6cc9683a3bbb 720 0 2025-05-13 23:45:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a coredns-668d6bf9bc-bqlxj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali263f3599740 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Namespace="kube-system" Pod="coredns-668d6bf9bc-bqlxj" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.421 [INFO][4105] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Namespace="kube-system" Pod="coredns-668d6bf9bc-bqlxj" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.504 [INFO][4166] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" HandleID="k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Workload="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.546 [INFO][4166] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" HandleID="k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Workload="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003bcad0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-732e99817a", "pod":"coredns-668d6bf9bc-bqlxj", "timestamp":"2025-05-13 23:46:17.504119903 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.547 [INFO][4166] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.698 [INFO][4166] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.698 [INFO][4166] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a' May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.735 [INFO][4166] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.746 [INFO][4166] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.760 [INFO][4166] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.766 [INFO][4166] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.775 [INFO][4166] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.775 [INFO][4166] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.781 [INFO][4166] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2 May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.798 [INFO][4166] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.811 [INFO][4166] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.68/26] block=192.168.52.64/26 handle="k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.811 [INFO][4166] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.68/26] handle="k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" host="ci-4284-0-0-n-732e99817a" May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.811 [INFO][4166] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
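[annotation] The IPAM cycle above is: confirm this node's affinity for block 192.168.52.64/26, load the block, claim the next free address (.68 for coredns-668d6bf9bc-bqlxj), and write the block back. A stdlib-only toy sketch of that scan — the function name is hypothetical and real Calico IPAM additionally reserves addresses, tracks handles, and persists the block, which this does not:

    package main

    import (
        "fmt"
        "net/netip"
    )

    // nextFree scans a block for the first unclaimed address; a toy stand-in
    // for ipam.go's "Attempting to assign 1 addresses from block".
    func nextFree(block netip.Prefix, used map[netip.Addr]bool) (netip.Addr, bool) {
        for a := block.Addr(); block.Contains(a); a = a.Next() {
            if !used[a] {
                return a, true
            }
        }
        return netip.Addr{}, false
    }

    func main() {
        block := netip.MustParsePrefix("192.168.52.64/26") // the node's affine block
        used := map[netip.Addr]bool{}
        // Assume .64-.66 went to earlier allocations in this log and .67 to
        // calico-apiserver-85dd65f9fd-fvtcs above.
        for _, s := range []string{"192.168.52.64", "192.168.52.65", "192.168.52.66", "192.168.52.67"} {
            used[netip.MustParseAddr(s)] = true
        }
        if a, ok := nextFree(block, used); ok {
            fmt.Println(a) // 192.168.52.68, matching the claim for coredns-668d6bf9bc-bqlxj
        }
    }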
May 13 23:46:17.871938 containerd[1504]: 2025-05-13 23:46:17.811 [INFO][4166] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.68/26] IPv6=[] ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" HandleID="k8s-pod-network.975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Workload="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" May 13 23:46:17.872511 containerd[1504]: 2025-05-13 23:46:17.815 [INFO][4105] cni-plugin/k8s.go 386: Populated endpoint ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Namespace="kube-system" Pod="coredns-668d6bf9bc-bqlxj" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"93330b1b-7534-4bf8-9e94-6cc9683a3bbb", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"coredns-668d6bf9bc-bqlxj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.52.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali263f3599740", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:17.872511 containerd[1504]: 2025-05-13 23:46:17.816 [INFO][4105] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.68/32] ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Namespace="kube-system" Pod="coredns-668d6bf9bc-bqlxj" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" May 13 23:46:17.872511 containerd[1504]: 2025-05-13 23:46:17.816 [INFO][4105] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali263f3599740 ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Namespace="kube-system" Pod="coredns-668d6bf9bc-bqlxj" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" May 13 23:46:17.872511 containerd[1504]: 2025-05-13 23:46:17.830 [INFO][4105] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Namespace="kube-system" Pod="coredns-668d6bf9bc-bqlxj" 
WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" May 13 23:46:17.872511 containerd[1504]: 2025-05-13 23:46:17.831 [INFO][4105] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Namespace="kube-system" Pod="coredns-668d6bf9bc-bqlxj" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"93330b1b-7534-4bf8-9e94-6cc9683a3bbb", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2", Pod:"coredns-668d6bf9bc-bqlxj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.52.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali263f3599740", MAC:"2e:45:27:b0:3b:ba", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:17.872691 containerd[1504]: 2025-05-13 23:46:17.864 [INFO][4105] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" Namespace="kube-system" Pod="coredns-668d6bf9bc-bqlxj" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--bqlxj-eth0" May 13 23:46:17.921595 containerd[1504]: time="2025-05-13T23:46:17.921002935Z" level=info msg="connecting to shim 975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2" address="unix:///run/containerd/s/10160088ff47baed8abd0d9c913bb1c7a3f8a0e2e86acb1336a86e7b997e675b" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:17.951767 systemd-networkd[1392]: cali2257faeed85: Link UP May 13 23:46:17.955294 systemd-networkd[1392]: cali2257faeed85: Gained carrier May 13 23:46:18.002500 systemd[1]: Started cri-containerd-975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2.scope - libcontainer container 975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2. 
May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.407 [INFO][4126] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.440 [INFO][4126] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0 csi-node-driver- calico-system d52acfd2-8155-4dfe-acd7-8d0bba5d8c44 618 0 2025-05-13 23:45:54 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5b5cc68cd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a csi-node-driver-kxjnf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali2257faeed85 [] []}} ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Namespace="calico-system" Pod="csi-node-driver-kxjnf" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.440 [INFO][4126] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Namespace="calico-system" Pod="csi-node-driver-kxjnf" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.526 [INFO][4174] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" HandleID="k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Workload="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.548 [INFO][4174] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" HandleID="k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Workload="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003b8910), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-732e99817a", "pod":"csi-node-driver-kxjnf", "timestamp":"2025-05-13 23:46:17.526715286 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.548 [INFO][4174] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.811 [INFO][4174] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
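[annotation] Note the timestamps: request [4174] (csi-node-driver-kxjnf) logged "About to acquire host-wide IPAM lock" at 17.548 but "Acquired" only at 17.811 — the instant request [4166] released it. The lock serializes concurrent CNI ADDs on a node so two pods can never claim the same address. A trivial illustration of that serialization (names and layout are hypothetical, not Calico's implementation):

    package main

    import (
        "fmt"
        "sync"
    )

    // hostIPAM serializes all assignments on a node, mirroring the
    // "host-wide IPAM lock" acquire/release pairs in the log above.
    type hostIPAM struct {
        mu   sync.Mutex
        next int
    }

    func (h *hostIPAM) assign(pod string) string {
        h.mu.Lock() // concurrent CNI ADDs queue here, as [4174] did behind [4166]
        defer h.mu.Unlock()
        ip := fmt.Sprintf("192.168.52.%d", 64+h.next)
        h.next++
        return ip + " -> " + pod
    }

    func main() {
        h := &hostIPAM{next: 4} // pretend .64-.67 are already taken
        var wg sync.WaitGroup
        for _, pod := range []string{"coredns-668d6bf9bc-bqlxj", "csi-node-driver-kxjnf"} {
            wg.Add(1)
            go func(p string) { defer wg.Done(); fmt.Println(h.assign(p)) }(pod)
        }
        wg.Wait()
    }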
May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.811 [INFO][4174] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a' May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.842 [INFO][4174] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.867 [INFO][4174] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.888 [INFO][4174] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.893 [INFO][4174] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.902 [INFO][4174] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.903 [INFO][4174] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.908 [INFO][4174] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002 May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.920 [INFO][4174] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.936 [INFO][4174] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.69/26] block=192.168.52.64/26 handle="k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.937 [INFO][4174] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.69/26] handle="k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.937 [INFO][4174] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:46:18.006987 containerd[1504]: 2025-05-13 23:46:17.937 [INFO][4174] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.69/26] IPv6=[] ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" HandleID="k8s-pod-network.80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Workload="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" May 13 23:46:18.008977 containerd[1504]: 2025-05-13 23:46:17.944 [INFO][4126] cni-plugin/k8s.go 386: Populated endpoint ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Namespace="calico-system" Pod="csi-node-driver-kxjnf" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d52acfd2-8155-4dfe-acd7-8d0bba5d8c44", ResourceVersion:"618", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"csi-node-driver-kxjnf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.52.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali2257faeed85", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:18.008977 containerd[1504]: 2025-05-13 23:46:17.945 [INFO][4126] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.69/32] ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Namespace="calico-system" Pod="csi-node-driver-kxjnf" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" May 13 23:46:18.008977 containerd[1504]: 2025-05-13 23:46:17.945 [INFO][4126] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2257faeed85 ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Namespace="calico-system" Pod="csi-node-driver-kxjnf" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" May 13 23:46:18.008977 containerd[1504]: 2025-05-13 23:46:17.959 [INFO][4126] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Namespace="calico-system" Pod="csi-node-driver-kxjnf" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" May 13 23:46:18.008977 containerd[1504]: 2025-05-13 23:46:17.960 [INFO][4126] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Namespace="calico-system" Pod="csi-node-driver-kxjnf" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d52acfd2-8155-4dfe-acd7-8d0bba5d8c44", ResourceVersion:"618", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002", Pod:"csi-node-driver-kxjnf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.52.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali2257faeed85", MAC:"da:18:6e:ba:02:92", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:18.008977 containerd[1504]: 2025-05-13 23:46:18.003 [INFO][4126] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" Namespace="calico-system" Pod="csi-node-driver-kxjnf" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-csi--node--driver--kxjnf-eth0" May 13 23:46:18.023655 containerd[1504]: time="2025-05-13T23:46:18.023499637Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7dd84bf879-dbp7p,Uid:bb68557b-1129-460d-9d3e-e3f0bf7e8587,Namespace:calico-system,Attempt:0,} returns sandbox id \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\"" May 13 23:46:18.046523 containerd[1504]: time="2025-05-13T23:46:18.046141896Z" level=info msg="connecting to shim 80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002" address="unix:///run/containerd/s/c403556a58fc7b775cd5c63f72fe910114b710224198ac34d577092a5b62aa78" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:18.070653 systemd[1]: Started cri-containerd-80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002.scope - libcontainer container 80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002. 
May 13 23:46:18.080874 containerd[1504]: time="2025-05-13T23:46:18.080809246Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85dd65f9fd-fvtcs,Uid:647137d8-a1af-489d-9320-1a34f6baa684,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035\"" May 13 23:46:18.106375 systemd-networkd[1392]: calib02b243a5f2: Gained IPv6LL May 13 23:46:18.123558 containerd[1504]: time="2025-05-13T23:46:18.122710841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bqlxj,Uid:93330b1b-7534-4bf8-9e94-6cc9683a3bbb,Namespace:kube-system,Attempt:0,} returns sandbox id \"975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2\"" May 13 23:46:18.128504 containerd[1504]: time="2025-05-13T23:46:18.128129766Z" level=info msg="CreateContainer within sandbox \"975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:46:18.144699 containerd[1504]: time="2025-05-13T23:46:18.144595860Z" level=info msg="Container b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:18.150107 containerd[1504]: time="2025-05-13T23:46:18.150034864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kxjnf,Uid:d52acfd2-8155-4dfe-acd7-8d0bba5d8c44,Namespace:calico-system,Attempt:0,} returns sandbox id \"80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002\"" May 13 23:46:18.154585 containerd[1504]: time="2025-05-13T23:46:18.154519908Z" level=info msg="CreateContainer within sandbox \"975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887\"" May 13 23:46:18.156515 containerd[1504]: time="2025-05-13T23:46:18.156474510Z" level=info msg="StartContainer for \"b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887\"" May 13 23:46:18.157626 containerd[1504]: time="2025-05-13T23:46:18.157592271Z" level=info msg="connecting to shim b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887" address="unix:///run/containerd/s/10160088ff47baed8abd0d9c913bb1c7a3f8a0e2e86acb1336a86e7b997e675b" protocol=ttrpc version=3 May 13 23:46:18.185977 systemd[1]: Started cri-containerd-b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887.scope - libcontainer container b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887. 
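[annotation] The RunPodSandbox / CreateContainer / StartContainer triples above are kubelet driving containerd through the CRI gRPC API on the same socket. A hedged sketch of a CRI v1 client listing the sandboxes whose IDs appear in the "returns sandbox id" lines; module paths are assumptions but this is the standard CRI client surface:

    package main

    import (
        "context"
        "fmt"
        "log"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // Dial the same containerd socket kubelet uses.
        conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        rt := runtimeapi.NewRuntimeServiceClient(conn)
        resp, err := rt.ListPodSandbox(context.Background(), &runtimeapi.ListPodSandboxRequest{})
        if err != nil {
            log.Fatal(err)
        }
        for _, s := range resp.Items {
            // Sandbox IDs match the "returns sandbox id" lines, e.g. 975a2de5...
            fmt.Println(s.Id, s.Metadata.Namespace, s.Metadata.Name)
        }
    }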
May 13 23:46:18.232598 containerd[1504]: time="2025-05-13T23:46:18.232237174Z" level=info msg="StartContainer for \"b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887\" returns successfully" May 13 23:46:18.299266 containerd[1504]: time="2025-05-13T23:46:18.298405190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zzzfq,Uid:f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f,Namespace:kube-system,Attempt:0,}" May 13 23:46:18.545931 systemd-networkd[1392]: cali800cb562ef2: Link UP May 13 23:46:18.546528 systemd-networkd[1392]: cali800cb562ef2: Gained carrier May 13 23:46:18.555926 kubelet[2780]: I0513 23:46:18.555790 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-bqlxj" podStartSLOduration=32.555753568 podStartE2EDuration="32.555753568s" podCreationTimestamp="2025-05-13 23:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:46:18.552831565 +0000 UTC m=+37.371097245" watchObservedRunningTime="2025-05-13 23:46:18.555753568 +0000 UTC m=+37.374019288" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.364 [INFO][4452] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.390 [INFO][4452] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0 coredns-668d6bf9bc- kube-system f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f 718 0 2025-05-13 23:45:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a coredns-668d6bf9bc-zzzfq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali800cb562ef2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-zzzfq" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.391 [INFO][4452] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-zzzfq" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.459 [INFO][4473] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" HandleID="k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Workload="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.477 [INFO][4473] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" HandleID="k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Workload="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d000), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-732e99817a", 
"pod":"coredns-668d6bf9bc-zzzfq", "timestamp":"2025-05-13 23:46:18.459336046 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.477 [INFO][4473] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.477 [INFO][4473] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.478 [INFO][4473] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a' May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.482 [INFO][4473] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.490 [INFO][4473] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.501 [INFO][4473] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.505 [INFO][4473] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.509 [INFO][4473] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.510 [INFO][4473] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.513 [INFO][4473] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.524 [INFO][4473] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.538 [INFO][4473] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.70/26] block=192.168.52.64/26 handle="k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.538 [INFO][4473] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.70/26] handle="k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" host="ci-4284-0-0-n-732e99817a" May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.538 [INFO][4473] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:46:18.594973 containerd[1504]: 2025-05-13 23:46:18.538 [INFO][4473] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.70/26] IPv6=[] ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" HandleID="k8s-pod-network.b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Workload="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" May 13 23:46:18.595968 containerd[1504]: 2025-05-13 23:46:18.542 [INFO][4452] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-zzzfq" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"coredns-668d6bf9bc-zzzfq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.52.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali800cb562ef2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:18.595968 containerd[1504]: 2025-05-13 23:46:18.542 [INFO][4452] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.70/32] ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-zzzfq" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" May 13 23:46:18.595968 containerd[1504]: 2025-05-13 23:46:18.542 [INFO][4452] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali800cb562ef2 ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-zzzfq" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" May 13 23:46:18.595968 containerd[1504]: 2025-05-13 23:46:18.544 [INFO][4452] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-zzzfq" 
WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" May 13 23:46:18.595968 containerd[1504]: 2025-05-13 23:46:18.545 [INFO][4452] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-zzzfq" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c", Pod:"coredns-668d6bf9bc-zzzfq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.52.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali800cb562ef2", MAC:"da:97:fc:1b:08:51", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:18.596208 containerd[1504]: 2025-05-13 23:46:18.590 [INFO][4452] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" Namespace="kube-system" Pod="coredns-668d6bf9bc-zzzfq" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-coredns--668d6bf9bc--zzzfq-eth0" May 13 23:46:18.647884 containerd[1504]: time="2025-05-13T23:46:18.647832766Z" level=info msg="connecting to shim b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c" address="unix:///run/containerd/s/257ee730a4b91b45f372aa83bf78ea76f9933f48bfa35fceaa6f405ac1232b5e" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:18.673370 systemd[1]: Started cri-containerd-b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c.scope - libcontainer container b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c. 
May 13 23:46:18.720970 containerd[1504]: time="2025-05-13T23:46:18.720919788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zzzfq,Uid:f581e8fe-5269-4fdd-84ed-e13bbf0a7b6f,Namespace:kube-system,Attempt:0,} returns sandbox id \"b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c\"" May 13 23:46:18.724943 containerd[1504]: time="2025-05-13T23:46:18.724818751Z" level=info msg="CreateContainer within sandbox \"b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:46:18.737555 containerd[1504]: time="2025-05-13T23:46:18.736786601Z" level=info msg="Container 23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:18.743047 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount727745413.mount: Deactivated successfully. May 13 23:46:18.752628 containerd[1504]: time="2025-05-13T23:46:18.752553014Z" level=info msg="CreateContainer within sandbox \"b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6\"" May 13 23:46:18.753385 containerd[1504]: time="2025-05-13T23:46:18.753342135Z" level=info msg="StartContainer for \"23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6\"" May 13 23:46:18.755150 containerd[1504]: time="2025-05-13T23:46:18.754953296Z" level=info msg="connecting to shim 23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6" address="unix:///run/containerd/s/257ee730a4b91b45f372aa83bf78ea76f9933f48bfa35fceaa6f405ac1232b5e" protocol=ttrpc version=3 May 13 23:46:18.779512 systemd[1]: Started cri-containerd-23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6.scope - libcontainer container 23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6. 
May 13 23:46:18.829415 containerd[1504]: time="2025-05-13T23:46:18.828394559Z" level=info msg="StartContainer for \"23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6\" returns successfully" May 13 23:46:18.874294 systemd-networkd[1392]: cali263f3599740: Gained IPv6LL May 13 23:46:19.066392 systemd-networkd[1392]: cali1a43184599b: Gained IPv6LL May 13 23:46:19.386360 systemd-networkd[1392]: cali80f936cec26: Gained IPv6LL May 13 23:46:19.450424 systemd-networkd[1392]: cali2257faeed85: Gained IPv6LL May 13 23:46:19.593020 kubelet[2780]: I0513 23:46:19.592930 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-zzzfq" podStartSLOduration=33.592893816 podStartE2EDuration="33.592893816s" podCreationTimestamp="2025-05-13 23:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:46:19.56876328 +0000 UTC m=+38.387028960" watchObservedRunningTime="2025-05-13 23:46:19.592893816 +0000 UTC m=+38.411159496" May 13 23:46:20.154461 systemd-networkd[1392]: cali800cb562ef2: Gained IPv6LL May 13 23:46:20.297409 containerd[1504]: time="2025-05-13T23:46:20.297363150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9b8b8f55f-twx7r,Uid:0aba1ede-c9a0-4e1e-accf-2cb417eef657,Namespace:calico-apiserver,Attempt:0,}" May 13 23:46:20.481169 systemd-networkd[1392]: cali19510292001: Link UP May 13 23:46:20.481443 systemd-networkd[1392]: cali19510292001: Gained carrier May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.331 [INFO][4599] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.348 [INFO][4599] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0 calico-apiserver-9b8b8f55f- calico-apiserver 0aba1ede-c9a0-4e1e-accf-2cb417eef657 716 0 2025-05-13 23:45:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9b8b8f55f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a calico-apiserver-9b8b8f55f-twx7r eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali19510292001 [] []}} ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-twx7r" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.348 [INFO][4599] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-twx7r" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.391 [INFO][4611] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:46:20.502946 
containerd[1504]: 2025-05-13 23:46:20.411 [INFO][4611] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000318a50), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-732e99817a", "pod":"calico-apiserver-9b8b8f55f-twx7r", "timestamp":"2025-05-13 23:46:20.391914395 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.411 [INFO][4611] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.411 [INFO][4611] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.411 [INFO][4611] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a' May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.416 [INFO][4611] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.425 [INFO][4611] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.437 [INFO][4611] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.443 [INFO][4611] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.448 [INFO][4611] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.448 [INFO][4611] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.452 [INFO][4611] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.464 [INFO][4611] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.474 [INFO][4611] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.71/26] block=192.168.52.64/26 handle="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.475 [INFO][4611] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.71/26] handle="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" 
host="ci-4284-0-0-n-732e99817a" May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.475 [INFO][4611] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:46:20.502946 containerd[1504]: 2025-05-13 23:46:20.475 [INFO][4611] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.71/26] IPv6=[] ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:46:20.503895 containerd[1504]: 2025-05-13 23:46:20.478 [INFO][4599] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-twx7r" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0", GenerateName:"calico-apiserver-9b8b8f55f-", Namespace:"calico-apiserver", SelfLink:"", UID:"0aba1ede-c9a0-4e1e-accf-2cb417eef657", ResourceVersion:"716", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9b8b8f55f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"calico-apiserver-9b8b8f55f-twx7r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.52.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali19510292001", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:20.503895 containerd[1504]: 2025-05-13 23:46:20.478 [INFO][4599] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.71/32] ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-twx7r" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:46:20.503895 containerd[1504]: 2025-05-13 23:46:20.478 [INFO][4599] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali19510292001 ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-twx7r" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:46:20.503895 containerd[1504]: 2025-05-13 23:46:20.480 [INFO][4599] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-twx7r" 
WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:46:20.503895 containerd[1504]: 2025-05-13 23:46:20.481 [INFO][4599] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-twx7r" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0", GenerateName:"calico-apiserver-9b8b8f55f-", Namespace:"calico-apiserver", SelfLink:"", UID:"0aba1ede-c9a0-4e1e-accf-2cb417eef657", ResourceVersion:"716", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 45, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9b8b8f55f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f", Pod:"calico-apiserver-9b8b8f55f-twx7r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.52.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali19510292001", MAC:"56:d5:0b:39:ee:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:20.503895 containerd[1504]: 2025-05-13 23:46:20.493 [INFO][4599] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Namespace="calico-apiserver" Pod="calico-apiserver-9b8b8f55f-twx7r" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:46:20.553175 containerd[1504]: time="2025-05-13T23:46:20.552116553Z" level=info msg="connecting to shim 7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" address="unix:///run/containerd/s/ac3de3152cd3f91e059d534600f2ce68b0cdea026c7074c22b19043f3eace96d" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:20.590584 kubelet[2780]: I0513 23:46:20.590545 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:46:20.639781 systemd[1]: Started cri-containerd-7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f.scope - libcontainer container 7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f. 
May 13 23:46:20.754519 containerd[1504]: time="2025-05-13T23:46:20.752904930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9b8b8f55f-twx7r,Uid:0aba1ede-c9a0-4e1e-accf-2cb417eef657,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\"" May 13 23:46:21.488164 kernel: bpftool[4737]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 13 23:46:21.627236 systemd-networkd[1392]: cali19510292001: Gained IPv6LL May 13 23:46:21.765403 systemd-networkd[1392]: vxlan.calico: Link UP May 13 23:46:21.765413 systemd-networkd[1392]: vxlan.calico: Gained carrier May 13 23:46:23.034433 systemd-networkd[1392]: vxlan.calico: Gained IPv6LL May 13 23:46:23.420952 containerd[1504]: time="2025-05-13T23:46:23.420791972Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:23.422489 containerd[1504]: time="2025-05-13T23:46:23.422296092Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=40247603" May 13 23:46:23.424691 containerd[1504]: time="2025-05-13T23:46:23.423626292Z" level=info msg="ImageCreate event name:\"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:23.426680 containerd[1504]: time="2025-05-13T23:46:23.426642932Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:46:23.427530 containerd[1504]: time="2025-05-13T23:46:23.427480492Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 6.72529796s" May 13 23:46:23.427676 containerd[1504]: time="2025-05-13T23:46:23.427657052Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\"" May 13 23:46:23.434428 containerd[1504]: time="2025-05-13T23:46:23.434386452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 13 23:46:23.446760 containerd[1504]: time="2025-05-13T23:46:23.446720372Z" level=info msg="CreateContainer within sandbox \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 23:46:23.456453 containerd[1504]: time="2025-05-13T23:46:23.456410251Z" level=info msg="Container 17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:23.465722 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3725564496.mount: Deactivated successfully. 
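[annotation] The ImageCreate / "Pulled image … in 6.72529796s" sequence above is containerd resolving the ghcr.io manifest, fetching layers, and unpacking a snapshot. A minimal sketch of the same pull via the containerd Go client:

    package main

    import (
        "context"
        "fmt"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
        // WithPullUnpack mirrors what CRI does: fetch the manifest and layers,
        // then unpack a snapshot so CreateContainer can use the image directly.
        img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/apiserver:v3.29.3",
            containerd.WithPullUnpack)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(img.Name(), img.Target().Digest) // repo digest as in the log
    }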
May 13 23:46:23.501588 containerd[1504]: time="2025-05-13T23:46:23.501527370Z" level=info msg="CreateContainer within sandbox \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\""
May 13 23:46:23.509820 containerd[1504]: time="2025-05-13T23:46:23.508016090Z" level=info msg="StartContainer for \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\""
May 13 23:46:23.509820 containerd[1504]: time="2025-05-13T23:46:23.509610930Z" level=info msg="connecting to shim 17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e" address="unix:///run/containerd/s/769bb359fbe225e7997cfd73f1c2f9447cebb940401e2e0cf504de178eaaa1cc" protocol=ttrpc version=3
May 13 23:46:23.537315 systemd[1]: Started cri-containerd-17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e.scope - libcontainer container 17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e.
May 13 23:46:23.602824 containerd[1504]: time="2025-05-13T23:46:23.602765448Z" level=info msg="StartContainer for \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" returns successfully"
May 13 23:46:24.620301 kubelet[2780]: I0513 23:46:24.619911 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9b8b8f55f-js8v9" podStartSLOduration=24.887019528 podStartE2EDuration="31.619788809s" podCreationTimestamp="2025-05-13 23:45:53 +0000 UTC" firstStartedPulling="2025-05-13 23:46:16.701416971 +0000 UTC m=+35.519682651" lastFinishedPulling="2025-05-13 23:46:23.434186252 +0000 UTC m=+42.252451932" observedRunningTime="2025-05-13 23:46:24.61525897 +0000 UTC m=+43.433524650" watchObservedRunningTime="2025-05-13 23:46:24.619788809 +0000 UTC m=+43.438054489"
May 13 23:46:25.597671 kubelet[2780]: I0513 23:46:25.597621 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 13 23:46:26.121822 containerd[1504]: time="2025-05-13T23:46:26.121764592Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:26.123814 containerd[1504]: time="2025-05-13T23:46:26.123305032Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=32554116"
May 13 23:46:26.124411 containerd[1504]: time="2025-05-13T23:46:26.124376591Z" level=info msg="ImageCreate event name:\"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:26.128023 containerd[1504]: time="2025-05-13T23:46:26.127960590Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"33923266\" in 2.693363938s"
May 13 23:46:26.128444 containerd[1504]: time="2025-05-13T23:46:26.128132749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:26.128744 containerd[1504]: time="2025-05-13T23:46:26.128716149Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\""
May 13 23:46:26.130092 containerd[1504]: time="2025-05-13T23:46:26.129728349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\""
May 13 23:46:26.169152 containerd[1504]: time="2025-05-13T23:46:26.169094170Z" level=info msg="CreateContainer within sandbox \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
May 13 23:46:26.181418 containerd[1504]: time="2025-05-13T23:46:26.181369644Z" level=info msg="Container 0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3: CDI devices from CRI Config.CDIDevices: []"
May 13 23:46:26.190143 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2320938985.mount: Deactivated successfully.
May 13 23:46:26.223947 containerd[1504]: time="2025-05-13T23:46:26.223793144Z" level=info msg="CreateContainer within sandbox \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\""
May 13 23:46:26.229177 containerd[1504]: time="2025-05-13T23:46:26.226239422Z" level=info msg="StartContainer for \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\""
May 13 23:46:26.230629 containerd[1504]: time="2025-05-13T23:46:26.230592180Z" level=info msg="connecting to shim 0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3" address="unix:///run/containerd/s/cd69b12b2e92133eb339b9296dee1b458c5ff44e1fbe969998cad00b51622c0e" protocol=ttrpc version=3
May 13 23:46:26.260386 systemd[1]: Started cri-containerd-0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3.scope - libcontainer container 0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3.
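[Editor's note] The pod_startup_latency_tracker entry above encodes simple arithmetic: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check in Go using the timestamps from the js8v9 entry; the layout string is Go's default time formatting, which these kubelet fields follow:

```go
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-05-13 23:45:53 +0000 UTC")
	firstPull := mustParse("2025-05-13 23:46:16.701416971 +0000 UTC")
	lastPull := mustParse("2025-05-13 23:46:23.434186252 +0000 UTC")
	running := mustParse("2025-05-13 23:46:24.619788809 +0000 UTC")

	e2e := running.Sub(created)          // 31.619788809s = podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 24.887019528s = podStartSLOduration (pull time excluded)
	fmt.Println(e2e, slo)
}
```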
May 13 23:46:26.315048 containerd[1504]: time="2025-05-13T23:46:26.315011140Z" level=info msg="StartContainer for \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" returns successfully"
May 13 23:46:26.534180 containerd[1504]: time="2025-05-13T23:46:26.534019275Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:26.537466 containerd[1504]: time="2025-05-13T23:46:26.537381873Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77"
May 13 23:46:26.541745 containerd[1504]: time="2025-05-13T23:46:26.541676311Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 411.908682ms"
May 13 23:46:26.541745 containerd[1504]: time="2025-05-13T23:46:26.541743111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\""
May 13 23:46:26.545871 containerd[1504]: time="2025-05-13T23:46:26.545188430Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\""
May 13 23:46:26.550195 containerd[1504]: time="2025-05-13T23:46:26.549902507Z" level=info msg="CreateContainer within sandbox \"572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 13 23:46:26.564289 containerd[1504]: time="2025-05-13T23:46:26.563080781Z" level=info msg="Container 199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8: CDI devices from CRI Config.CDIDevices: []"
May 13 23:46:26.573274 containerd[1504]: time="2025-05-13T23:46:26.573225376Z" level=info msg="CreateContainer within sandbox \"572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8\""
May 13 23:46:26.574587 containerd[1504]: time="2025-05-13T23:46:26.574546296Z" level=info msg="StartContainer for \"199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8\""
May 13 23:46:26.575824 containerd[1504]: time="2025-05-13T23:46:26.575765015Z" level=info msg="connecting to shim 199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8" address="unix:///run/containerd/s/b23c1d752ca67457fa0a7e22977bf120658871f9353c7643c864862a64bc681a" protocol=ttrpc version=3
May 13 23:46:26.599843 systemd[1]: Started cri-containerd-199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8.scope - libcontainer container 199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8.
May 13 23:46:26.633252 kubelet[2780]: I0513 23:46:26.633187 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7dd84bf879-dbp7p" podStartSLOduration=24.529066838 podStartE2EDuration="32.633157748s" podCreationTimestamp="2025-05-13 23:45:54 +0000 UTC" firstStartedPulling="2025-05-13 23:46:18.025512159 +0000 UTC m=+36.843777799" lastFinishedPulling="2025-05-13 23:46:26.129603069 +0000 UTC m=+44.947868709" observedRunningTime="2025-05-13 23:46:26.629152469 +0000 UTC m=+45.447418149" watchObservedRunningTime="2025-05-13 23:46:26.633157748 +0000 UTC m=+45.451423388"
May 13 23:46:26.689850 containerd[1504]: time="2025-05-13T23:46:26.689731440Z" level=info msg="StartContainer for \"199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8\" returns successfully"
May 13 23:46:26.693101 containerd[1504]: time="2025-05-13T23:46:26.692113039Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" id:\"fa697acf654426d9e48dfd501905a629dcde5fe63a55080537d39af51b7d303b\" pid:4943 exited_at:{seconds:1747179986 nanos:691155200}"
May 13 23:46:27.635318 kubelet[2780]: I0513 23:46:27.635215 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-85dd65f9fd-fvtcs" podStartSLOduration=24.174392955 podStartE2EDuration="32.635191857s" podCreationTimestamp="2025-05-13 23:45:55 +0000 UTC" firstStartedPulling="2025-05-13 23:46:18.083793088 +0000 UTC m=+36.902058768" lastFinishedPulling="2025-05-13 23:46:26.54459191 +0000 UTC m=+45.362857670" observedRunningTime="2025-05-13 23:46:27.634958657 +0000 UTC m=+46.453224297" watchObservedRunningTime="2025-05-13 23:46:27.635191857 +0000 UTC m=+46.453457537"
May 13 23:46:28.217392 containerd[1504]: time="2025-05-13T23:46:28.217314905Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:28.218644 containerd[1504]: time="2025-05-13T23:46:28.218307464Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7474935"
May 13 23:46:28.219742 containerd[1504]: time="2025-05-13T23:46:28.219461623Z" level=info msg="ImageCreate event name:\"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:28.222495 containerd[1504]: time="2025-05-13T23:46:28.222455021Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:28.222979 containerd[1504]: time="2025-05-13T23:46:28.222942660Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"8844117\" in 1.67770075s"
May 13 23:46:28.223038 containerd[1504]: time="2025-05-13T23:46:28.222979860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\""
May 13 23:46:28.225316 containerd[1504]: time="2025-05-13T23:46:28.225270498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\""
May 13 23:46:28.236815 containerd[1504]: time="2025-05-13T23:46:28.236308690Z" level=info msg="CreateContainer within sandbox \"80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
May 13 23:46:28.255110 containerd[1504]: time="2025-05-13T23:46:28.250853759Z" level=info msg="Container 1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e: CDI devices from CRI Config.CDIDevices: []"
May 13 23:46:28.278129 containerd[1504]: time="2025-05-13T23:46:28.278018898Z" level=info msg="CreateContainer within sandbox \"80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e\""
May 13 23:46:28.278945 containerd[1504]: time="2025-05-13T23:46:28.278899098Z" level=info msg="StartContainer for \"1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e\""
May 13 23:46:28.284183 containerd[1504]: time="2025-05-13T23:46:28.284000334Z" level=info msg="connecting to shim 1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e" address="unix:///run/containerd/s/c403556a58fc7b775cd5c63f72fe910114b710224198ac34d577092a5b62aa78" protocol=ttrpc version=3
May 13 23:46:28.312453 systemd[1]: Started cri-containerd-1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e.scope - libcontainer container 1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e.
May 13 23:46:28.377855 containerd[1504]: time="2025-05-13T23:46:28.377446823Z" level=info msg="StartContainer for \"1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e\" returns successfully"
May 13 23:46:28.628175 kubelet[2780]: I0513 23:46:28.627658 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 13 23:46:28.637509 containerd[1504]: time="2025-05-13T23:46:28.637456025Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:28.640648 containerd[1504]: time="2025-05-13T23:46:28.640095943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77"
May 13 23:46:28.652132 containerd[1504]: time="2025-05-13T23:46:28.651552494Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 426.235756ms"
May 13 23:46:28.652132 containerd[1504]: time="2025-05-13T23:46:28.651608454Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\""
May 13 23:46:28.657170 containerd[1504]: time="2025-05-13T23:46:28.656940290Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\""
May 13 23:46:28.658448 containerd[1504]: time="2025-05-13T23:46:28.658319009Z" level=info msg="CreateContainer within sandbox \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 13 23:46:28.668934 containerd[1504]: time="2025-05-13T23:46:28.667951162Z" level=info msg="Container e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023: CDI devices from CRI Config.CDIDevices: []"
May 13 23:46:28.685015 containerd[1504]: time="2025-05-13T23:46:28.684551669Z" level=info msg="CreateContainer within sandbox \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\""
May 13 23:46:28.692829 containerd[1504]: time="2025-05-13T23:46:28.692745823Z" level=info msg="StartContainer for \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\""
May 13 23:46:28.694212 containerd[1504]: time="2025-05-13T23:46:28.694115382Z" level=info msg="connecting to shim e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023" address="unix:///run/containerd/s/ac3de3152cd3f91e059d534600f2ce68b0cdea026c7074c22b19043f3eace96d" protocol=ttrpc version=3
May 13 23:46:28.720271 systemd[1]: Started cri-containerd-e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023.scope - libcontainer container e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023.
May 13 23:46:28.771724 containerd[1504]: time="2025-05-13T23:46:28.771410243Z" level=info msg="StartContainer for \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" returns successfully"
May 13 23:46:30.635315 kubelet[2780]: I0513 23:46:30.634574 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 13 23:46:31.712842 containerd[1504]: time="2025-05-13T23:46:31.712764807Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:31.715099 containerd[1504]: time="2025-05-13T23:46:31.715008885Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13124299"
May 13 23:46:31.717664 containerd[1504]: time="2025-05-13T23:46:31.716842363Z" level=info msg="ImageCreate event name:\"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:31.719359 containerd[1504]: time="2025-05-13T23:46:31.719276800Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:46:31.720110 containerd[1504]: time="2025-05-13T23:46:31.719943759Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"14493433\" in 3.062955709s"
May 13 23:46:31.720110 containerd[1504]: time="2025-05-13T23:46:31.719976719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\""
May 13 23:46:31.726839 containerd[1504]: time="2025-05-13T23:46:31.726760831Z" level=info msg="CreateContainer within sandbox \"80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 13 23:46:31.740100 containerd[1504]: time="2025-05-13T23:46:31.738701537Z" level=info msg="Container 5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1: CDI devices from CRI Config.CDIDevices: []"
May 13 23:46:31.757814 containerd[1504]: time="2025-05-13T23:46:31.757734276Z" level=info msg="CreateContainer within sandbox \"80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1\""
May 13 23:46:31.760353 containerd[1504]: time="2025-05-13T23:46:31.760293673Z" level=info msg="StartContainer for \"5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1\""
May 13 23:46:31.761885 containerd[1504]: time="2025-05-13T23:46:31.761850551Z" level=info msg="connecting to shim 5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1" address="unix:///run/containerd/s/c403556a58fc7b775cd5c63f72fe910114b710224198ac34d577092a5b62aa78" protocol=ttrpc version=3
May 13 23:46:31.789404 systemd[1]: Started cri-containerd-5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1.scope - libcontainer container 5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1.
May 13 23:46:31.845962 containerd[1504]: time="2025-05-13T23:46:31.845098655Z" level=info msg="StartContainer for \"5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1\" returns successfully"
May 13 23:46:32.463494 kubelet[2780]: I0513 23:46:32.463017 2780 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
May 13 23:46:32.463494 kubelet[2780]: I0513 23:46:32.463097 2780 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
May 13 23:46:32.670638 kubelet[2780]: I0513 23:46:32.670282 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9b8b8f55f-twx7r" podStartSLOduration=31.770297183 podStartE2EDuration="39.670262943s" podCreationTimestamp="2025-05-13 23:45:53 +0000 UTC" firstStartedPulling="2025-05-13 23:46:20.755148211 +0000 UTC m=+39.573413891" lastFinishedPulling="2025-05-13 23:46:28.655113971 +0000 UTC m=+47.473379651" observedRunningTime="2025-05-13 23:46:29.659495839 +0000 UTC m=+48.477761519" watchObservedRunningTime="2025-05-13 23:46:32.670262943 +0000 UTC m=+51.488528623"
May 13 23:46:39.331087 kubelet[2780]: I0513 23:46:39.330310 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 13 23:46:39.362980 kubelet[2780]: I0513 23:46:39.362904 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-kxjnf" podStartSLOduration=31.793400375 podStartE2EDuration="45.362867464s" podCreationTimestamp="2025-05-13 23:45:54 +0000 UTC" firstStartedPulling="2025-05-13 23:46:18.153142507 +0000 UTC m=+36.971408187" lastFinishedPulling="2025-05-13 23:46:31.722609596 +0000 UTC m=+50.540875276" observedRunningTime="2025-05-13 23:46:32.670049183 +0000 UTC m=+51.488314983" watchObservedRunningTime="2025-05-13 23:46:39.362867464 +0000 UTC m=+58.181133184"
May 13 23:46:45.294960 containerd[1504]: time="2025-05-13T23:46:45.294911710Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" id:\"3fb58010030239f05f7eb436f01add2dd9240f6c2eebb0d7a830f47573c36907\" pid:5111 exited_at:{seconds:1747180005 nanos:294576111}"
May 13 23:46:47.505434 kubelet[2780]: I0513 23:46:47.504825 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 13 23:46:47.597270 kubelet[2780]: I0513 23:46:47.597219 2780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 13 23:46:47.599418 containerd[1504]: time="2025-05-13T23:46:47.599207052Z" level=info msg="StopContainer for \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" with timeout 30 (s)"
May 13 23:46:47.609799 containerd[1504]: time="2025-05-13T23:46:47.609645063Z" level=info msg="Stop container \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" with signal terminated"
May 13 23:46:47.644619 systemd[1]: cri-containerd-e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023.scope: Deactivated successfully.
May 13 23:46:47.645056 systemd[1]: cri-containerd-e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023.scope: Consumed 1.083s CPU time, 41.2M memory peak.
May 13 23:46:47.663593 containerd[1504]: time="2025-05-13T23:46:47.663542797Z" level=info msg="received exit event container_id:\"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" id:\"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" pid:5022 exit_status:1 exited_at:{seconds:1747180007 nanos:662399760}"
May 13 23:46:47.664918 containerd[1504]: time="2025-05-13T23:46:47.663802717Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" id:\"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" pid:5022 exit_status:1 exited_at:{seconds:1747180007 nanos:662399760}"
May 13 23:46:47.675989 systemd[1]: Created slice kubepods-besteffort-pod97c15e15_76da_4040_8dc2_c62f1e57fd36.slice - libcontainer container kubepods-besteffort-pod97c15e15_76da_4040_8dc2_c62f1e57fd36.slice.
May 13 23:46:47.722799 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023-rootfs.mount: Deactivated successfully.
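[Editor's note] A few entries back, csi_plugin.go records the kubelet validating and then registering the csi.tigera.io driver at its socket. The identity handshake behind that validation can be approximated with the CSI spec's gRPC API; a hedged sketch assuming the spec's Go bindings, not the kubelet's actual plugin-manager code path:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	csi "github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// Dial the unix socket the kubelet logged for csi.tigera.io.
	conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi.tigera.io/csi.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Identity.GetPluginInfo returns the driver name the kubelet validates.
	info, err := csi.NewIdentityClient(conn).GetPluginInfo(ctx, &csi.GetPluginInfoRequest{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(info.GetName(), info.GetVendorVersion()) // expect "csi.tigera.io"
}
```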
May 13 23:46:47.739251 containerd[1504]: time="2025-05-13T23:46:47.739201312Z" level=info msg="StopContainer for \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" returns successfully"
May 13 23:46:47.746051 containerd[1504]: time="2025-05-13T23:46:47.745988374Z" level=info msg="StopPodSandbox for \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\""
May 13 23:46:47.746855 containerd[1504]: time="2025-05-13T23:46:47.746523013Z" level=info msg="Container to stop \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
May 13 23:46:47.761780 kubelet[2780]: I0513 23:46:47.761438 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/97c15e15-76da-4040-8dc2-c62f1e57fd36-calico-apiserver-certs\") pod \"calico-apiserver-85dd65f9fd-zbh5s\" (UID: \"97c15e15-76da-4040-8dc2-c62f1e57fd36\") " pod="calico-apiserver/calico-apiserver-85dd65f9fd-zbh5s"
May 13 23:46:47.761780 kubelet[2780]: I0513 23:46:47.761615 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks49h\" (UniqueName: \"kubernetes.io/projected/97c15e15-76da-4040-8dc2-c62f1e57fd36-kube-api-access-ks49h\") pod \"calico-apiserver-85dd65f9fd-zbh5s\" (UID: \"97c15e15-76da-4040-8dc2-c62f1e57fd36\") " pod="calico-apiserver/calico-apiserver-85dd65f9fd-zbh5s"
May 13 23:46:47.770314 systemd[1]: cri-containerd-7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f.scope: Deactivated successfully.
May 13 23:46:47.770616 systemd[1]: cri-containerd-7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f.scope: Consumed 27ms CPU time, 3.9M memory peak, 1M read from disk.
May 13 23:46:47.779571 containerd[1504]: time="2025-05-13T23:46:47.779495483Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" id:\"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" pid:4680 exit_status:137 exited_at:{seconds:1747180007 nanos:777543888}"
May 13 23:46:47.818047 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f-rootfs.mount: Deactivated successfully.
May 13 23:46:47.821024 containerd[1504]: time="2025-05-13T23:46:47.820767091Z" level=info msg="shim disconnected" id=7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f namespace=k8s.io
May 13 23:46:47.821024 containerd[1504]: time="2025-05-13T23:46:47.820814371Z" level=warning msg="cleaning up after shim disconnected" id=7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f namespace=k8s.io
May 13 23:46:47.821024 containerd[1504]: time="2025-05-13T23:46:47.820851171Z" level=info msg="cleaning up dead shim" namespace=k8s.io
May 13 23:46:47.851325 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f-shm.mount: Deactivated successfully.
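[Editor's note] The StopContainer/StopPodSandbox sequence above is the standard SIGTERM-with-grace-period flow ("with timeout 30 (s)", then "with signal terminated"). Against the containerd task API it looks roughly like the sketch below; the function name and structure are illustrative, while the Task method calls are the real client API:

```go
package main

import (
	"context"
	"syscall"
	"time"

	containerd "github.com/containerd/containerd"
)

// stopTask sends SIGTERM, waits up to gracePeriod, then falls back to SIGKILL,
// mirroring the "StopContainer ... with timeout 30 (s)" entries in the log.
func stopTask(ctx context.Context, task containerd.Task, gracePeriod time.Duration) (uint32, error) {
	statusC, err := task.Wait(ctx) // subscribe to the exit event first
	if err != nil {
		return 0, err
	}
	if err := task.Kill(ctx, syscall.SIGTERM); err != nil {
		return 0, err
	}
	select {
	case st := <-statusC: // exited inside the grace period (exit_status:1 above)
		code, _, err := st.Result()
		return code, err
	case <-time.After(gracePeriod): // grace period elapsed: force kill
		if err := task.Kill(ctx, syscall.SIGKILL); err != nil {
			return 0, err
		}
		st := <-statusC
		code, _, err := st.Result()
		return code, err
	}
}
```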
May 13 23:46:47.857357 containerd[1504]: time="2025-05-13T23:46:47.857276832Z" level=info msg="received exit event sandbox_id:\"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" exit_status:137 exited_at:{seconds:1747180007 nanos:777543888}"
May 13 23:46:47.966008 systemd-networkd[1392]: cali19510292001: Link DOWN
May 13 23:46:47.966016 systemd-networkd[1392]: cali19510292001: Lost carrier
May 13 23:46:47.989359 containerd[1504]: time="2025-05-13T23:46:47.989312395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85dd65f9fd-zbh5s,Uid:97c15e15-76da-4040-8dc2-c62f1e57fd36,Namespace:calico-apiserver,Attempt:0,}"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:47.963 [INFO][5204] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:47.964 [INFO][5204] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" iface="eth0" netns="/var/run/netns/cni-0e30959c-1b40-8eb0-c36f-373cd01bbcff"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:47.964 [INFO][5204] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" iface="eth0" netns="/var/run/netns/cni-0e30959c-1b40-8eb0-c36f-373cd01bbcff"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:47.977 [INFO][5204] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" after=13.073164ms iface="eth0" netns="/var/run/netns/cni-0e30959c-1b40-8eb0-c36f-373cd01bbcff"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:47.977 [INFO][5204] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:47.977 [INFO][5204] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:48.013 [INFO][5218] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:48.014 [INFO][5218] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:48.014 [INFO][5218] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:48.116 [INFO][5218] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:48.116 [INFO][5218] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0"
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:48.121 [INFO][5218] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 23:46:48.126512 containerd[1504]: 2025-05-13 23:46:48.123 [INFO][5204] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f"
May 13 23:46:48.128234 containerd[1504]: time="2025-05-13T23:46:48.127786410Z" level=info msg="TearDown network for sandbox \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" successfully"
May 13 23:46:48.128234 containerd[1504]: time="2025-05-13T23:46:48.127824850Z" level=info msg="StopPodSandbox for \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" returns successfully"
May 13 23:46:48.266744 kubelet[2780]: I0513 23:46:48.266311 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0aba1ede-c9a0-4e1e-accf-2cb417eef657-calico-apiserver-certs\") pod \"0aba1ede-c9a0-4e1e-accf-2cb417eef657\" (UID: \"0aba1ede-c9a0-4e1e-accf-2cb417eef657\") "
May 13 23:46:48.266744 kubelet[2780]: I0513 23:46:48.266365 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84mq7\" (UniqueName: \"kubernetes.io/projected/0aba1ede-c9a0-4e1e-accf-2cb417eef657-kube-api-access-84mq7\") pod \"0aba1ede-c9a0-4e1e-accf-2cb417eef657\" (UID: \"0aba1ede-c9a0-4e1e-accf-2cb417eef657\") "
May 13 23:46:48.271630 systemd-networkd[1392]: calif1c731aee63: Link UP
May 13 23:46:48.271859 systemd-networkd[1392]: calif1c731aee63: Gained carrier
May 13 23:46:48.275722 kubelet[2780]: I0513 23:46:48.275657 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aba1ede-c9a0-4e1e-accf-2cb417eef657-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "0aba1ede-c9a0-4e1e-accf-2cb417eef657" (UID: "0aba1ede-c9a0-4e1e-accf-2cb417eef657"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 13 23:46:48.277003 kubelet[2780]: I0513 23:46:48.276947 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aba1ede-c9a0-4e1e-accf-2cb417eef657-kube-api-access-84mq7" (OuterVolumeSpecName: "kube-api-access-84mq7") pod "0aba1ede-c9a0-4e1e-accf-2cb417eef657" (UID: "0aba1ede-c9a0-4e1e-accf-2cb417eef657"). InnerVolumeSpecName "kube-api-access-84mq7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.052 [INFO][5226] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0 calico-apiserver-85dd65f9fd- calico-apiserver 97c15e15-76da-4040-8dc2-c62f1e57fd36 957 0 2025-05-13 23:46:47 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85dd65f9fd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a calico-apiserver-85dd65f9fd-zbh5s eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif1c731aee63 [] []}} ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-zbh5s" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.055 [INFO][5226] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-zbh5s" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.123 [INFO][5241] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" HandleID="k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.149 [INFO][5241] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" HandleID="k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400028c6b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-732e99817a", "pod":"calico-apiserver-85dd65f9fd-zbh5s", "timestamp":"2025-05-13 23:46:48.123297183 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.150 [INFO][5241] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.150 [INFO][5241] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.150 [INFO][5241] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a'
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.159 [INFO][5241] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.213 [INFO][5241] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.227 [INFO][5241] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.233 [INFO][5241] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.238 [INFO][5241] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.238 [INFO][5241] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.242 [INFO][5241] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.250 [INFO][5241] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.261 [INFO][5241] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.72/26] block=192.168.52.64/26 handle="k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.261 [INFO][5241] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.72/26] handle="k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" host="ci-4284-0-0-n-732e99817a"
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.261 [INFO][5241] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
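[Editor's note] The IPAM trace above confirms the host-affine block 192.168.52.64/26 and then claims 192.168.52.72 from it. The containment claim is easy to verify with the Go standard library:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.52.64/26") // covers .64 through .127
	addr := netip.MustParseAddr("192.168.52.72")       // the address IPAM claimed
	fmt.Println(block.Contains(addr))                  // true
}
```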
May 13 23:46:48.288880 containerd[1504]: 2025-05-13 23:46:48.261 [INFO][5241] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.72/26] IPv6=[] ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" HandleID="k8s-pod-network.74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0"
May 13 23:46:48.289872 containerd[1504]: 2025-05-13 23:46:48.265 [INFO][5226] cni-plugin/k8s.go 386: Populated endpoint ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-zbh5s" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0", GenerateName:"calico-apiserver-85dd65f9fd-", Namespace:"calico-apiserver", SelfLink:"", UID:"97c15e15-76da-4040-8dc2-c62f1e57fd36", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 46, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85dd65f9fd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"calico-apiserver-85dd65f9fd-zbh5s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.52.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif1c731aee63", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 23:46:48.289872 containerd[1504]: 2025-05-13 23:46:48.265 [INFO][5226] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.72/32] ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-zbh5s" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0"
May 13 23:46:48.289872 containerd[1504]: 2025-05-13 23:46:48.265 [INFO][5226] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif1c731aee63 ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-zbh5s" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0"
May 13 23:46:48.289872 containerd[1504]: 2025-05-13 23:46:48.271 [INFO][5226] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-zbh5s" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0"
May 13 23:46:48.289872 containerd[1504]: 2025-05-13 23:46:48.272 [INFO][5226] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-zbh5s" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0", GenerateName:"calico-apiserver-85dd65f9fd-", Namespace:"calico-apiserver", SelfLink:"", UID:"97c15e15-76da-4040-8dc2-c62f1e57fd36", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 46, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85dd65f9fd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415", Pod:"calico-apiserver-85dd65f9fd-zbh5s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.52.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif1c731aee63", MAC:"4e:d8:fe:d3:99:23", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 23:46:48.289872 containerd[1504]: 2025-05-13 23:46:48.284 [INFO][5226] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" Namespace="calico-apiserver" Pod="calico-apiserver-85dd65f9fd-zbh5s" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--85dd65f9fd--zbh5s-eth0"
May 13 23:46:48.319810 containerd[1504]: time="2025-05-13T23:46:48.319593756Z" level=info msg="connecting to shim 74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415" address="unix:///run/containerd/s/a552e6c629332f57ccb7111a735d12f37d76de4a0a5ff42c312395d85c1efc2d" namespace=k8s.io protocol=ttrpc version=3
May 13 23:46:48.348404 systemd[1]: Started cri-containerd-74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415.scope - libcontainer container 74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415.
May 13 23:46:48.367101 kubelet[2780]: I0513 23:46:48.367010 2780 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0aba1ede-c9a0-4e1e-accf-2cb417eef657-calico-apiserver-certs\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\""
May 13 23:46:48.367101 kubelet[2780]: I0513 23:46:48.367043 2780 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-84mq7\" (UniqueName: \"kubernetes.io/projected/0aba1ede-c9a0-4e1e-accf-2cb417eef657-kube-api-access-84mq7\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\""
May 13 23:46:48.394109 containerd[1504]: time="2025-05-13T23:46:48.392347674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85dd65f9fd-zbh5s,Uid:97c15e15-76da-4040-8dc2-c62f1e57fd36,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415\""
May 13 23:46:48.398592 containerd[1504]: time="2025-05-13T23:46:48.398436337Z" level=info msg="CreateContainer within sandbox \"74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 13 23:46:48.408614 containerd[1504]: time="2025-05-13T23:46:48.408565069Z" level=info msg="Container ccc165ff87a8ac44fa7435c42c01301d4691dcea96418b74b2f0fcd68ba284fe: CDI devices from CRI Config.CDIDevices: []"
May 13 23:46:48.420480 containerd[1504]: time="2025-05-13T23:46:48.420391756Z" level=info msg="CreateContainer within sandbox \"74418a930613f0e6dcf86d1da5cd7d898ab78aff07ad9b73f05709abae2de415\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ccc165ff87a8ac44fa7435c42c01301d4691dcea96418b74b2f0fcd68ba284fe\""
May 13 23:46:48.421796 containerd[1504]: time="2025-05-13T23:46:48.421675232Z" level=info msg="StartContainer for \"ccc165ff87a8ac44fa7435c42c01301d4691dcea96418b74b2f0fcd68ba284fe\""
May 13 23:46:48.424620 containerd[1504]: time="2025-05-13T23:46:48.423879546Z" level=info msg="connecting to shim ccc165ff87a8ac44fa7435c42c01301d4691dcea96418b74b2f0fcd68ba284fe" address="unix:///run/containerd/s/a552e6c629332f57ccb7111a735d12f37d76de4a0a5ff42c312395d85c1efc2d" protocol=ttrpc version=3
May 13 23:46:48.450474 systemd[1]: Started cri-containerd-ccc165ff87a8ac44fa7435c42c01301d4691dcea96418b74b2f0fcd68ba284fe.scope - libcontainer container ccc165ff87a8ac44fa7435c42c01301d4691dcea96418b74b2f0fcd68ba284fe.
May 13 23:46:48.500709 containerd[1504]: time="2025-05-13T23:46:48.500662132Z" level=info msg="StartContainer for \"ccc165ff87a8ac44fa7435c42c01301d4691dcea96418b74b2f0fcd68ba284fe\" returns successfully"
May 13 23:46:48.711433 kubelet[2780]: I0513 23:46:48.711175 2780 scope.go:117] "RemoveContainer" containerID="e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023"
May 13 23:46:48.715663 containerd[1504]: time="2025-05-13T23:46:48.715211095Z" level=info msg="RemoveContainer for \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\""
May 13 23:46:48.719508 systemd[1]: Removed slice kubepods-besteffort-pod0aba1ede_c9a0_4e1e_accf_2cb417eef657.slice - libcontainer container kubepods-besteffort-pod0aba1ede_c9a0_4e1e_accf_2cb417eef657.slice.
May 13 23:46:48.720043 systemd[1]: kubepods-besteffort-pod0aba1ede_c9a0_4e1e_accf_2cb417eef657.slice: Consumed 1.110s CPU time, 41.5M memory peak, 1M read from disk.
May 13 23:46:48.730043 systemd[1]: run-netns-cni\x2d0e30959c\x2d1b40\x2d8eb0\x2dc36f\x2d373cd01bbcff.mount: Deactivated successfully.
May 13 23:46:48.732188 systemd[1]: var-lib-kubelet-pods-0aba1ede\x2dc9a0\x2d4e1e\x2daccf\x2d2cb417eef657-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d84mq7.mount: Deactivated successfully.
May 13 23:46:48.732479 systemd[1]: var-lib-kubelet-pods-0aba1ede\x2dc9a0\x2d4e1e\x2daccf\x2d2cb417eef657-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully.
May 13 23:46:48.751165 containerd[1504]: time="2025-05-13T23:46:48.750831796Z" level=info msg="RemoveContainer for \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" returns successfully"
May 13 23:46:48.752177 kubelet[2780]: I0513 23:46:48.752019 2780 scope.go:117] "RemoveContainer" containerID="e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023"
May 13 23:46:48.752469 containerd[1504]: time="2025-05-13T23:46:48.752378112Z" level=error msg="ContainerStatus for \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\": not found"
May 13 23:46:48.752625 kubelet[2780]: E0513 23:46:48.752543 2780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\": not found" containerID="e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023"
May 13 23:46:48.752722 kubelet[2780]: I0513 23:46:48.752573 2780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023"} err="failed to get container status \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\": rpc error: code = NotFound desc = an error occurred when try to find container \"e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023\": not found"
May 13 23:46:48.775197 kubelet[2780]: I0513 23:46:48.774808 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-85dd65f9fd-zbh5s" podStartSLOduration=1.77478541 podStartE2EDuration="1.77478541s" podCreationTimestamp="2025-05-13 23:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:46:48.770400382 +0000 UTC m=+67.588666142" watchObservedRunningTime="2025-05-13 23:46:48.77478541 +0000 UTC m=+67.593051090"
May 13 23:46:49.304052 kubelet[2780]: I0513 23:46:49.303115 2780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aba1ede-c9a0-4e1e-accf-2cb417eef657" path="/var/lib/kubelet/pods/0aba1ede-c9a0-4e1e-accf-2cb417eef657/volumes"
May 13 23:46:49.980082 containerd[1504]: time="2025-05-13T23:46:49.980001866Z" level=info msg="StopContainer for \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" with timeout 30 (s)"
May 13 23:46:49.981431 containerd[1504]: time="2025-05-13T23:46:49.980468264Z" level=info msg="Stop container \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" with signal terminated"
May 13 23:46:50.034404 containerd[1504]: time="2025-05-13T23:46:50.034194109Z" level=info msg="received exit event container_id:\"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" id:\"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" pid:4850 exit_status:1 exited_at:{seconds:1747180010 nanos:33461231}"
May 13 23:46:50.034740 containerd[1504]: time="2025-05-13T23:46:50.034713667Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" id:\"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" pid:4850 exit_status:1 exited_at:{seconds:1747180010 nanos:33461231}"
May 13 23:46:50.036183 systemd[1]: cri-containerd-17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e.scope: Deactivated successfully.
May 13 23:46:50.037789 systemd[1]: cri-containerd-17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e.scope: Consumed 1.670s CPU time, 53.7M memory peak, 4K read from disk.
May 13 23:46:50.043417 systemd-networkd[1392]: calif1c731aee63: Gained IPv6LL
May 13 23:46:50.097471 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e-rootfs.mount: Deactivated successfully.
May 13 23:46:50.106356 containerd[1504]: time="2025-05-13T23:46:50.106312258Z" level=info msg="StopContainer for \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" returns successfully"
May 13 23:46:50.109460 containerd[1504]: time="2025-05-13T23:46:50.109371849Z" level=info msg="StopPodSandbox for \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\""
May 13 23:46:50.109783 containerd[1504]: time="2025-05-13T23:46:50.109558688Z" level=info msg="Container to stop \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
May 13 23:46:50.125213 systemd[1]: cri-containerd-2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e.scope: Deactivated successfully.
May 13 23:46:50.129875 containerd[1504]: time="2025-05-13T23:46:50.129731509Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" id:\"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" pid:4068 exit_status:137 exited_at:{seconds:1747180010 nanos:125853001}"
May 13 23:46:50.183900 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e-rootfs.mount: Deactivated successfully.
May 13 23:46:50.186497 containerd[1504]: time="2025-05-13T23:46:50.184715709Z" level=info msg="shim disconnected" id=2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e namespace=k8s.io
May 13 23:46:50.186497 containerd[1504]: time="2025-05-13T23:46:50.184752789Z" level=warning msg="cleaning up after shim disconnected" id=2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e namespace=k8s.io
May 13 23:46:50.186497 containerd[1504]: time="2025-05-13T23:46:50.184783989Z" level=info msg="cleaning up dead shim" namespace=k8s.io
May 13 23:46:50.204880 containerd[1504]: time="2025-05-13T23:46:50.204638571Z" level=info msg="received exit event sandbox_id:\"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" exit_status:137 exited_at:{seconds:1747180010 nanos:125853001}"
May 13 23:46:50.211011 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e-shm.mount: Deactivated successfully.
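[Editor's note] The mount unit names in the entries above (kube\x2dapi\x2daccess\x2d84mq7, kubernetes.io\x7eprojected, and so on) are systemd's path escaping: '/' becomes '-', and bytes outside a small safe set become \xNN hex escapes. A minimal sketch of that mapping, covering only the cases visible in this log; the real systemd-escape algorithm has additional rules (leading dots, the root path, and more) that are omitted here:

```go
package main

import (
	"fmt"
	"strings"
)

// escapePath approximates `systemd-escape --path` for the unit names in this log.
func escapePath(p string) string {
	p = strings.Trim(p, "/")
	var b strings.Builder
	for i := 0; i < len(p); i++ {
		c := p[i]
		switch {
		case c == '/':
			b.WriteByte('-') // path separators become dashes
		case c >= 'a' && c <= 'z', c >= 'A' && c <= 'Z', c >= '0' && c <= '9', c == '_', c == '.', c == ':':
			b.WriteByte(c) // kept verbatim
		default:
			fmt.Fprintf(&b, `\x%02x`, c) // '-' -> \x2d, '~' -> \x7e
		}
	}
	return b.String()
}

func main() {
	// Reproduces the var-lib-kubelet-pods-...-kube\x2dapi\x2daccess\x2d84mq7.mount name above.
	fmt.Println(escapePath("/var/lib/kubelet/pods/0aba1ede-c9a0-4e1e-accf-2cb417eef657/volumes/kubernetes.io~projected/kube-api-access-84mq7"))
}
```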
May 13 23:46:50.282273 systemd-networkd[1392]: calib02b243a5f2: Link DOWN
May 13 23:46:50.283626 systemd-networkd[1392]: calib02b243a5f2: Lost carrier
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.278 [INFO][5414] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.278 [INFO][5414] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" iface="eth0" netns="/var/run/netns/cni-74563712-b137-75d5-d356-ac711c8238c1"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.279 [INFO][5414] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" iface="eth0" netns="/var/run/netns/cni-74563712-b137-75d5-d356-ac711c8238c1"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.290 [INFO][5414] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" after=11.583967ms iface="eth0" netns="/var/run/netns/cni-74563712-b137-75d5-d356-ac711c8238c1"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.290 [INFO][5414] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.290 [INFO][5414] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.325 [INFO][5422] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.325 [INFO][5422] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.325 [INFO][5422] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.420 [INFO][5422] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.423 [INFO][5422] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0"
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.429 [INFO][5422] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 23:46:50.433838 containerd[1504]: 2025-05-13 23:46:50.432 [INFO][5414] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e"
May 13 23:46:50.437689 containerd[1504]: time="2025-05-13T23:46:50.437032611Z" level=info msg="TearDown network for sandbox \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" successfully"
May 13 23:46:50.437689 containerd[1504]: time="2025-05-13T23:46:50.437145371Z" level=info msg="StopPodSandbox for \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" returns successfully"
May 13 23:46:50.440500 systemd[1]: run-netns-cni\x2d74563712\x2db137\x2d75d5\x2dd356\x2dac711c8238c1.mount: Deactivated successfully.
May 13 23:46:50.574062 containerd[1504]: time="2025-05-13T23:46:50.573919611Z" level=info msg="StopContainer for \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" with timeout 300 (s)"
May 13 23:46:50.575567 containerd[1504]: time="2025-05-13T23:46:50.575520926Z" level=info msg="Stop container \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" with signal terminated"
May 13 23:46:50.586109 kubelet[2780]: I0513 23:46:50.582666 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngb9x\" (UniqueName: \"kubernetes.io/projected/ac7eac49-3976-4e76-988d-1b85acd57174-kube-api-access-ngb9x\") pod \"ac7eac49-3976-4e76-988d-1b85acd57174\" (UID: \"ac7eac49-3976-4e76-988d-1b85acd57174\") "
May 13 23:46:50.586109 kubelet[2780]: I0513 23:46:50.584411 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ac7eac49-3976-4e76-988d-1b85acd57174-calico-apiserver-certs\") pod \"ac7eac49-3976-4e76-988d-1b85acd57174\" (UID: \"ac7eac49-3976-4e76-988d-1b85acd57174\") "
May 13 23:46:50.595656 systemd[1]: var-lib-kubelet-pods-ac7eac49\x2d3976\x2d4e76\x2d988d\x2d1b85acd57174-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dngb9x.mount: Deactivated successfully.
May 13 23:46:50.600460 kubelet[2780]: I0513 23:46:50.600396 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac7eac49-3976-4e76-988d-1b85acd57174-kube-api-access-ngb9x" (OuterVolumeSpecName: "kube-api-access-ngb9x") pod "ac7eac49-3976-4e76-988d-1b85acd57174" (UID: "ac7eac49-3976-4e76-988d-1b85acd57174"). InnerVolumeSpecName "kube-api-access-ngb9x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 13 23:46:50.600954 kubelet[2780]: I0513 23:46:50.600907 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac7eac49-3976-4e76-988d-1b85acd57174-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "ac7eac49-3976-4e76-988d-1b85acd57174" (UID: "ac7eac49-3976-4e76-988d-1b85acd57174"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 13 23:46:50.685670 kubelet[2780]: I0513 23:46:50.685620 2780 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngb9x\" (UniqueName: \"kubernetes.io/projected/ac7eac49-3976-4e76-988d-1b85acd57174-kube-api-access-ngb9x\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\""
May 13 23:46:50.685670 kubelet[2780]: I0513 23:46:50.685664 2780 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ac7eac49-3976-4e76-988d-1b85acd57174-calico-apiserver-certs\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\""
May 13 23:46:50.732736 kubelet[2780]: I0513 23:46:50.732699 2780 scope.go:117] "RemoveContainer" containerID="17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e"
May 13 23:46:50.738710 containerd[1504]: time="2025-05-13T23:46:50.738582850Z" level=info msg="RemoveContainer for \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\""
May 13 23:46:50.746550 containerd[1504]: time="2025-05-13T23:46:50.746450627Z" level=info msg="RemoveContainer for \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" returns successfully"
May 13 23:46:50.748164 systemd[1]: Removed slice kubepods-besteffort-podac7eac49_3976_4e76_988d_1b85acd57174.slice - libcontainer container kubepods-besteffort-podac7eac49_3976_4e76_988d_1b85acd57174.slice.
May 13 23:46:50.748355 systemd[1]: kubepods-besteffort-podac7eac49_3976_4e76_988d_1b85acd57174.slice: Consumed 1.689s CPU time, 54M memory peak, 4K read from disk.
May 13 23:46:50.748750 kubelet[2780]: I0513 23:46:50.748710 2780 scope.go:117] "RemoveContainer" containerID="17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e"
May 13 23:46:50.749345 kubelet[2780]: E0513 23:46:50.749311 2780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\": not found" containerID="17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e"
May 13 23:46:50.749422 containerd[1504]: time="2025-05-13T23:46:50.749038059Z" level=error msg="ContainerStatus for \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\": not found"
May 13 23:46:50.749456 kubelet[2780]: I0513 23:46:50.749344 2780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e"} err="failed to get container status \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\": rpc error: code = NotFound desc = an error occurred when try to find container \"17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e\": not found"
May 13 23:46:50.816356 containerd[1504]: time="2025-05-13T23:46:50.816308662Z" level=info msg="StopContainer for \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" with timeout 30 (s)"
May 13 23:46:50.816958 containerd[1504]: time="2025-05-13T23:46:50.816925621Z" level=info msg="Stop container \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" with signal terminated"
May 13 23:46:50.846197 systemd[1]: cri-containerd-0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3.scope: Deactivated successfully.
May 13 23:46:50.849556 containerd[1504]: time="2025-05-13T23:46:50.849501685Z" level=info msg="received exit event container_id:\"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" id:\"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" pid:4892 exit_status:2 exited_at:{seconds:1747180010 nanos:848564888}" May 13 23:46:50.849703 containerd[1504]: time="2025-05-13T23:46:50.849628445Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" id:\"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" pid:4892 exit_status:2 exited_at:{seconds:1747180010 nanos:848564888}" May 13 23:46:50.903086 containerd[1504]: time="2025-05-13T23:46:50.902842689Z" level=info msg="StopContainer for \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" returns successfully" May 13 23:46:50.903784 containerd[1504]: time="2025-05-13T23:46:50.903746807Z" level=info msg="StopPodSandbox for \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\"" May 13 23:46:50.903853 containerd[1504]: time="2025-05-13T23:46:50.903830007Z" level=info msg="Container to stop \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 23:46:50.918500 containerd[1504]: time="2025-05-13T23:46:50.918444084Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" id:\"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" pid:4270 exit_status:137 exited_at:{seconds:1747180010 nanos:917790206}" May 13 23:46:50.919240 systemd[1]: cri-containerd-e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9.scope: Deactivated successfully. 
May 13 23:46:50.968330 containerd[1504]: time="2025-05-13T23:46:50.968288338Z" level=info msg="shim disconnected" id=e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9 namespace=k8s.io May 13 23:46:50.968463 containerd[1504]: time="2025-05-13T23:46:50.968324418Z" level=warning msg="cleaning up after shim disconnected" id=e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9 namespace=k8s.io May 13 23:46:50.968463 containerd[1504]: time="2025-05-13T23:46:50.968357498Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 13 23:46:51.001856 containerd[1504]: time="2025-05-13T23:46:51.001639481Z" level=info msg="received exit event sandbox_id:\"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" exit_status:137 exited_at:{seconds:1747180010 nanos:917790206}" May 13 23:46:51.001856 containerd[1504]: time="2025-05-13T23:46:51.001737840Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" id:\"ebad20e720ab76f62c2e74b58cfeedcc2f7acb97ec1979785716a361c901a26b\" pid:5457 exited_at:{seconds:1747180010 nanos:999918766}" May 13 23:46:51.007785 containerd[1504]: time="2025-05-13T23:46:51.007381583Z" level=info msg="StopContainer for \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" with timeout 5 (s)" May 13 23:46:51.007923 containerd[1504]: time="2025-05-13T23:46:51.007814742Z" level=info msg="Stop container \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" with signal terminated" May 13 23:46:51.071959 systemd[1]: cri-containerd-e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3.scope: Deactivated successfully. May 13 23:46:51.072435 systemd[1]: cri-containerd-e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3.scope: Consumed 3.049s CPU time, 161.1M memory peak, 4K read from disk, 1.4M written to disk. May 13 23:46:51.081522 containerd[1504]: time="2025-05-13T23:46:51.080948683Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" id:\"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" pid:3771 exited_at:{seconds:1747180011 nanos:79839887}" May 13 23:46:51.081522 containerd[1504]: time="2025-05-13T23:46:51.081047963Z" level=info msg="received exit event container_id:\"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" id:\"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" pid:3771 exited_at:{seconds:1747180011 nanos:79839887}" May 13 23:46:51.096880 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3-rootfs.mount: Deactivated successfully. May 13 23:46:51.098114 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9-rootfs.mount: Deactivated successfully. May 13 23:46:51.098227 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9-shm.mount: Deactivated successfully. May 13 23:46:51.098308 systemd[1]: var-lib-kubelet-pods-ac7eac49\x2d3976\x2d4e76\x2d988d\x2d1b85acd57174-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. May 13 23:46:51.147812 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3-rootfs.mount: Deactivated successfully. 
May 13 23:46:51.173307 containerd[1504]: time="2025-05-13T23:46:51.173214888Z" level=info msg="StopContainer for \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" returns successfully" May 13 23:46:51.174027 containerd[1504]: time="2025-05-13T23:46:51.173988045Z" level=info msg="StopPodSandbox for \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\"" May 13 23:46:51.174114 containerd[1504]: time="2025-05-13T23:46:51.174060485Z" level=info msg="Container to stop \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 23:46:51.174114 containerd[1504]: time="2025-05-13T23:46:51.174090805Z" level=info msg="Container to stop \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 23:46:51.174114 containerd[1504]: time="2025-05-13T23:46:51.174100565Z" level=info msg="Container to stop \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 23:46:51.177862 systemd-networkd[1392]: cali80f936cec26: Link DOWN May 13 23:46:51.177872 systemd-networkd[1392]: cali80f936cec26: Lost carrier May 13 23:46:51.204563 systemd[1]: cri-containerd-1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca.scope: Deactivated successfully. May 13 23:46:51.205902 containerd[1504]: time="2025-05-13T23:46:51.205086512Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" id:\"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" pid:3305 exit_status:137 exited_at:{seconds:1747180011 nanos:204533514}" May 13 23:46:51.267005 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca-rootfs.mount: Deactivated successfully. 
May 13 23:46:51.272422 containerd[1504]: time="2025-05-13T23:46:51.272176072Z" level=info msg="shim disconnected" id=1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca namespace=k8s.io May 13 23:46:51.272794 containerd[1504]: time="2025-05-13T23:46:51.272214272Z" level=warning msg="cleaning up after shim disconnected" id=1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca namespace=k8s.io May 13 23:46:51.272794 containerd[1504]: time="2025-05-13T23:46:51.272742430Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 13 23:46:51.302697 kubelet[2780]: I0513 23:46:51.302385 2780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac7eac49-3976-4e76-988d-1b85acd57174" path="/var/lib/kubelet/pods/ac7eac49-3976-4e76-988d-1b85acd57174/volumes" May 13 23:46:51.312340 containerd[1504]: time="2025-05-13T23:46:51.311691834Z" level=info msg="received exit event sandbox_id:\"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" exit_status:137 exited_at:{seconds:1747180011 nanos:204533514}" May 13 23:46:51.315716 containerd[1504]: time="2025-05-13T23:46:51.315210823Z" level=info msg="TearDown network for sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" successfully" May 13 23:46:51.315716 containerd[1504]: time="2025-05-13T23:46:51.315301383Z" level=info msg="StopPodSandbox for \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" returns successfully" May 13 23:46:51.315771 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca-shm.mount: Deactivated successfully. May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.172 [INFO][5546] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.175 [INFO][5546] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" iface="eth0" netns="/var/run/netns/cni-1e8f6f27-d744-9216-5da9-dc612ab1ab94" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.177 [INFO][5546] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" iface="eth0" netns="/var/run/netns/cni-1e8f6f27-d744-9216-5da9-dc612ab1ab94" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.196 [INFO][5546] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" after=19.707461ms iface="eth0" netns="/var/run/netns/cni-1e8f6f27-d744-9216-5da9-dc612ab1ab94" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.197 [INFO][5546] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.197 [INFO][5546] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.251 [INFO][5574] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.252 [INFO][5574] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.252 [INFO][5574] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.394 [INFO][5574] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.394 [INFO][5574] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.401 [INFO][5574] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:46:51.407234 containerd[1504]: 2025-05-13 23:46:51.404 [INFO][5546] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:46:51.410640 containerd[1504]: time="2025-05-13T23:46:51.407919786Z" level=info msg="TearDown network for sandbox \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" successfully" May 13 23:46:51.410640 containerd[1504]: time="2025-05-13T23:46:51.407951546Z" level=info msg="StopPodSandbox for \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" returns successfully" May 13 23:46:51.412231 systemd[1]: run-netns-cni\x2d1e8f6f27\x2dd744\x2d9216\x2d5da9\x2ddc612ab1ab94.mount: Deactivated successfully. 
May 13 23:46:51.419126 kubelet[2780]: I0513 23:46:51.418521 2780 memory_manager.go:355] "RemoveStaleState removing state" podUID="ac7eac49-3976-4e76-988d-1b85acd57174" containerName="calico-apiserver" May 13 23:46:51.419126 kubelet[2780]: I0513 23:46:51.418570 2780 memory_manager.go:355] "RemoveStaleState removing state" podUID="0aba1ede-c9a0-4e1e-accf-2cb417eef657" containerName="calico-apiserver" May 13 23:46:51.419126 kubelet[2780]: I0513 23:46:51.418577 2780 memory_manager.go:355] "RemoveStaleState removing state" podUID="61ba2088-39e0-4223-b95b-586fe99f906e" containerName="calico-node" May 13 23:46:51.431981 systemd[1]: Created slice kubepods-besteffort-pod77347435_1e1d_4da2_ae23_fef51df27783.slice - libcontainer container kubepods-besteffort-pod77347435_1e1d_4da2_ae23_fef51df27783.slice. May 13 23:46:51.495483 kubelet[2780]: I0513 23:46:51.494176 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-policysync\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495483 kubelet[2780]: I0513 23:46:51.494233 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-flexvol-driver-host\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495483 kubelet[2780]: I0513 23:46:51.494313 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/61ba2088-39e0-4223-b95b-586fe99f906e-node-certs\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495483 kubelet[2780]: I0513 23:46:51.494310 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-policysync" (OuterVolumeSpecName: "policysync") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "policysync". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.495483 kubelet[2780]: I0513 23:46:51.494334 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-bin-dir\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495483 kubelet[2780]: I0513 23:46:51.494361 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-var-lib-calico\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495818 kubelet[2780]: I0513 23:46:51.494389 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-log-dir\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495818 kubelet[2780]: I0513 23:46:51.494410 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-lib-modules\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495818 kubelet[2780]: I0513 23:46:51.494435 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-net-dir\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495818 kubelet[2780]: I0513 23:46:51.494454 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-var-run-calico\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495818 kubelet[2780]: I0513 23:46:51.494473 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-xtables-lock\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.495818 kubelet[2780]: I0513 23:46:51.494500 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clm64\" (UniqueName: \"kubernetes.io/projected/61ba2088-39e0-4223-b95b-586fe99f906e-kube-api-access-clm64\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.496040 kubelet[2780]: I0513 23:46:51.494524 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ba2088-39e0-4223-b95b-586fe99f906e-tigera-ca-bundle\") pod \"61ba2088-39e0-4223-b95b-586fe99f906e\" (UID: \"61ba2088-39e0-4223-b95b-586fe99f906e\") " May 13 23:46:51.496040 kubelet[2780]: I0513 23:46:51.494589 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dg5m\" (UniqueName: \"kubernetes.io/projected/77347435-1e1d-4da2-ae23-fef51df27783-kube-api-access-7dg5m\") pod \"calico-node-7h55p\" (UID: 
\"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496040 kubelet[2780]: I0513 23:46:51.494618 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-cni-bin-dir\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496040 kubelet[2780]: I0513 23:46:51.494646 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-var-run-calico\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496040 kubelet[2780]: I0513 23:46:51.494671 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-policysync\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496217 kubelet[2780]: I0513 23:46:51.494718 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-cni-log-dir\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496217 kubelet[2780]: I0513 23:46:51.494742 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-xtables-lock\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496217 kubelet[2780]: I0513 23:46:51.494765 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-var-lib-calico\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496217 kubelet[2780]: I0513 23:46:51.494791 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/77347435-1e1d-4da2-ae23-fef51df27783-node-certs\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496217 kubelet[2780]: I0513 23:46:51.494818 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-cni-net-dir\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496494 kubelet[2780]: I0513 23:46:51.494843 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-lib-modules\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496494 
kubelet[2780]: I0513 23:46:51.494866 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77347435-1e1d-4da2-ae23-fef51df27783-tigera-ca-bundle\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496494 kubelet[2780]: I0513 23:46:51.494887 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/77347435-1e1d-4da2-ae23-fef51df27783-flexvol-driver-host\") pod \"calico-node-7h55p\" (UID: \"77347435-1e1d-4da2-ae23-fef51df27783\") " pod="calico-system/calico-node-7h55p" May 13 23:46:51.496494 kubelet[2780]: I0513 23:46:51.494911 2780 reconciler_common.go:299] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-policysync\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.496494 kubelet[2780]: I0513 23:46:51.494983 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.496705 kubelet[2780]: I0513 23:46:51.495014 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.496705 kubelet[2780]: I0513 23:46:51.495040 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.496705 kubelet[2780]: I0513 23:46:51.495095 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.496705 kubelet[2780]: I0513 23:46:51.495127 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.496705 kubelet[2780]: I0513 23:46:51.495154 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). 
InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.497661 kubelet[2780]: I0513 23:46:51.495178 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.498340 kubelet[2780]: I0513 23:46:51.498276 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGIDValue "" May 13 23:46:51.503257 kubelet[2780]: I0513 23:46:51.502110 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ba2088-39e0-4223-b95b-586fe99f906e-node-certs" (OuterVolumeSpecName: "node-certs") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 13 23:46:51.503912 kubelet[2780]: I0513 23:46:51.503881 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ba2088-39e0-4223-b95b-586fe99f906e-kube-api-access-clm64" (OuterVolumeSpecName: "kube-api-access-clm64") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "kube-api-access-clm64". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 13 23:46:51.507128 kubelet[2780]: I0513 23:46:51.506724 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ba2088-39e0-4223-b95b-586fe99f906e-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "61ba2088-39e0-4223-b95b-586fe99f906e" (UID: "61ba2088-39e0-4223-b95b-586fe99f906e"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 13 23:46:51.596159 kubelet[2780]: I0513 23:46:51.595801 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sz8c\" (UniqueName: \"kubernetes.io/projected/bb68557b-1129-460d-9d3e-e3f0bf7e8587-kube-api-access-8sz8c\") pod \"bb68557b-1129-460d-9d3e-e3f0bf7e8587\" (UID: \"bb68557b-1129-460d-9d3e-e3f0bf7e8587\") " May 13 23:46:51.596159 kubelet[2780]: I0513 23:46:51.595858 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb68557b-1129-460d-9d3e-e3f0bf7e8587-tigera-ca-bundle\") pod \"bb68557b-1129-460d-9d3e-e3f0bf7e8587\" (UID: \"bb68557b-1129-460d-9d3e-e3f0bf7e8587\") " May 13 23:46:51.596159 kubelet[2780]: I0513 23:46:51.596047 2780 reconciler_common.go:299] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-flexvol-driver-host\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.596159 kubelet[2780]: I0513 23:46:51.596059 2780 reconciler_common.go:299] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/61ba2088-39e0-4223-b95b-586fe99f906e-node-certs\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.596159 kubelet[2780]: I0513 23:46:51.596083 2780 reconciler_common.go:299] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-log-dir\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.596159 kubelet[2780]: I0513 23:46:51.596095 2780 reconciler_common.go:299] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-lib-modules\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.596159 kubelet[2780]: I0513 23:46:51.596103 2780 reconciler_common.go:299] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-net-dir\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.598142 kubelet[2780]: I0513 23:46:51.596111 2780 reconciler_common.go:299] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-var-run-calico\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.598142 kubelet[2780]: I0513 23:46:51.596119 2780 reconciler_common.go:299] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-cni-bin-dir\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.598142 kubelet[2780]: I0513 23:46:51.596129 2780 reconciler_common.go:299] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-var-lib-calico\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.598142 kubelet[2780]: I0513 23:46:51.596138 2780 reconciler_common.go:299] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/61ba2088-39e0-4223-b95b-586fe99f906e-xtables-lock\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.598142 kubelet[2780]: I0513 23:46:51.596152 2780 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clm64\" (UniqueName: 
\"kubernetes.io/projected/61ba2088-39e0-4223-b95b-586fe99f906e-kube-api-access-clm64\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.598142 kubelet[2780]: I0513 23:46:51.596161 2780 reconciler_common.go:299] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ba2088-39e0-4223-b95b-586fe99f906e-tigera-ca-bundle\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.601453 kubelet[2780]: I0513 23:46:51.601361 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb68557b-1129-460d-9d3e-e3f0bf7e8587-kube-api-access-8sz8c" (OuterVolumeSpecName: "kube-api-access-8sz8c") pod "bb68557b-1129-460d-9d3e-e3f0bf7e8587" (UID: "bb68557b-1129-460d-9d3e-e3f0bf7e8587"). InnerVolumeSpecName "kube-api-access-8sz8c". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 13 23:46:51.612120 kubelet[2780]: I0513 23:46:51.611666 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb68557b-1129-460d-9d3e-e3f0bf7e8587-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "bb68557b-1129-460d-9d3e-e3f0bf7e8587" (UID: "bb68557b-1129-460d-9d3e-e3f0bf7e8587"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 13 23:46:51.696791 kubelet[2780]: I0513 23:46:51.696646 2780 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8sz8c\" (UniqueName: \"kubernetes.io/projected/bb68557b-1129-460d-9d3e-e3f0bf7e8587-kube-api-access-8sz8c\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.696791 kubelet[2780]: I0513 23:46:51.696687 2780 reconciler_common.go:299] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb68557b-1129-460d-9d3e-e3f0bf7e8587-tigera-ca-bundle\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:51.740062 containerd[1504]: time="2025-05-13T23:46:51.739856153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7h55p,Uid:77347435-1e1d-4da2-ae23-fef51df27783,Namespace:calico-system,Attempt:0,}" May 13 23:46:51.745930 kubelet[2780]: I0513 23:46:51.745715 2780 scope.go:117] "RemoveContainer" containerID="e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3" May 13 23:46:51.754840 systemd[1]: Removed slice kubepods-besteffort-pod61ba2088_39e0_4223_b95b_586fe99f906e.slice - libcontainer container kubepods-besteffort-pod61ba2088_39e0_4223_b95b_586fe99f906e.slice. May 13 23:46:51.754941 systemd[1]: kubepods-besteffort-pod61ba2088_39e0_4223_b95b_586fe99f906e.slice: Consumed 3.571s CPU time, 283.8M memory peak, 4K read from disk, 158M written to disk. May 13 23:46:51.766237 systemd[1]: Removed slice kubepods-besteffort-podbb68557b_1129_460d_9d3e_e3f0bf7e8587.slice - libcontainer container kubepods-besteffort-podbb68557b_1129_460d_9d3e_e3f0bf7e8587.slice. 
May 13 23:46:51.769569 containerd[1504]: time="2025-05-13T23:46:51.769524225Z" level=info msg="RemoveContainer for \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\"" May 13 23:46:51.784434 containerd[1504]: time="2025-05-13T23:46:51.784387420Z" level=info msg="RemoveContainer for \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" returns successfully" May 13 23:46:51.786183 kubelet[2780]: I0513 23:46:51.785647 2780 scope.go:117] "RemoveContainer" containerID="bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00" May 13 23:46:51.789598 containerd[1504]: time="2025-05-13T23:46:51.789259086Z" level=info msg="connecting to shim 6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc" address="unix:///run/containerd/s/d4e6d13fff3f47ff4bd0f5a2cea08f0b4efb251be96f02d066f67eb1cf1d6b51" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:51.791216 containerd[1504]: time="2025-05-13T23:46:51.791146840Z" level=info msg="RemoveContainer for \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\"" May 13 23:46:51.807549 containerd[1504]: time="2025-05-13T23:46:51.807407951Z" level=info msg="RemoveContainer for \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\" returns successfully" May 13 23:46:51.808188 kubelet[2780]: I0513 23:46:51.807923 2780 scope.go:117] "RemoveContainer" containerID="a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2" May 13 23:46:51.822032 containerd[1504]: time="2025-05-13T23:46:51.821347990Z" level=info msg="RemoveContainer for \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\"" May 13 23:46:51.832526 containerd[1504]: time="2025-05-13T23:46:51.832386237Z" level=info msg="RemoveContainer for \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\" returns successfully" May 13 23:46:51.835721 kubelet[2780]: I0513 23:46:51.835678 2780 scope.go:117] "RemoveContainer" containerID="e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3" May 13 23:46:51.836179 containerd[1504]: time="2025-05-13T23:46:51.836088666Z" level=error msg="ContainerStatus for \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\": not found" May 13 23:46:51.836445 kubelet[2780]: E0513 23:46:51.836419 2780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\": not found" containerID="e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3" May 13 23:46:51.836502 kubelet[2780]: I0513 23:46:51.836449 2780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3"} err="failed to get container status \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\": rpc error: code = NotFound desc = an error occurred when try to find container \"e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3\": not found" May 13 23:46:51.836502 kubelet[2780]: I0513 23:46:51.836474 2780 scope.go:117] "RemoveContainer" containerID="bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00" May 13 23:46:51.836727 containerd[1504]: time="2025-05-13T23:46:51.836695344Z" level=error msg="ContainerStatus for 
\"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\": not found" May 13 23:46:51.836929 kubelet[2780]: E0513 23:46:51.836901 2780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\": not found" containerID="bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00" May 13 23:46:51.837137 kubelet[2780]: I0513 23:46:51.836928 2780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00"} err="failed to get container status \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\": rpc error: code = NotFound desc = an error occurred when try to find container \"bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00\": not found" May 13 23:46:51.837137 kubelet[2780]: I0513 23:46:51.836941 2780 scope.go:117] "RemoveContainer" containerID="a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2" May 13 23:46:51.837215 containerd[1504]: time="2025-05-13T23:46:51.837087703Z" level=error msg="ContainerStatus for \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\": not found" May 13 23:46:51.839302 kubelet[2780]: E0513 23:46:51.837444 2780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\": not found" containerID="a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2" May 13 23:46:51.840722 kubelet[2780]: I0513 23:46:51.840665 2780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2"} err="failed to get container status \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\": rpc error: code = NotFound desc = an error occurred when try to find container \"a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2\": not found" May 13 23:46:51.841032 kubelet[2780]: I0513 23:46:51.840926 2780 scope.go:117] "RemoveContainer" containerID="0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3" May 13 23:46:51.855782 containerd[1504]: time="2025-05-13T23:46:51.855603087Z" level=info msg="RemoveContainer for \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\"" May 13 23:46:51.868426 containerd[1504]: time="2025-05-13T23:46:51.868388049Z" level=info msg="RemoveContainer for \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" returns successfully" May 13 23:46:51.872522 kubelet[2780]: I0513 23:46:51.872484 2780 scope.go:117] "RemoveContainer" containerID="0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3" May 13 23:46:51.873204 containerd[1504]: time="2025-05-13T23:46:51.873026275Z" level=error msg="ContainerStatus for \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container 
\"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\": not found" May 13 23:46:51.873535 kubelet[2780]: E0513 23:46:51.873478 2780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\": not found" containerID="0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3" May 13 23:46:51.873535 kubelet[2780]: I0513 23:46:51.873509 2780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3"} err="failed to get container status \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\": rpc error: code = NotFound desc = an error occurred when try to find container \"0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3\": not found" May 13 23:46:51.884303 systemd[1]: Started cri-containerd-6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc.scope - libcontainer container 6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc. May 13 23:46:51.958444 systemd[1]: cri-containerd-b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41.scope: Deactivated successfully. May 13 23:46:51.966287 containerd[1504]: time="2025-05-13T23:46:51.965487559Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" id:\"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" pid:3341 exit_status:1 exited_at:{seconds:1747180011 nanos:964677641}" May 13 23:46:51.966287 containerd[1504]: time="2025-05-13T23:46:51.965566198Z" level=info msg="received exit event container_id:\"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" id:\"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" pid:3341 exit_status:1 exited_at:{seconds:1747180011 nanos:964677641}" May 13 23:46:52.038733 containerd[1504]: time="2025-05-13T23:46:52.038452618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7h55p,Uid:77347435-1e1d-4da2-ae23-fef51df27783,Namespace:calico-system,Attempt:0,} returns sandbox id \"6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc\"" May 13 23:46:52.040454 containerd[1504]: time="2025-05-13T23:46:52.039617294Z" level=info msg="StopContainer for \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" returns successfully" May 13 23:46:52.040800 containerd[1504]: time="2025-05-13T23:46:52.040720211Z" level=info msg="StopPodSandbox for \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\"" May 13 23:46:52.040800 containerd[1504]: time="2025-05-13T23:46:52.040791651Z" level=info msg="Container to stop \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 23:46:52.047315 containerd[1504]: time="2025-05-13T23:46:52.045909675Z" level=info msg="CreateContainer within sandbox \"6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 23:46:52.056347 systemd[1]: cri-containerd-496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653.scope: Deactivated successfully. 
May 13 23:46:52.064026 containerd[1504]: time="2025-05-13T23:46:52.063871900Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" id:\"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" pid:3222 exit_status:137 exited_at:{seconds:1747180012 nanos:62499945}" May 13 23:46:52.071296 containerd[1504]: time="2025-05-13T23:46:52.071236518Z" level=info msg="Container d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:52.089625 containerd[1504]: time="2025-05-13T23:46:52.089490782Z" level=info msg="CreateContainer within sandbox \"6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454\"" May 13 23:46:52.092730 containerd[1504]: time="2025-05-13T23:46:52.092690892Z" level=info msg="StartContainer for \"d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454\"" May 13 23:46:52.101684 systemd[1]: var-lib-kubelet-pods-bb68557b\x2d1129\x2d460d\x2d9d3e\x2de3f0bf7e8587-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. May 13 23:46:52.102148 systemd[1]: var-lib-kubelet-pods-61ba2088\x2d39e0\x2d4223\x2db95b\x2d586fe99f906e-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. May 13 23:46:52.102289 systemd[1]: var-lib-kubelet-pods-bb68557b\x2d1129\x2d460d\x2d9d3e\x2de3f0bf7e8587-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8sz8c.mount: Deactivated successfully. May 13 23:46:52.102354 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41-rootfs.mount: Deactivated successfully. May 13 23:46:52.102407 systemd[1]: var-lib-kubelet-pods-61ba2088\x2d39e0\x2d4223\x2db95b\x2d586fe99f906e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dclm64.mount: Deactivated successfully. May 13 23:46:52.102456 systemd[1]: var-lib-kubelet-pods-61ba2088\x2d39e0\x2d4223\x2db95b\x2d586fe99f906e-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. May 13 23:46:52.106650 containerd[1504]: time="2025-05-13T23:46:52.105780492Z" level=info msg="connecting to shim d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454" address="unix:///run/containerd/s/d4e6d13fff3f47ff4bd0f5a2cea08f0b4efb251be96f02d066f67eb1cf1d6b51" protocol=ttrpc version=3 May 13 23:46:52.156299 systemd[1]: Started cri-containerd-d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454.scope - libcontainer container d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454. May 13 23:46:52.160036 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653-rootfs.mount: Deactivated successfully. 
May 13 23:46:52.166118 containerd[1504]: time="2025-05-13T23:46:52.165257831Z" level=info msg="shim disconnected" id=496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653 namespace=k8s.io May 13 23:46:52.166118 containerd[1504]: time="2025-05-13T23:46:52.165305190Z" level=warning msg="cleaning up after shim disconnected" id=496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653 namespace=k8s.io May 13 23:46:52.166118 containerd[1504]: time="2025-05-13T23:46:52.165340750Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 13 23:46:52.223366 containerd[1504]: time="2025-05-13T23:46:52.223160734Z" level=info msg="received exit event sandbox_id:\"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" exit_status:137 exited_at:{seconds:1747180012 nanos:62499945}" May 13 23:46:52.224861 containerd[1504]: time="2025-05-13T23:46:52.224832049Z" level=info msg="TaskExit event in podsandbox handler exit_status:137 exited_at:{seconds:1747180010 nanos:125853001}" May 13 23:46:52.231981 containerd[1504]: time="2025-05-13T23:46:52.231945347Z" level=info msg="StartContainer for \"d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454\" returns successfully" May 13 23:46:52.232954 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653-shm.mount: Deactivated successfully. May 13 23:46:52.233583 containerd[1504]: time="2025-05-13T23:46:52.233451382Z" level=info msg="TearDown network for sandbox \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" successfully" May 13 23:46:52.233583 containerd[1504]: time="2025-05-13T23:46:52.233511422Z" level=info msg="StopPodSandbox for \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" returns successfully" May 13 23:46:52.261653 systemd[1]: cri-containerd-d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454.scope: Deactivated successfully. May 13 23:46:52.262294 systemd[1]: cri-containerd-d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454.scope: Consumed 34ms CPU time, 7.8M memory peak, 12K read from disk, 6.2M written to disk. May 13 23:46:52.270739 containerd[1504]: time="2025-05-13T23:46:52.270570429Z" level=info msg="received exit event container_id:\"d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454\" id:\"d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454\" pid:5716 exited_at:{seconds:1747180012 nanos:269726232}" May 13 23:46:52.270739 containerd[1504]: time="2025-05-13T23:46:52.270705509Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454\" id:\"d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454\" pid:5716 exited_at:{seconds:1747180012 nanos:269726232}" May 13 23:46:52.312291 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d61f2ac14e9f138d17c89b00b452ac086ff6b79dd30c9b1d6b425d42099ea454-rootfs.mount: Deactivated successfully. 
May 13 23:46:52.406436 kubelet[2780]: I0513 23:46:52.406404 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d078a62c-ac98-4918-b0ab-53fa4ca1a484-typha-certs\") pod \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\" (UID: \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\") " May 13 23:46:52.407895 kubelet[2780]: I0513 23:46:52.407868 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d078a62c-ac98-4918-b0ab-53fa4ca1a484-tigera-ca-bundle\") pod \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\" (UID: \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\") " May 13 23:46:52.408124 kubelet[2780]: I0513 23:46:52.407996 2780 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dtrj\" (UniqueName: \"kubernetes.io/projected/d078a62c-ac98-4918-b0ab-53fa4ca1a484-kube-api-access-8dtrj\") pod \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\" (UID: \"d078a62c-ac98-4918-b0ab-53fa4ca1a484\") " May 13 23:46:52.412064 kubelet[2780]: I0513 23:46:52.411673 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d078a62c-ac98-4918-b0ab-53fa4ca1a484-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "d078a62c-ac98-4918-b0ab-53fa4ca1a484" (UID: "d078a62c-ac98-4918-b0ab-53fa4ca1a484"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 13 23:46:52.415038 kubelet[2780]: I0513 23:46:52.414624 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d078a62c-ac98-4918-b0ab-53fa4ca1a484-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "d078a62c-ac98-4918-b0ab-53fa4ca1a484" (UID: "d078a62c-ac98-4918-b0ab-53fa4ca1a484"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 13 23:46:52.415443 kubelet[2780]: I0513 23:46:52.415416 2780 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d078a62c-ac98-4918-b0ab-53fa4ca1a484-kube-api-access-8dtrj" (OuterVolumeSpecName: "kube-api-access-8dtrj") pod "d078a62c-ac98-4918-b0ab-53fa4ca1a484" (UID: "d078a62c-ac98-4918-b0ab-53fa4ca1a484"). InnerVolumeSpecName "kube-api-access-8dtrj". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 13 23:46:52.424512 kubelet[2780]: I0513 23:46:52.424454 2780 memory_manager.go:355] "RemoveStaleState removing state" podUID="d078a62c-ac98-4918-b0ab-53fa4ca1a484" containerName="calico-typha" May 13 23:46:52.424512 kubelet[2780]: I0513 23:46:52.424499 2780 memory_manager.go:355] "RemoveStaleState removing state" podUID="bb68557b-1129-460d-9d3e-e3f0bf7e8587" containerName="calico-kube-controllers" May 13 23:46:52.438376 systemd[1]: Created slice kubepods-besteffort-podc602bde4_e7ee_4f94_b5d7_87a908c6ee05.slice - libcontainer container kubepods-besteffort-podc602bde4_e7ee_4f94_b5d7_87a908c6ee05.slice. 
May 13 23:46:52.510112 kubelet[2780]: I0513 23:46:52.509586 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgggc\" (UniqueName: \"kubernetes.io/projected/c602bde4-e7ee-4f94-b5d7-87a908c6ee05-kube-api-access-xgggc\") pod \"calico-typha-86966df679-d2dkb\" (UID: \"c602bde4-e7ee-4f94-b5d7-87a908c6ee05\") " pod="calico-system/calico-typha-86966df679-d2dkb" May 13 23:46:52.510112 kubelet[2780]: I0513 23:46:52.509640 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c602bde4-e7ee-4f94-b5d7-87a908c6ee05-tigera-ca-bundle\") pod \"calico-typha-86966df679-d2dkb\" (UID: \"c602bde4-e7ee-4f94-b5d7-87a908c6ee05\") " pod="calico-system/calico-typha-86966df679-d2dkb" May 13 23:46:52.510112 kubelet[2780]: I0513 23:46:52.509660 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c602bde4-e7ee-4f94-b5d7-87a908c6ee05-typha-certs\") pod \"calico-typha-86966df679-d2dkb\" (UID: \"c602bde4-e7ee-4f94-b5d7-87a908c6ee05\") " pod="calico-system/calico-typha-86966df679-d2dkb" May 13 23:46:52.510112 kubelet[2780]: I0513 23:46:52.509695 2780 reconciler_common.go:299] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d078a62c-ac98-4918-b0ab-53fa4ca1a484-typha-certs\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:52.510515 kubelet[2780]: I0513 23:46:52.510457 2780 reconciler_common.go:299] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d078a62c-ac98-4918-b0ab-53fa4ca1a484-tigera-ca-bundle\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:52.510515 kubelet[2780]: I0513 23:46:52.510483 2780 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dtrj\" (UniqueName: \"kubernetes.io/projected/d078a62c-ac98-4918-b0ab-53fa4ca1a484-kube-api-access-8dtrj\") on node \"ci-4284-0-0-n-732e99817a\" DevicePath \"\"" May 13 23:46:52.745775 containerd[1504]: time="2025-05-13T23:46:52.745299259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-86966df679-d2dkb,Uid:c602bde4-e7ee-4f94-b5d7-87a908c6ee05,Namespace:calico-system,Attempt:0,}" May 13 23:46:52.767264 containerd[1504]: time="2025-05-13T23:46:52.766775473Z" level=info msg="connecting to shim 5a2103ce593adea61d1c72d4ba4d2367c602727614cf2839e74510c06673e5a7" address="unix:///run/containerd/s/4385df206445a1c8805303d0cc0a6a1be67ac28e178fa196b21e8150e3a320e8" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:52.784375 kubelet[2780]: I0513 23:46:52.784335 2780 scope.go:117] "RemoveContainer" containerID="b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41" May 13 23:46:52.792894 systemd[1]: Started cri-containerd-5a2103ce593adea61d1c72d4ba4d2367c602727614cf2839e74510c06673e5a7.scope - libcontainer container 5a2103ce593adea61d1c72d4ba4d2367c602727614cf2839e74510c06673e5a7. May 13 23:46:52.795852 systemd[1]: Removed slice kubepods-besteffort-podd078a62c_ac98_4918_b0ab_53fa4ca1a484.slice - libcontainer container kubepods-besteffort-podd078a62c_ac98_4918_b0ab_53fa4ca1a484.slice. 
May 13 23:46:52.807998 containerd[1504]: time="2025-05-13T23:46:52.807943707Z" level=info msg="RemoveContainer for \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\"" May 13 23:46:52.822722 containerd[1504]: time="2025-05-13T23:46:52.821769705Z" level=info msg="RemoveContainer for \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" returns successfully" May 13 23:46:52.823966 kubelet[2780]: I0513 23:46:52.823939 2780 scope.go:117] "RemoveContainer" containerID="b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41" May 13 23:46:52.827160 containerd[1504]: time="2025-05-13T23:46:52.826619090Z" level=error msg="ContainerStatus for \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\": not found" May 13 23:46:52.827160 containerd[1504]: time="2025-05-13T23:46:52.826860970Z" level=info msg="CreateContainer within sandbox \"6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 23:46:52.829582 kubelet[2780]: E0513 23:46:52.829503 2780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\": not found" containerID="b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41" May 13 23:46:52.829582 kubelet[2780]: I0513 23:46:52.829542 2780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41"} err="failed to get container status \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\": rpc error: code = NotFound desc = an error occurred when try to find container \"b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41\": not found" May 13 23:46:52.840655 containerd[1504]: time="2025-05-13T23:46:52.840453648Z" level=info msg="Container f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:52.856505 containerd[1504]: time="2025-05-13T23:46:52.856355719Z" level=info msg="CreateContainer within sandbox \"6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c\"" May 13 23:46:52.857537 containerd[1504]: time="2025-05-13T23:46:52.857443396Z" level=info msg="StartContainer for \"f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c\"" May 13 23:46:52.860331 containerd[1504]: time="2025-05-13T23:46:52.860288827Z" level=info msg="connecting to shim f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c" address="unix:///run/containerd/s/d4e6d13fff3f47ff4bd0f5a2cea08f0b4efb251be96f02d066f67eb1cf1d6b51" protocol=ttrpc version=3 May 13 23:46:52.891485 containerd[1504]: time="2025-05-13T23:46:52.891120293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-86966df679-d2dkb,Uid:c602bde4-e7ee-4f94-b5d7-87a908c6ee05,Namespace:calico-system,Attempt:0,} returns sandbox id \"5a2103ce593adea61d1c72d4ba4d2367c602727614cf2839e74510c06673e5a7\"" May 13 23:46:52.901351 systemd[1]: Started cri-containerd-f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c.scope 
- libcontainer container f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c. May 13 23:46:52.910746 containerd[1504]: time="2025-05-13T23:46:52.910705193Z" level=info msg="CreateContainer within sandbox \"5a2103ce593adea61d1c72d4ba4d2367c602727614cf2839e74510c06673e5a7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 23:46:52.935493 containerd[1504]: time="2025-05-13T23:46:52.935441358Z" level=info msg="Container 79906ebd27ef2930d3b21628be40ab5d3ebbf574199daa19e9ba2cfab5e3fecb: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:52.950876 containerd[1504]: time="2025-05-13T23:46:52.950514272Z" level=info msg="CreateContainer within sandbox \"5a2103ce593adea61d1c72d4ba4d2367c602727614cf2839e74510c06673e5a7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"79906ebd27ef2930d3b21628be40ab5d3ebbf574199daa19e9ba2cfab5e3fecb\"" May 13 23:46:52.951967 containerd[1504]: time="2025-05-13T23:46:52.951383429Z" level=info msg="StartContainer for \"79906ebd27ef2930d3b21628be40ab5d3ebbf574199daa19e9ba2cfab5e3fecb\"" May 13 23:46:52.953791 containerd[1504]: time="2025-05-13T23:46:52.953753502Z" level=info msg="connecting to shim 79906ebd27ef2930d3b21628be40ab5d3ebbf574199daa19e9ba2cfab5e3fecb" address="unix:///run/containerd/s/4385df206445a1c8805303d0cc0a6a1be67ac28e178fa196b21e8150e3a320e8" protocol=ttrpc version=3 May 13 23:46:52.979653 systemd[1]: Started cri-containerd-79906ebd27ef2930d3b21628be40ab5d3ebbf574199daa19e9ba2cfab5e3fecb.scope - libcontainer container 79906ebd27ef2930d3b21628be40ab5d3ebbf574199daa19e9ba2cfab5e3fecb. May 13 23:46:52.988443 containerd[1504]: time="2025-05-13T23:46:52.988404316Z" level=info msg="StartContainer for \"f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c\" returns successfully" May 13 23:46:53.040752 containerd[1504]: time="2025-05-13T23:46:53.039548997Z" level=info msg="StartContainer for \"79906ebd27ef2930d3b21628be40ab5d3ebbf574199daa19e9ba2cfab5e3fecb\" returns successfully" May 13 23:46:53.101111 systemd[1]: var-lib-kubelet-pods-d078a62c\x2dac98\x2d4918\x2db0ab\x2d53fa4ca1a484-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. May 13 23:46:53.101427 systemd[1]: var-lib-kubelet-pods-d078a62c\x2dac98\x2d4918\x2db0ab\x2d53fa4ca1a484-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8dtrj.mount: Deactivated successfully. May 13 23:46:53.101485 systemd[1]: var-lib-kubelet-pods-d078a62c\x2dac98\x2d4918\x2db0ab\x2d53fa4ca1a484-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. May 13 23:46:53.303296 kubelet[2780]: I0513 23:46:53.302559 2780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ba2088-39e0-4223-b95b-586fe99f906e" path="/var/lib/kubelet/pods/61ba2088-39e0-4223-b95b-586fe99f906e/volumes" May 13 23:46:53.305519 kubelet[2780]: I0513 23:46:53.303926 2780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb68557b-1129-460d-9d3e-e3f0bf7e8587" path="/var/lib/kubelet/pods/bb68557b-1129-460d-9d3e-e3f0bf7e8587/volumes" May 13 23:46:53.305519 kubelet[2780]: I0513 23:46:53.304816 2780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d078a62c-ac98-4918-b0ab-53fa4ca1a484" path="/var/lib/kubelet/pods/d078a62c-ac98-4918-b0ab-53fa4ca1a484/volumes" May 13 23:46:53.803763 systemd[1]: cri-containerd-f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c.scope: Deactivated successfully. 
May 13 23:46:53.805731 systemd[1]: cri-containerd-f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c.scope: Consumed 675ms CPU time, 60M memory peak, 35M read from disk. May 13 23:46:53.806590 containerd[1504]: time="2025-05-13T23:46:53.805888648Z" level=info msg="received exit event container_id:\"f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c\" id:\"f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c\" pid:5837 exited_at:{seconds:1747180013 nanos:805455730}" May 13 23:46:53.806953 containerd[1504]: time="2025-05-13T23:46:53.806691726Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c\" id:\"f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c\" pid:5837 exited_at:{seconds:1747180013 nanos:805455730}" May 13 23:46:53.840828 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f9ca23f95d79f650364b3df04cdc97f70e74b579ec31c8668738156f6ac86b6c-rootfs.mount: Deactivated successfully. May 13 23:46:53.910494 kubelet[2780]: I0513 23:46:53.910160 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-86966df679-d2dkb" podStartSLOduration=3.909668005 podStartE2EDuration="3.909668005s" podCreationTimestamp="2025-05-13 23:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:46:53.908367169 +0000 UTC m=+72.726632849" watchObservedRunningTime="2025-05-13 23:46:53.909668005 +0000 UTC m=+72.727933685" May 13 23:46:54.903183 containerd[1504]: time="2025-05-13T23:46:54.903137133Z" level=info msg="CreateContainer within sandbox \"6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 23:46:54.915413 containerd[1504]: time="2025-05-13T23:46:54.914286737Z" level=info msg="Container 70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:54.941554 containerd[1504]: time="2025-05-13T23:46:54.941480691Z" level=info msg="CreateContainer within sandbox \"6615dce3938a17d185777a2b8070aba6d716a4127cff967343c3a582750343fc\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\"" May 13 23:46:54.942525 containerd[1504]: time="2025-05-13T23:46:54.942494848Z" level=info msg="StartContainer for \"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\"" May 13 23:46:54.944470 containerd[1504]: time="2025-05-13T23:46:54.944440081Z" level=info msg="connecting to shim 70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078" address="unix:///run/containerd/s/d4e6d13fff3f47ff4bd0f5a2cea08f0b4efb251be96f02d066f67eb1cf1d6b51" protocol=ttrpc version=3 May 13 23:46:54.972415 systemd[1]: Started cri-containerd-70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078.scope - libcontainer container 70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078. May 13 23:46:55.033359 containerd[1504]: time="2025-05-13T23:46:55.033184517Z" level=info msg="StartContainer for \"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" returns successfully" May 13 23:46:55.232307 systemd[1]: Created slice kubepods-besteffort-podfc63cc85_fedf_446c_83a7_f25a459268fa.slice - libcontainer container kubepods-besteffort-podfc63cc85_fedf_446c_83a7_f25a459268fa.slice. 
May 13 23:46:55.332485 kubelet[2780]: I0513 23:46:55.332210 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n9kj\" (UniqueName: \"kubernetes.io/projected/fc63cc85-fedf-446c-83a7-f25a459268fa-kube-api-access-2n9kj\") pod \"calico-kube-controllers-7df7ff6dfb-b29xg\" (UID: \"fc63cc85-fedf-446c-83a7-f25a459268fa\") " pod="calico-system/calico-kube-controllers-7df7ff6dfb-b29xg" May 13 23:46:55.332485 kubelet[2780]: I0513 23:46:55.332390 2780 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc63cc85-fedf-446c-83a7-f25a459268fa-tigera-ca-bundle\") pod \"calico-kube-controllers-7df7ff6dfb-b29xg\" (UID: \"fc63cc85-fedf-446c-83a7-f25a459268fa\") " pod="calico-system/calico-kube-controllers-7df7ff6dfb-b29xg" May 13 23:46:55.539413 containerd[1504]: time="2025-05-13T23:46:55.539316559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7df7ff6dfb-b29xg,Uid:fc63cc85-fedf-446c-83a7-f25a459268fa,Namespace:calico-system,Attempt:0,}" May 13 23:46:55.754504 systemd-networkd[1392]: cali2d98a4c4715: Link UP May 13 23:46:55.755893 systemd-networkd[1392]: cali2d98a4c4715: Gained carrier May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.606 [INFO][5958] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0 calico-kube-controllers-7df7ff6dfb- calico-system fc63cc85-fedf-446c-83a7-f25a459268fa 1142 0 2025-05-13 23:46:51 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7df7ff6dfb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-732e99817a calico-kube-controllers-7df7ff6dfb-b29xg eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2d98a4c4715 [] []}} ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Namespace="calico-system" Pod="calico-kube-controllers-7df7ff6dfb-b29xg" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.606 [INFO][5958] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Namespace="calico-system" Pod="calico-kube-controllers-7df7ff6dfb-b29xg" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.660 [INFO][5970] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" HandleID="k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.683 [INFO][5970] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" HandleID="k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" 
Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001fa1a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-732e99817a", "pod":"calico-kube-controllers-7df7ff6dfb-b29xg", "timestamp":"2025-05-13 23:46:55.660540006 +0000 UTC"}, Hostname:"ci-4284-0-0-n-732e99817a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.683 [INFO][5970] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.683 [INFO][5970] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.683 [INFO][5970] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-732e99817a' May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.691 [INFO][5970] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.699 [INFO][5970] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.710 [INFO][5970] ipam/ipam.go 489: Trying affinity for 192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.716 [INFO][5970] ipam/ipam.go 155: Attempting to load block cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.725 [INFO][5970] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.52.64/26 host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.725 [INFO][5970] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.52.64/26 handle="k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.729 [INFO][5970] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.736 [INFO][5970] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.52.64/26 handle="k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.748 [INFO][5970] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.52.73/26] block=192.168.52.64/26 handle="k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.748 [INFO][5970] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.52.73/26] handle="k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" host="ci-4284-0-0-n-732e99817a" May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.748 [INFO][5970] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:46:55.774895 containerd[1504]: 2025-05-13 23:46:55.748 [INFO][5970] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.52.73/26] IPv6=[] ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" HandleID="k8s-pod-network.a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" May 13 23:46:55.778592 containerd[1504]: 2025-05-13 23:46:55.751 [INFO][5958] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Namespace="calico-system" Pod="calico-kube-controllers-7df7ff6dfb-b29xg" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0", GenerateName:"calico-kube-controllers-7df7ff6dfb-", Namespace:"calico-system", SelfLink:"", UID:"fc63cc85-fedf-446c-83a7-f25a459268fa", ResourceVersion:"1142", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 46, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7df7ff6dfb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"", Pod:"calico-kube-controllers-7df7ff6dfb-b29xg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.52.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2d98a4c4715", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:55.778592 containerd[1504]: 2025-05-13 23:46:55.751 [INFO][5958] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.52.73/32] ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Namespace="calico-system" Pod="calico-kube-controllers-7df7ff6dfb-b29xg" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" May 13 23:46:55.778592 containerd[1504]: 2025-05-13 23:46:55.751 [INFO][5958] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2d98a4c4715 ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Namespace="calico-system" Pod="calico-kube-controllers-7df7ff6dfb-b29xg" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" May 13 23:46:55.778592 containerd[1504]: 2025-05-13 23:46:55.755 [INFO][5958] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Namespace="calico-system" Pod="calico-kube-controllers-7df7ff6dfb-b29xg" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" May 13 
23:46:55.778592 containerd[1504]: 2025-05-13 23:46:55.756 [INFO][5958] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Namespace="calico-system" Pod="calico-kube-controllers-7df7ff6dfb-b29xg" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0", GenerateName:"calico-kube-controllers-7df7ff6dfb-", Namespace:"calico-system", SelfLink:"", UID:"fc63cc85-fedf-446c-83a7-f25a459268fa", ResourceVersion:"1142", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 46, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7df7ff6dfb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-732e99817a", ContainerID:"a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed", Pod:"calico-kube-controllers-7df7ff6dfb-b29xg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.52.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2d98a4c4715", MAC:"ba:27:8e:95:3c:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:46:55.778592 containerd[1504]: 2025-05-13 23:46:55.771 [INFO][5958] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" Namespace="calico-system" Pod="calico-kube-controllers-7df7ff6dfb-b29xg" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7df7ff6dfb--b29xg-eth0" May 13 23:46:55.805723 containerd[1504]: time="2025-05-13T23:46:55.804485020Z" level=info msg="connecting to shim a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed" address="unix:///run/containerd/s/2afe0c294b11adb810263265fbddb24a4d84c9f67871d86ee79bd960ee55520f" namespace=k8s.io protocol=ttrpc version=3 May 13 23:46:55.833415 systemd[1]: Started cri-containerd-a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed.scope - libcontainer container a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed. 
May 13 23:46:55.879922 containerd[1504]: time="2025-05-13T23:46:55.879753537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7df7ff6dfb-b29xg,Uid:fc63cc85-fedf-446c-83a7-f25a459268fa,Namespace:calico-system,Attempt:0,} returns sandbox id \"a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed\"" May 13 23:46:55.917122 containerd[1504]: time="2025-05-13T23:46:55.916280819Z" level=info msg="CreateContainer within sandbox \"a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 13 23:46:55.937040 containerd[1504]: time="2025-05-13T23:46:55.936993912Z" level=info msg="Container 8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0: CDI devices from CRI Config.CDIDevices: []" May 13 23:46:55.946826 kubelet[2780]: I0513 23:46:55.946737 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-7h55p" podStartSLOduration=4.94671504 podStartE2EDuration="4.94671504s" podCreationTimestamp="2025-05-13 23:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:46:55.944247368 +0000 UTC m=+74.762513088" watchObservedRunningTime="2025-05-13 23:46:55.94671504 +0000 UTC m=+74.764980720" May 13 23:46:55.967313 containerd[1504]: time="2025-05-13T23:46:55.967149294Z" level=info msg="CreateContainer within sandbox \"a9464685a95d5afa8bdc0bcf0a10ba3f85a89f9d4f083321a3a24b5183b96fed\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\"" May 13 23:46:55.969483 containerd[1504]: time="2025-05-13T23:46:55.969414647Z" level=info msg="StartContainer for \"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\"" May 13 23:46:55.972840 containerd[1504]: time="2025-05-13T23:46:55.972267957Z" level=info msg="connecting to shim 8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0" address="unix:///run/containerd/s/2afe0c294b11adb810263265fbddb24a4d84c9f67871d86ee79bd960ee55520f" protocol=ttrpc version=3 May 13 23:46:56.015317 systemd[1]: Started cri-containerd-8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0.scope - libcontainer container 8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0. 
May 13 23:46:56.109694 containerd[1504]: time="2025-05-13T23:46:56.109347747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"df72de2aad5b30292afde9205e9c407099b4750ed0bd8b20cad8d465d7898274\" pid:6050 exit_status:1 exited_at:{seconds:1747180016 nanos:108394711}" May 13 23:46:56.128047 containerd[1504]: time="2025-05-13T23:46:56.128001086Z" level=info msg="StartContainer for \"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" returns successfully" May 13 23:46:56.960715 kubelet[2780]: I0513 23:46:56.960654 2780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7df7ff6dfb-b29xg" podStartSLOduration=5.960632543 podStartE2EDuration="5.960632543s" podCreationTimestamp="2025-05-13 23:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:46:56.959295067 +0000 UTC m=+75.777560787" watchObservedRunningTime="2025-05-13 23:46:56.960632543 +0000 UTC m=+75.778898223" May 13 23:46:57.027267 containerd[1504]: time="2025-05-13T23:46:57.026474245Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"ae64896c6d0f6af17f01a644eaf35c22bbf69984e1f186afaa4f0f4d92081727\" pid:6225 exit_status:1 exited_at:{seconds:1747180017 nanos:25008250}" May 13 23:46:57.150149 containerd[1504]: time="2025-05-13T23:46:57.150062831Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"ce4315f20cd3f4c2aa66fbc9e92dadc8bcd65863fc09b3a5906e0c283d8713cc\" pid:6223 exit_status:1 exited_at:{seconds:1747180017 nanos:149777952}" May 13 23:46:57.724720 systemd-networkd[1392]: cali2d98a4c4715: Gained IPv6LL May 13 23:46:57.974762 containerd[1504]: time="2025-05-13T23:46:57.974486309Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"12362c5b66afec7bde78bb151ac94ae05224a4e306eae37010d9283d721fb6e9\" pid:6341 exit_status:1 exited_at:{seconds:1747180017 nanos:973637552}" May 13 23:46:58.991388 containerd[1504]: time="2025-05-13T23:46:58.991313450Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"b7ab83999b088dbba127d32769b24ac9b97406c0a7e5352b43cbcd2a1ed9c753\" pid:6364 exited_at:{seconds:1747180018 nanos:990877372}" May 13 23:47:27.003772 containerd[1504]: time="2025-05-13T23:47:27.003720908Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"de157d5fd56c5805d2a00daa465f2c3c50f072cb8ddb5ca09cde926445de918f\" pid:6418 exited_at:{seconds:1747180047 nanos:3320990}" May 13 23:47:28.979664 containerd[1504]: time="2025-05-13T23:47:28.979020753Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"2141dbab44a5f198ce5b514ff7edeedffc01ebd79d4306ad1470cf9305dcd73b\" pid:6443 exited_at:{seconds:1747180048 nanos:978695554}" May 13 23:47:41.323303 containerd[1504]: time="2025-05-13T23:47:41.323237674Z" level=info msg="StopPodSandbox for \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\"" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 
23:47:41.377 [WARNING][6474] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.377 [INFO][6474] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.377 [INFO][6474] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" iface="eth0" netns="" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.377 [INFO][6474] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.377 [INFO][6474] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.401 [INFO][6481] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.401 [INFO][6481] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.401 [INFO][6481] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.419 [WARNING][6481] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.419 [INFO][6481] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.422 [INFO][6481] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:47:41.426820 containerd[1504]: 2025-05-13 23:47:41.424 [INFO][6474] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" May 13 23:47:41.427764 containerd[1504]: time="2025-05-13T23:47:41.426841403Z" level=info msg="TearDown network for sandbox \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" successfully" May 13 23:47:41.427764 containerd[1504]: time="2025-05-13T23:47:41.426866444Z" level=info msg="StopPodSandbox for \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" returns successfully" May 13 23:47:41.427764 containerd[1504]: time="2025-05-13T23:47:41.427481972Z" level=info msg="RemovePodSandbox for \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\"" May 13 23:47:41.429278 containerd[1504]: time="2025-05-13T23:47:41.428997552Z" level=info msg="Forcibly stopping sandbox \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\"" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.481 [WARNING][6500] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.481 [INFO][6500] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.481 [INFO][6500] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" iface="eth0" netns="" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.481 [INFO][6500] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.481 [INFO][6500] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.504 [INFO][6507] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.504 [INFO][6507] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.504 [INFO][6507] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.520 [WARNING][6507] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.520 [INFO][6507] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" HandleID="k8s-pod-network.2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--js8v9-eth0" May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.524 [INFO][6507] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:47:41.529745 containerd[1504]: 2025-05-13 23:47:41.527 [INFO][6500] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e" May 13 23:47:41.532028 containerd[1504]: time="2025-05-13T23:47:41.530127049Z" level=info msg="TearDown network for sandbox \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" successfully" May 13 23:47:41.534186 containerd[1504]: time="2025-05-13T23:47:41.534129061Z" level=info msg="Ensure that sandbox 2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e in task-service has been cleanup successfully" May 13 23:47:41.540637 containerd[1504]: time="2025-05-13T23:47:41.540565026Z" level=info msg="RemovePodSandbox \"2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e\" returns successfully" May 13 23:47:41.541333 containerd[1504]: time="2025-05-13T23:47:41.541306396Z" level=info msg="StopPodSandbox for \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\"" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.596 [WARNING][6525] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.597 [INFO][6525] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.597 [INFO][6525] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" iface="eth0" netns="" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.597 [INFO][6525] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.597 [INFO][6525] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.618 [INFO][6532] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.618 [INFO][6532] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.618 [INFO][6532] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.630 [WARNING][6532] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.630 [INFO][6532] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.634 [INFO][6532] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:47:41.637918 containerd[1504]: 2025-05-13 23:47:41.635 [INFO][6525] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:47:41.637918 containerd[1504]: time="2025-05-13T23:47:41.637908313Z" level=info msg="TearDown network for sandbox \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" successfully" May 13 23:47:41.638511 containerd[1504]: time="2025-05-13T23:47:41.637936073Z" level=info msg="StopPodSandbox for \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" returns successfully" May 13 23:47:41.639533 containerd[1504]: time="2025-05-13T23:47:41.638673523Z" level=info msg="RemovePodSandbox for \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\"" May 13 23:47:41.639533 containerd[1504]: time="2025-05-13T23:47:41.638714764Z" level=info msg="Forcibly stopping sandbox \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\"" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.694 [WARNING][6551] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.695 [INFO][6551] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.695 [INFO][6551] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" iface="eth0" netns="" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.695 [INFO][6551] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.696 [INFO][6551] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.719 [INFO][6558] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.719 [INFO][6558] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.719 [INFO][6558] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.730 [WARNING][6558] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.730 [INFO][6558] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" HandleID="k8s-pod-network.e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--kube--controllers--7dd84bf879--dbp7p-eth0" May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.734 [INFO][6558] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:47:41.737404 containerd[1504]: 2025-05-13 23:47:41.735 [INFO][6551] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9" May 13 23:47:41.737795 containerd[1504]: time="2025-05-13T23:47:41.737474269Z" level=info msg="TearDown network for sandbox \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" successfully" May 13 23:47:41.739705 containerd[1504]: time="2025-05-13T23:47:41.739664498Z" level=info msg="Ensure that sandbox e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9 in task-service has been cleanup successfully" May 13 23:47:41.744677 containerd[1504]: time="2025-05-13T23:47:41.744587523Z" level=info msg="RemovePodSandbox \"e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9\" returns successfully" May 13 23:47:41.745423 containerd[1504]: time="2025-05-13T23:47:41.745391374Z" level=info msg="StopPodSandbox for \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\"" May 13 23:47:41.745563 containerd[1504]: time="2025-05-13T23:47:41.745542656Z" level=info msg="TearDown network for sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" successfully" May 13 23:47:41.745601 containerd[1504]: time="2025-05-13T23:47:41.745561096Z" level=info msg="StopPodSandbox for \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" returns successfully" May 13 23:47:41.745998 containerd[1504]: time="2025-05-13T23:47:41.745974581Z" level=info msg="RemovePodSandbox for \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\"" May 13 23:47:41.746091 containerd[1504]: time="2025-05-13T23:47:41.746008302Z" level=info msg="Forcibly stopping sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\"" May 13 23:47:41.746149 containerd[1504]: time="2025-05-13T23:47:41.746123263Z" level=info msg="TearDown network for sandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" successfully" May 13 23:47:41.747736 containerd[1504]: time="2025-05-13T23:47:41.747706804Z" level=info msg="Ensure that sandbox 1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca in task-service has been cleanup successfully" May 13 23:47:41.751571 containerd[1504]: time="2025-05-13T23:47:41.751520695Z" level=info msg="RemovePodSandbox \"1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca\" returns successfully" May 13 23:47:41.752129 containerd[1504]: time="2025-05-13T23:47:41.752027621Z" level=info msg="StopPodSandbox for \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\"" May 13 23:47:41.752251 containerd[1504]: time="2025-05-13T23:47:41.752231304Z" level=info 
msg="TearDown network for sandbox \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" successfully" May 13 23:47:41.752251 containerd[1504]: time="2025-05-13T23:47:41.752249144Z" level=info msg="StopPodSandbox for \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" returns successfully" May 13 23:47:41.752620 containerd[1504]: time="2025-05-13T23:47:41.752593869Z" level=info msg="RemovePodSandbox for \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\"" May 13 23:47:41.752652 containerd[1504]: time="2025-05-13T23:47:41.752625949Z" level=info msg="Forcibly stopping sandbox \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\"" May 13 23:47:41.752744 containerd[1504]: time="2025-05-13T23:47:41.752724631Z" level=info msg="TearDown network for sandbox \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" successfully" May 13 23:47:41.754380 containerd[1504]: time="2025-05-13T23:47:41.754351652Z" level=info msg="Ensure that sandbox 496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653 in task-service has been cleanup successfully" May 13 23:47:41.757679 containerd[1504]: time="2025-05-13T23:47:41.757626415Z" level=info msg="RemovePodSandbox \"496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653\" returns successfully" May 13 23:47:41.758192 containerd[1504]: time="2025-05-13T23:47:41.758142062Z" level=info msg="StopPodSandbox for \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\"" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.827 [WARNING][6576] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.828 [INFO][6576] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.828 [INFO][6576] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" iface="eth0" netns="" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.828 [INFO][6576] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.828 [INFO][6576] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.861 [INFO][6583] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.861 [INFO][6583] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.861 [INFO][6583] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.874 [WARNING][6583] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.875 [INFO][6583] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.879 [INFO][6583] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:47:41.882925 containerd[1504]: 2025-05-13 23:47:41.881 [INFO][6576] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" May 13 23:47:41.883831 containerd[1504]: time="2025-05-13T23:47:41.882975272Z" level=info msg="TearDown network for sandbox \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" successfully" May 13 23:47:41.883831 containerd[1504]: time="2025-05-13T23:47:41.883006793Z" level=info msg="StopPodSandbox for \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" returns successfully" May 13 23:47:41.883831 containerd[1504]: time="2025-05-13T23:47:41.883686761Z" level=info msg="RemovePodSandbox for \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\"" May 13 23:47:41.883831 containerd[1504]: time="2025-05-13T23:47:41.883718882Z" level=info msg="Forcibly stopping sandbox \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\"" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.930 [WARNING][6602] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" WorkloadEndpoint="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.930 [INFO][6602] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.930 [INFO][6602] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" iface="eth0" netns="" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.930 [INFO][6602] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.930 [INFO][6602] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.953 [INFO][6609] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.953 [INFO][6609] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.953 [INFO][6609] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.967 [WARNING][6609] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.967 [INFO][6609] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" HandleID="k8s-pod-network.7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" Workload="ci--4284--0--0--n--732e99817a-k8s-calico--apiserver--9b8b8f55f--twx7r-eth0" May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.971 [INFO][6609] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:47:41.978222 containerd[1504]: 2025-05-13 23:47:41.973 [INFO][6602] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f" May 13 23:47:41.978222 containerd[1504]: time="2025-05-13T23:47:41.976605470Z" level=info msg="TearDown network for sandbox \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" successfully" May 13 23:47:41.980804 containerd[1504]: time="2025-05-13T23:47:41.980551402Z" level=info msg="Ensure that sandbox 7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f in task-service has been cleanup successfully" May 13 23:47:41.984246 containerd[1504]: time="2025-05-13T23:47:41.984023608Z" level=info msg="RemovePodSandbox \"7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f\" returns successfully" May 13 23:47:55.576465 containerd[1504]: time="2025-05-13T23:47:55.576354500Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"a5d743ff3cfb6dc5334b5a334c12fdc3f3e943d1d083d29e2b6880c8e2d753a8\" pid:6631 exited_at:{seconds:1747180075 nanos:576020937}" May 13 23:47:57.005154 containerd[1504]: time="2025-05-13T23:47:57.005044379Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"4a331b2eca84f5998b045dfc6d5a3467f9f4a98d851636b9db07b0b5aa5b6456\" pid:6652 exited_at:{seconds:1747180077 nanos:4499974}" May 13 23:47:58.970536 containerd[1504]: time="2025-05-13T23:47:58.970490425Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"625ec067dbe9b4f2af394ae6132d8a614a6ff5b1cbe2361257e30b8a6847b4d5\" pid:6677 exited_at:{seconds:1747180078 nanos:970004700}" May 13 23:48:27.008179 containerd[1504]: time="2025-05-13T23:48:27.008054489Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"d68d9a8043b27c5a72236567e2102a9666c68fb25b457a91c101dcbb505679a6\" pid:6711 exited_at:{seconds:1747180107 nanos:7733927}" May 13 23:48:28.981670 containerd[1504]: time="2025-05-13T23:48:28.981598565Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"cf003f64203cce7da7de9c46e017293b28ed449e196620d09373775b0d22fd9a\" pid:6735 exited_at:{seconds:1747180108 nanos:980924801}" May 13 23:48:55.577609 containerd[1504]: time="2025-05-13T23:48:55.577556742Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"cbfa2cdf9a250b5ecc30705f662173d9e8c7d5cc4f5a3898752ed17ff2ca8427\" pid:6783 exited_at:{seconds:1747180135 nanos:577217382}" May 13 23:48:56.993411 containerd[1504]: time="2025-05-13T23:48:56.993347739Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"cad4056e409f5f9dc07dca4320f52e84dd2ec946eb09739a42c3a8b8120a3b9e\" pid:6804 exited_at:{seconds:1747180136 nanos:992924898}" May 13 23:48:58.980966 containerd[1504]: time="2025-05-13T23:48:58.980837838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"097e036c0eb81d2ea2df667c2c7d8e220910d9cc463276a9c527d54d91af07b0\" pid:6829 exited_at:{seconds:1747180138 nanos:980485638}" May 13 23:49:27.010542 containerd[1504]: time="2025-05-13T23:49:27.010279602Z" level=info 
msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"05549687a1fecc4cfa13b3f1cac4d492fe6e87fb0cef4591590abf00a6bef339\" pid:6854 exited_at:{seconds:1747180167 nanos:9244322}" May 13 23:49:28.979097 containerd[1504]: time="2025-05-13T23:49:28.979017039Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"485e402060d5dd06f90474c5f9b44ed62122adb0139a64dd3b8c486dbd33cb90\" pid:6878 exited_at:{seconds:1747180168 nanos:978147679}" May 13 23:49:55.571732 containerd[1504]: time="2025-05-13T23:49:55.571678904Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"ce79730c7d65d669ebc4b07f641ef24caababb95ae738617642ae5881aa36405\" pid:6911 exited_at:{seconds:1747180195 nanos:570935665}" May 13 23:49:57.007673 containerd[1504]: time="2025-05-13T23:49:57.007623015Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"c7a6ede348fc1e3f07a2c6de8feef08e110ac9839ea70d9bd669eb9afe390612\" pid:6932 exited_at:{seconds:1747180197 nanos:7199135}" May 13 23:49:58.976852 containerd[1504]: time="2025-05-13T23:49:58.976796217Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"289af68a61344f701c2151219d1a2e2d29d837c94003715856c3ae5a22acd992\" pid:6957 exited_at:{seconds:1747180198 nanos:976596497}" May 13 23:50:07.886970 update_engine[1475]: I20250513 23:50:07.886202 1475 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 13 23:50:07.886970 update_engine[1475]: I20250513 23:50:07.886307 1475 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 13 23:50:07.888853 update_engine[1475]: I20250513 23:50:07.888028 1475 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 13 23:50:07.889628 update_engine[1475]: I20250513 23:50:07.889478 1475 omaha_request_params.cc:62] Current group set to alpha May 13 23:50:07.894137 update_engine[1475]: I20250513 23:50:07.890827 1475 update_attempter.cc:499] Already updated boot flags. Skipping. May 13 23:50:07.894137 update_engine[1475]: I20250513 23:50:07.890863 1475 update_attempter.cc:643] Scheduling an action processor start. 
May 13 23:50:07.894137 update_engine[1475]: I20250513 23:50:07.890893 1475 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 13 23:50:07.895751 update_engine[1475]: I20250513 23:50:07.895691 1475 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 13 23:50:07.895890 update_engine[1475]: I20250513 23:50:07.895852 1475 omaha_request_action.cc:271] Posting an Omaha request to disabled May 13 23:50:07.895890 update_engine[1475]: I20250513 23:50:07.895872 1475 omaha_request_action.cc:272] Request: May 13 23:50:07.895890 update_engine[1475]: May 13 23:50:07.895890 update_engine[1475]: May 13 23:50:07.895890 update_engine[1475]: May 13 23:50:07.895890 update_engine[1475]: May 13 23:50:07.895890 update_engine[1475]: May 13 23:50:07.895890 update_engine[1475]: May 13 23:50:07.895890 update_engine[1475]: May 13 23:50:07.895890 update_engine[1475]: May 13 23:50:07.895890 update_engine[1475]: I20250513 23:50:07.895885 1475 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:50:07.903448 update_engine[1475]: I20250513 23:50:07.902933 1475 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:50:07.903448 update_engine[1475]: I20250513 23:50:07.903391 1475 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 13 23:50:07.908157 update_engine[1475]: E20250513 23:50:07.907626 1475 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:50:07.908157 update_engine[1475]: I20250513 23:50:07.907768 1475 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 13 23:50:07.908657 locksmithd[1515]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 13 23:50:17.795181 update_engine[1475]: I20250513 23:50:17.794533 1475 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:50:17.795181 update_engine[1475]: I20250513 23:50:17.794909 1475 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:50:17.795830 update_engine[1475]: I20250513 23:50:17.795416 1475 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 13 23:50:17.797270 update_engine[1475]: E20250513 23:50:17.797188 1475 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:50:17.797383 update_engine[1475]: I20250513 23:50:17.797307 1475 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 13 23:50:23.851826 kernel: hrtimer: interrupt took 4750633 ns May 13 23:50:26.372373 systemd[1]: Started sshd@8-138.199.236.81:22-139.178.89.65:41108.service - OpenSSH per-connection server daemon (139.178.89.65:41108). May 13 23:50:27.022888 containerd[1504]: time="2025-05-13T23:50:27.022792871Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"71b3876472a7bdc70aafd2aa3f0093de27a422743c3cb603e5eb5cb9c71ee757\" pid:7005 exited_at:{seconds:1747180227 nanos:22289632}" May 13 23:50:27.400813 sshd[6989]: Accepted publickey for core from 139.178.89.65 port 41108 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:27.404340 sshd-session[6989]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:27.413731 systemd-logind[1474]: New session 8 of user core. May 13 23:50:27.421346 systemd[1]: Started session-8.scope - Session 8 of User core. 
May 13 23:50:27.789095 update_engine[1475]: I20250513 23:50:27.787112 1475 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:50:27.789095 update_engine[1475]: I20250513 23:50:27.787470 1475 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:50:27.789095 update_engine[1475]: I20250513 23:50:27.787766 1475 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 13 23:50:27.789616 update_engine[1475]: E20250513 23:50:27.789103 1475 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:50:27.789616 update_engine[1475]: I20250513 23:50:27.789196 1475 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 13 23:50:28.206220 sshd[7017]: Connection closed by 139.178.89.65 port 41108 May 13 23:50:28.206756 sshd-session[6989]: pam_unix(sshd:session): session closed for user core May 13 23:50:28.213035 systemd[1]: sshd@8-138.199.236.81:22-139.178.89.65:41108.service: Deactivated successfully. May 13 23:50:28.216556 systemd[1]: session-8.scope: Deactivated successfully. May 13 23:50:28.217752 systemd-logind[1474]: Session 8 logged out. Waiting for processes to exit. May 13 23:50:28.219459 systemd-logind[1474]: Removed session 8. May 13 23:50:28.977026 containerd[1504]: time="2025-05-13T23:50:28.976802505Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"5d2c42cea9795845c1d8d861d47daa9564ed626876370ad169f92b121db855a7\" pid:7041 exited_at:{seconds:1747180228 nanos:976570545}" May 13 23:50:33.386334 systemd[1]: Started sshd@9-138.199.236.81:22-139.178.89.65:39344.service - OpenSSH per-connection server daemon (139.178.89.65:39344). May 13 23:50:34.410128 sshd[7051]: Accepted publickey for core from 139.178.89.65 port 39344 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:34.411549 sshd-session[7051]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:34.418060 systemd-logind[1474]: New session 9 of user core. May 13 23:50:34.426401 systemd[1]: Started session-9.scope - Session 9 of User core. May 13 23:50:35.202170 sshd[7053]: Connection closed by 139.178.89.65 port 39344 May 13 23:50:35.202999 sshd-session[7051]: pam_unix(sshd:session): session closed for user core May 13 23:50:35.208459 systemd[1]: sshd@9-138.199.236.81:22-139.178.89.65:39344.service: Deactivated successfully. May 13 23:50:35.211252 systemd[1]: session-9.scope: Deactivated successfully. May 13 23:50:35.212922 systemd-logind[1474]: Session 9 logged out. Waiting for processes to exit. May 13 23:50:35.214076 systemd-logind[1474]: Removed session 9. 
May 13 23:50:35.765747 containerd[1504]: time="2025-05-13T23:50:35.765630945Z" level=warning msg="container event discarded" container=db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c type=CONTAINER_CREATED_EVENT May 13 23:50:35.776054 containerd[1504]: time="2025-05-13T23:50:35.775971326Z" level=warning msg="container event discarded" container=db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c type=CONTAINER_STARTED_EVENT May 13 23:50:35.805417 containerd[1504]: time="2025-05-13T23:50:35.805338951Z" level=warning msg="container event discarded" container=64210b927d7725f252706030e2b5f1e64c6ce39a8b910aefd8c6c7e881abf001 type=CONTAINER_CREATED_EVENT May 13 23:50:35.805417 containerd[1504]: time="2025-05-13T23:50:35.805405110Z" level=warning msg="container event discarded" container=64210b927d7725f252706030e2b5f1e64c6ce39a8b910aefd8c6c7e881abf001 type=CONTAINER_STARTED_EVENT May 13 23:50:35.805417 containerd[1504]: time="2025-05-13T23:50:35.805422190Z" level=warning msg="container event discarded" container=152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55 type=CONTAINER_CREATED_EVENT May 13 23:50:35.820225 containerd[1504]: time="2025-05-13T23:50:35.820028603Z" level=warning msg="container event discarded" container=d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51 type=CONTAINER_CREATED_EVENT May 13 23:50:35.820225 containerd[1504]: time="2025-05-13T23:50:35.820109323Z" level=warning msg="container event discarded" container=d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51 type=CONTAINER_STARTED_EVENT May 13 23:50:35.820225 containerd[1504]: time="2025-05-13T23:50:35.820125163Z" level=warning msg="container event discarded" container=54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb type=CONTAINER_CREATED_EVENT May 13 23:50:35.868815 containerd[1504]: time="2025-05-13T23:50:35.868593072Z" level=warning msg="container event discarded" container=33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9 type=CONTAINER_CREATED_EVENT May 13 23:50:35.937025 containerd[1504]: time="2025-05-13T23:50:35.936937304Z" level=warning msg="container event discarded" container=54354e152d93b1dc3e36e54a98d41d1c660a0b859f8396cd30b90d3a55bbffbb type=CONTAINER_STARTED_EVENT May 13 23:50:35.968374 containerd[1504]: time="2025-05-13T23:50:35.968293805Z" level=warning msg="container event discarded" container=152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55 type=CONTAINER_STARTED_EVENT May 13 23:50:35.996729 containerd[1504]: time="2025-05-13T23:50:35.996645632Z" level=warning msg="container event discarded" container=33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9 type=CONTAINER_STARTED_EVENT May 13 23:50:37.787264 update_engine[1475]: I20250513 23:50:37.787135 1475 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:50:37.787800 update_engine[1475]: I20250513 23:50:37.787532 1475 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:50:37.787968 update_engine[1475]: I20250513 23:50:37.787863 1475 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:50:37.789302 update_engine[1475]: E20250513 23:50:37.789228 1475 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:50:37.789415 update_engine[1475]: I20250513 23:50:37.789351 1475 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 13 23:50:37.789415 update_engine[1475]: I20250513 23:50:37.789377 1475 omaha_request_action.cc:617] Omaha request response: May 13 23:50:37.789540 update_engine[1475]: E20250513 23:50:37.789510 1475 omaha_request_action.cc:636] Omaha request network transfer failed. May 13 23:50:37.789898 update_engine[1475]: I20250513 23:50:37.789832 1475 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 13 23:50:37.789898 update_engine[1475]: I20250513 23:50:37.789882 1475 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:50:37.789972 update_engine[1475]: I20250513 23:50:37.789901 1475 update_attempter.cc:306] Processing Done. May 13 23:50:37.789972 update_engine[1475]: E20250513 23:50:37.789928 1475 update_attempter.cc:619] Update failed. May 13 23:50:37.789972 update_engine[1475]: I20250513 23:50:37.789942 1475 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 13 23:50:37.789972 update_engine[1475]: I20250513 23:50:37.789953 1475 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 13 23:50:37.790084 update_engine[1475]: I20250513 23:50:37.789968 1475 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. May 13 23:50:37.790329 update_engine[1475]: I20250513 23:50:37.790158 1475 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 13 23:50:37.790329 update_engine[1475]: I20250513 23:50:37.790220 1475 omaha_request_action.cc:271] Posting an Omaha request to disabled May 13 23:50:37.790329 update_engine[1475]: I20250513 23:50:37.790236 1475 omaha_request_action.cc:272] Request: May 13 23:50:37.790329 update_engine[1475]: May 13 23:50:37.790329 update_engine[1475]: May 13 23:50:37.790329 update_engine[1475]: May 13 23:50:37.790329 update_engine[1475]: May 13 23:50:37.790329 update_engine[1475]: May 13 23:50:37.790329 update_engine[1475]: May 13 23:50:37.790329 update_engine[1475]: I20250513 23:50:37.790249 1475 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:50:37.790626 update_engine[1475]: I20250513 23:50:37.790591 1475 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:50:37.791394 update_engine[1475]: I20250513 23:50:37.791010 1475 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:50:37.791483 locksmithd[1515]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 13 23:50:37.791938 update_engine[1475]: E20250513 23:50:37.791873 1475 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:50:37.791938 update_engine[1475]: I20250513 23:50:37.791930 1475 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 13 23:50:37.792035 update_engine[1475]: I20250513 23:50:37.791942 1475 omaha_request_action.cc:617] Omaha request response: May 13 23:50:37.792035 update_engine[1475]: I20250513 23:50:37.791949 1475 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:50:37.792035 update_engine[1475]: I20250513 23:50:37.791954 1475 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:50:37.792035 update_engine[1475]: I20250513 23:50:37.791958 1475 update_attempter.cc:306] Processing Done. May 13 23:50:37.792035 update_engine[1475]: I20250513 23:50:37.791965 1475 update_attempter.cc:310] Error event sent. May 13 23:50:37.792035 update_engine[1475]: I20250513 23:50:37.791972 1475 update_check_scheduler.cc:74] Next update check in 40m45s May 13 23:50:37.792326 locksmithd[1515]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 13 23:50:40.386356 systemd[1]: Started sshd@10-138.199.236.81:22-139.178.89.65:36168.service - OpenSSH per-connection server daemon (139.178.89.65:36168). May 13 23:50:41.427347 sshd[7065]: Accepted publickey for core from 139.178.89.65 port 36168 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:41.429258 sshd-session[7065]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:41.437079 systemd-logind[1474]: New session 10 of user core. May 13 23:50:41.444599 systemd[1]: Started session-10.scope - Session 10 of User core. May 13 23:50:42.222781 sshd[7069]: Connection closed by 139.178.89.65 port 36168 May 13 23:50:42.223912 sshd-session[7065]: pam_unix(sshd:session): session closed for user core May 13 23:50:42.230451 systemd[1]: sshd@10-138.199.236.81:22-139.178.89.65:36168.service: Deactivated successfully. May 13 23:50:42.233062 systemd[1]: session-10.scope: Deactivated successfully. May 13 23:50:42.234892 systemd-logind[1474]: Session 10 logged out. Waiting for processes to exit. May 13 23:50:42.236550 systemd-logind[1474]: Removed session 10. May 13 23:50:42.405013 systemd[1]: Started sshd@11-138.199.236.81:22-139.178.89.65:36170.service - OpenSSH per-connection server daemon (139.178.89.65:36170). May 13 23:50:43.434854 sshd[7082]: Accepted publickey for core from 139.178.89.65 port 36170 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:43.436010 sshd-session[7082]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:43.442350 systemd-logind[1474]: New session 11 of user core. May 13 23:50:43.452599 systemd[1]: Started session-11.scope - Session 11 of User core. May 13 23:50:44.269304 sshd[7084]: Connection closed by 139.178.89.65 port 36170 May 13 23:50:44.267988 sshd-session[7082]: pam_unix(sshd:session): session closed for user core May 13 23:50:44.274525 systemd[1]: sshd@11-138.199.236.81:22-139.178.89.65:36170.service: Deactivated successfully. 
May 13 23:50:44.278316 systemd[1]: session-11.scope: Deactivated successfully. May 13 23:50:44.281737 systemd-logind[1474]: Session 11 logged out. Waiting for processes to exit. May 13 23:50:44.284599 systemd-logind[1474]: Removed session 11. May 13 23:50:44.447568 systemd[1]: Started sshd@12-138.199.236.81:22-139.178.89.65:36184.service - OpenSSH per-connection server daemon (139.178.89.65:36184). May 13 23:50:45.456836 sshd[7098]: Accepted publickey for core from 139.178.89.65 port 36184 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:45.459684 sshd-session[7098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:45.466332 systemd-logind[1474]: New session 12 of user core. May 13 23:50:45.471289 systemd[1]: Started session-12.scope - Session 12 of User core. May 13 23:50:46.227107 sshd[7100]: Connection closed by 139.178.89.65 port 36184 May 13 23:50:46.228313 sshd-session[7098]: pam_unix(sshd:session): session closed for user core May 13 23:50:46.232563 systemd[1]: sshd@12-138.199.236.81:22-139.178.89.65:36184.service: Deactivated successfully. May 13 23:50:46.235959 systemd[1]: session-12.scope: Deactivated successfully. May 13 23:50:46.238435 systemd-logind[1474]: Session 12 logged out. Waiting for processes to exit. May 13 23:50:46.240616 systemd-logind[1474]: Removed session 12. May 13 23:50:47.150902 containerd[1504]: time="2025-05-13T23:50:47.150804207Z" level=warning msg="container event discarded" container=e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1 type=CONTAINER_CREATED_EVENT May 13 23:50:47.150902 containerd[1504]: time="2025-05-13T23:50:47.150867087Z" level=warning msg="container event discarded" container=e634c64f7da54236c82a7aec1392e7db8aa16538dbda7a0f52335ba2b04bebf1 type=CONTAINER_STARTED_EVENT May 13 23:50:47.183240 containerd[1504]: time="2025-05-13T23:50:47.183146741Z" level=warning msg="container event discarded" container=45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907 type=CONTAINER_CREATED_EVENT May 13 23:50:47.256632 containerd[1504]: time="2025-05-13T23:50:47.256514630Z" level=warning msg="container event discarded" container=45c165aa5a49d9af786ef6f45f0669395e300e33682f7c674f263fe5c04c6907 type=CONTAINER_STARTED_EVENT May 13 23:50:47.530179 containerd[1504]: time="2025-05-13T23:50:47.530112468Z" level=warning msg="container event discarded" container=5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4 type=CONTAINER_CREATED_EVENT May 13 23:50:47.530179 containerd[1504]: time="2025-05-13T23:50:47.530169228Z" level=warning msg="container event discarded" container=5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4 type=CONTAINER_STARTED_EVENT May 13 23:50:49.848314 containerd[1504]: time="2025-05-13T23:50:49.847987634Z" level=warning msg="container event discarded" container=c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878 type=CONTAINER_CREATED_EVENT May 13 23:50:49.921122 containerd[1504]: time="2025-05-13T23:50:49.920979442Z" level=warning msg="container event discarded" container=c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878 type=CONTAINER_STARTED_EVENT May 13 23:50:51.401875 systemd[1]: Started sshd@13-138.199.236.81:22-139.178.89.65:59386.service - OpenSSH per-connection server daemon (139.178.89.65:59386). 
May 13 23:50:52.429046 sshd[7114]: Accepted publickey for core from 139.178.89.65 port 59386 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:52.432581 sshd-session[7114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:52.440534 systemd-logind[1474]: New session 13 of user core. May 13 23:50:52.446490 systemd[1]: Started session-13.scope - Session 13 of User core. May 13 23:50:53.248334 sshd[7116]: Connection closed by 139.178.89.65 port 59386 May 13 23:50:53.249386 sshd-session[7114]: pam_unix(sshd:session): session closed for user core May 13 23:50:53.255509 systemd[1]: sshd@13-138.199.236.81:22-139.178.89.65:59386.service: Deactivated successfully. May 13 23:50:53.258443 systemd[1]: session-13.scope: Deactivated successfully. May 13 23:50:53.259417 systemd-logind[1474]: Session 13 logged out. Waiting for processes to exit. May 13 23:50:53.260888 systemd-logind[1474]: Removed session 13. May 13 23:50:53.373327 systemd[1]: Started sshd@14-138.199.236.81:22-139.178.89.65:59388.service - OpenSSH per-connection server daemon (139.178.89.65:59388). May 13 23:50:54.384779 sshd[7128]: Accepted publickey for core from 139.178.89.65 port 59388 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:54.388814 sshd-session[7128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:54.396263 systemd-logind[1474]: New session 14 of user core. May 13 23:50:54.404335 systemd[1]: Started session-14.scope - Session 14 of User core. May 13 23:50:54.568627 containerd[1504]: time="2025-05-13T23:50:54.568521292Z" level=warning msg="container event discarded" container=496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653 type=CONTAINER_CREATED_EVENT May 13 23:50:54.568627 containerd[1504]: time="2025-05-13T23:50:54.568598612Z" level=warning msg="container event discarded" container=496850ebcf933a2410bada152913948c89160e296b24cffb08adddb545bbf653 type=CONTAINER_STARTED_EVENT May 13 23:50:54.677735 containerd[1504]: time="2025-05-13T23:50:54.677491139Z" level=warning msg="container event discarded" container=1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca type=CONTAINER_CREATED_EVENT May 13 23:50:54.677735 containerd[1504]: time="2025-05-13T23:50:54.677565779Z" level=warning msg="container event discarded" container=1c0636db1bac6764ce578a6f4f8f3015abd96cfa2620aa14d4981ae6d76be7ca type=CONTAINER_STARTED_EVENT May 13 23:50:55.288614 sshd[7130]: Connection closed by 139.178.89.65 port 59388 May 13 23:50:55.289283 sshd-session[7128]: pam_unix(sshd:session): session closed for user core May 13 23:50:55.296994 systemd[1]: sshd@14-138.199.236.81:22-139.178.89.65:59388.service: Deactivated successfully. May 13 23:50:55.301726 systemd[1]: session-14.scope: Deactivated successfully. May 13 23:50:55.303046 systemd-logind[1474]: Session 14 logged out. Waiting for processes to exit. May 13 23:50:55.304867 systemd-logind[1474]: Removed session 14. May 13 23:50:55.459565 systemd[1]: Started sshd@15-138.199.236.81:22-139.178.89.65:59392.service - OpenSSH per-connection server daemon (139.178.89.65:59392). 
May 13 23:50:55.574342 containerd[1504]: time="2025-05-13T23:50:55.573921411Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"379abc032b9adb0e47abced4f5ea2ab73ed79e2201fcd12e3ece77272edfa83f\" pid:7155 exited_at:{seconds:1747180255 nanos:573596771}" May 13 23:50:56.459145 sshd[7140]: Accepted publickey for core from 139.178.89.65 port 59392 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:56.460861 sshd-session[7140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:56.473332 systemd-logind[1474]: New session 15 of user core. May 13 23:50:56.478450 systemd[1]: Started session-15.scope - Session 15 of User core. May 13 23:50:56.995156 containerd[1504]: time="2025-05-13T23:50:56.995060376Z" level=warning msg="container event discarded" container=b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41 type=CONTAINER_CREATED_EVENT May 13 23:50:57.003320 containerd[1504]: time="2025-05-13T23:50:57.002599800Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"371c9a9262881151360b1612ff790ce7c36072729aeb6b7081bfce82cb9c07db\" pid:7177 exited_at:{seconds:1747180257 nanos:1893562}" May 13 23:50:57.092777 containerd[1504]: time="2025-05-13T23:50:57.092697924Z" level=warning msg="container event discarded" container=b7254936ca66bd099845438ea9155603d4f7b625bdd75802f72a2409e5194d41 type=CONTAINER_STARTED_EVENT May 13 23:50:58.304397 sshd[7164]: Connection closed by 139.178.89.65 port 59392 May 13 23:50:58.305041 sshd-session[7140]: pam_unix(sshd:session): session closed for user core May 13 23:50:58.312243 systemd-logind[1474]: Session 15 logged out. Waiting for processes to exit. May 13 23:50:58.312642 systemd[1]: sshd@15-138.199.236.81:22-139.178.89.65:59392.service: Deactivated successfully. May 13 23:50:58.314903 systemd[1]: session-15.scope: Deactivated successfully. May 13 23:50:58.316899 systemd-logind[1474]: Removed session 15. May 13 23:50:58.482488 systemd[1]: Started sshd@16-138.199.236.81:22-139.178.89.65:32860.service - OpenSSH per-connection server daemon (139.178.89.65:32860). 
May 13 23:50:58.573900 containerd[1504]: time="2025-05-13T23:50:58.573702651Z" level=warning msg="container event discarded" container=a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2 type=CONTAINER_CREATED_EVENT May 13 23:50:58.674024 containerd[1504]: time="2025-05-13T23:50:58.673932991Z" level=warning msg="container event discarded" container=a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2 type=CONTAINER_STARTED_EVENT May 13 23:50:58.839394 containerd[1504]: time="2025-05-13T23:50:58.839255389Z" level=warning msg="container event discarded" container=a682e9411b2514ed00478433b454f1249bf0614a839c548d9917650ca5fcb2f2 type=CONTAINER_STOPPED_EVENT May 13 23:50:58.979758 containerd[1504]: time="2025-05-13T23:50:58.979712802Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"e820b44d0ed337895d933f8dd904ef03d3d7c8a118976f6f360a7adce075ba65\" pid:7220 exited_at:{seconds:1747180258 nanos:979458642}" May 13 23:50:59.489890 sshd[7206]: Accepted publickey for core from 139.178.89.65 port 32860 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:59.492036 sshd-session[7206]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:59.497736 systemd-logind[1474]: New session 16 of user core. May 13 23:50:59.503270 systemd[1]: Started session-16.scope - Session 16 of User core. May 13 23:51:00.399554 sshd[7229]: Connection closed by 139.178.89.65 port 32860 May 13 23:51:00.400043 sshd-session[7206]: pam_unix(sshd:session): session closed for user core May 13 23:51:00.405516 systemd[1]: sshd@16-138.199.236.81:22-139.178.89.65:32860.service: Deactivated successfully. May 13 23:51:00.408242 systemd[1]: session-16.scope: Deactivated successfully. May 13 23:51:00.409433 systemd-logind[1474]: Session 16 logged out. Waiting for processes to exit. May 13 23:51:00.411166 systemd-logind[1474]: Removed session 16. May 13 23:51:00.573996 systemd[1]: Started sshd@17-138.199.236.81:22-139.178.89.65:32866.service - OpenSSH per-connection server daemon (139.178.89.65:32866). May 13 23:51:01.573477 sshd[7239]: Accepted publickey for core from 139.178.89.65 port 32866 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:51:01.575488 sshd-session[7239]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:51:01.584561 systemd-logind[1474]: New session 17 of user core. May 13 23:51:01.591372 systemd[1]: Started session-17.scope - Session 17 of User core. May 13 23:51:02.337148 sshd[7241]: Connection closed by 139.178.89.65 port 32866 May 13 23:51:02.337753 sshd-session[7239]: pam_unix(sshd:session): session closed for user core May 13 23:51:02.343658 systemd[1]: sshd@17-138.199.236.81:22-139.178.89.65:32866.service: Deactivated successfully. May 13 23:51:02.346514 systemd[1]: session-17.scope: Deactivated successfully. May 13 23:51:02.347921 systemd-logind[1474]: Session 17 logged out. Waiting for processes to exit. May 13 23:51:02.349210 systemd-logind[1474]: Removed session 17. 
May 13 23:51:03.746983 containerd[1504]: time="2025-05-13T23:51:03.746870609Z" level=warning msg="container event discarded" container=bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00 type=CONTAINER_CREATED_EVENT May 13 23:51:03.837341 containerd[1504]: time="2025-05-13T23:51:03.837277926Z" level=warning msg="container event discarded" container=bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00 type=CONTAINER_STARTED_EVENT May 13 23:51:04.524445 containerd[1504]: time="2025-05-13T23:51:04.523953219Z" level=warning msg="container event discarded" container=bf0c60f823038799f0584b6a6a9d4193e6c01546aad1620101d014f21ccd3c00 type=CONTAINER_STOPPED_EVENT May 13 23:51:07.520094 systemd[1]: Started sshd@18-138.199.236.81:22-139.178.89.65:51482.service - OpenSSH per-connection server daemon (139.178.89.65:51482). May 13 23:51:08.554094 sshd[7255]: Accepted publickey for core from 139.178.89.65 port 51482 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:51:08.556366 sshd-session[7255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:51:08.562016 systemd-logind[1474]: New session 18 of user core. May 13 23:51:08.567382 systemd[1]: Started session-18.scope - Session 18 of User core. May 13 23:51:09.356535 sshd[7257]: Connection closed by 139.178.89.65 port 51482 May 13 23:51:09.356340 sshd-session[7255]: pam_unix(sshd:session): session closed for user core May 13 23:51:09.362341 systemd-logind[1474]: Session 18 logged out. Waiting for processes to exit. May 13 23:51:09.362656 systemd[1]: sshd@18-138.199.236.81:22-139.178.89.65:51482.service: Deactivated successfully. May 13 23:51:09.366672 systemd[1]: session-18.scope: Deactivated successfully. May 13 23:51:09.368802 systemd-logind[1474]: Removed session 18. May 13 23:51:12.120334 containerd[1504]: time="2025-05-13T23:51:12.120203317Z" level=warning msg="container event discarded" container=e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3 type=CONTAINER_CREATED_EVENT May 13 23:51:12.197867 containerd[1504]: time="2025-05-13T23:51:12.197738256Z" level=warning msg="container event discarded" container=e193bbed7bbb0a23b444d7a11f2712d80da8137f814562f65e3ef623277d05d3 type=CONTAINER_STARTED_EVENT May 13 23:51:14.536852 systemd[1]: Started sshd@19-138.199.236.81:22-139.178.89.65:51496.service - OpenSSH per-connection server daemon (139.178.89.65:51496). May 13 23:51:15.549036 sshd[7269]: Accepted publickey for core from 139.178.89.65 port 51496 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:51:15.551870 sshd-session[7269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:51:15.558641 systemd-logind[1474]: New session 19 of user core. May 13 23:51:15.565427 systemd[1]: Started session-19.scope - Session 19 of User core. May 13 23:51:16.394118 sshd[7271]: Connection closed by 139.178.89.65 port 51496 May 13 23:51:16.394453 sshd-session[7269]: pam_unix(sshd:session): session closed for user core May 13 23:51:16.402183 systemd[1]: sshd@19-138.199.236.81:22-139.178.89.65:51496.service: Deactivated successfully. May 13 23:51:16.405781 systemd[1]: session-19.scope: Deactivated successfully. May 13 23:51:16.407649 systemd-logind[1474]: Session 19 logged out. Waiting for processes to exit. May 13 23:51:16.408867 systemd-logind[1474]: Removed session 19. 
May 13 23:51:16.711013 containerd[1504]: time="2025-05-13T23:51:16.710455406Z" level=warning msg="container event discarded" container=2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e type=CONTAINER_CREATED_EVENT May 13 23:51:16.711013 containerd[1504]: time="2025-05-13T23:51:16.710692606Z" level=warning msg="container event discarded" container=2f6034c548763f7e9da17be533856a228b3522c35921c8f9922da047adce896e type=CONTAINER_STARTED_EVENT May 13 23:51:18.034578 containerd[1504]: time="2025-05-13T23:51:18.034508381Z" level=warning msg="container event discarded" container=e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9 type=CONTAINER_CREATED_EVENT May 13 23:51:18.034578 containerd[1504]: time="2025-05-13T23:51:18.034568381Z" level=warning msg="container event discarded" container=e959ef0c8ae745e6c8a4f95fd5f9ad55996e4d71c5b028639020b789ac0dddb9 type=CONTAINER_STARTED_EVENT May 13 23:51:18.090734 containerd[1504]: time="2025-05-13T23:51:18.090666767Z" level=warning msg="container event discarded" container=572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035 type=CONTAINER_CREATED_EVENT May 13 23:51:18.090734 containerd[1504]: time="2025-05-13T23:51:18.090728087Z" level=warning msg="container event discarded" container=572c5913148713dfaef89b629b3256e06452dc430c552caeb9496d32638fe035 type=CONTAINER_STARTED_EVENT May 13 23:51:18.133300 containerd[1504]: time="2025-05-13T23:51:18.133127825Z" level=warning msg="container event discarded" container=975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2 type=CONTAINER_CREATED_EVENT May 13 23:51:18.133300 containerd[1504]: time="2025-05-13T23:51:18.133204465Z" level=warning msg="container event discarded" container=975a2de5d23629fcb7d6a7b636110996342cac32d2a7890c013806722f9602f2 type=CONTAINER_STARTED_EVENT May 13 23:51:18.160826 containerd[1504]: time="2025-05-13T23:51:18.160757919Z" level=warning msg="container event discarded" container=80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002 type=CONTAINER_CREATED_EVENT May 13 23:51:18.160826 containerd[1504]: time="2025-05-13T23:51:18.160819679Z" level=warning msg="container event discarded" container=80c77e9de30dc1368dcf43606fcdcbb906f0be979032cffb3f5f59fbf903a002 type=CONTAINER_STARTED_EVENT May 13 23:51:18.160826 containerd[1504]: time="2025-05-13T23:51:18.160834199Z" level=warning msg="container event discarded" container=b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887 type=CONTAINER_CREATED_EVENT May 13 23:51:18.242033 containerd[1504]: time="2025-05-13T23:51:18.241495967Z" level=warning msg="container event discarded" container=b83c845d67ab5dac0b65d07100383572d8f4b2429af04c48edba1566be9e9887 type=CONTAINER_STARTED_EVENT May 13 23:51:18.731759 containerd[1504]: time="2025-05-13T23:51:18.731681677Z" level=warning msg="container event discarded" container=b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c type=CONTAINER_CREATED_EVENT May 13 23:51:18.731759 containerd[1504]: time="2025-05-13T23:51:18.731751757Z" level=warning msg="container event discarded" container=b4c2331454f00cca5753a5727bcbfe1f9b58c2de51d732e8285043835e2cac0c type=CONTAINER_STARTED_EVENT May 13 23:51:18.762158 containerd[1504]: time="2025-05-13T23:51:18.762056245Z" level=warning msg="container event discarded" container=23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6 type=CONTAINER_CREATED_EVENT May 13 23:51:18.837502 containerd[1504]: time="2025-05-13T23:51:18.837405505Z" level=warning msg="container event 
discarded" container=23a5de2726bad04721fe406cc5ca0122144be612fd5c3075bc187f6e83bdebf6 type=CONTAINER_STARTED_EVENT May 13 23:51:20.763466 containerd[1504]: time="2025-05-13T23:51:20.763344729Z" level=warning msg="container event discarded" container=7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f type=CONTAINER_CREATED_EVENT May 13 23:51:20.763466 containerd[1504]: time="2025-05-13T23:51:20.763455169Z" level=warning msg="container event discarded" container=7af002d48b9a8c6908edf744019456d23400407cec425abf1c6fceb6273a524f type=CONTAINER_STARTED_EVENT May 13 23:51:23.511320 containerd[1504]: time="2025-05-13T23:51:23.511231532Z" level=warning msg="container event discarded" container=17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e type=CONTAINER_CREATED_EVENT May 13 23:51:23.611766 containerd[1504]: time="2025-05-13T23:51:23.611579208Z" level=warning msg="container event discarded" container=17721c4c9281f270e199ac2b2611fe80db238ac614052b90702d3b5951fcf23e type=CONTAINER_STARTED_EVENT May 13 23:51:26.212740 containerd[1504]: time="2025-05-13T23:51:26.212607871Z" level=warning msg="container event discarded" container=0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3 type=CONTAINER_CREATED_EVENT May 13 23:51:26.322480 containerd[1504]: time="2025-05-13T23:51:26.322405042Z" level=warning msg="container event discarded" container=0168eaf6dcba16060723c6087bf71b32abd4cba6cee295b8898a720ca373a2c3 type=CONTAINER_STARTED_EVENT May 13 23:51:26.583311 containerd[1504]: time="2025-05-13T23:51:26.583214883Z" level=warning msg="container event discarded" container=199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8 type=CONTAINER_CREATED_EVENT May 13 23:51:26.685671 containerd[1504]: time="2025-05-13T23:51:26.685555633Z" level=warning msg="container event discarded" container=199bb05f8ad8a2eb332670173adb7ce3d73cfa95c632b14cf273de4c253053c8 type=CONTAINER_STARTED_EVENT May 13 23:51:26.999965 containerd[1504]: time="2025-05-13T23:51:26.999906143Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70c043295a87f409579ec5b8aac78376e11f9fcb4ba5e8558445b0125e618078\" id:\"c2631c47b35e213601228596eb80f4197a2425fbfa031daf5f2d1fe08e52eedb\" pid:7296 exited_at:{seconds:1747180286 nanos:999530904}" May 13 23:51:28.286526 containerd[1504]: time="2025-05-13T23:51:28.286431821Z" level=warning msg="container event discarded" container=1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e type=CONTAINER_CREATED_EVENT May 13 23:51:28.384447 containerd[1504]: time="2025-05-13T23:51:28.384349420Z" level=warning msg="container event discarded" container=1902b068506d018cfdef1adee0db7dffbfd62d119b4c31c4593644e79818155e type=CONTAINER_STARTED_EVENT May 13 23:51:28.693031 containerd[1504]: time="2025-05-13T23:51:28.692843900Z" level=warning msg="container event discarded" container=e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023 type=CONTAINER_CREATED_EVENT May 13 23:51:28.780760 containerd[1504]: time="2025-05-13T23:51:28.780639084Z" level=warning msg="container event discarded" container=e968bab96892f11323fab93e7ae290ee604d8c00357230fbb93b8df91a72f023 type=CONTAINER_STARTED_EVENT May 13 23:51:28.994760 containerd[1504]: time="2025-05-13T23:51:28.994539677Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c512fe91b5205a967be0dff815519966f0a72836267f9cf198c6fcd62c35ec0\" id:\"df229697b77f1c8d1c994331efe109e4fcfe48b831bee387a3e8789643d8c155\" pid:7318 exited_at:{seconds:1747180288 nanos:994173678}" May 
13 23:51:31.766579 containerd[1504]: time="2025-05-13T23:51:31.766466733Z" level=warning msg="container event discarded" container=5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1 type=CONTAINER_CREATED_EVENT May 13 23:51:31.849128 containerd[1504]: time="2025-05-13T23:51:31.849032088Z" level=warning msg="container event discarded" container=5b4a4ab6d900ba94645cadcf9c49c09d9c6f52e28bc6c7cdf76734c4c7743cb1 type=CONTAINER_STARTED_EVENT May 13 23:51:32.222733 kubelet[2780]: E0513 23:51:32.222229 2780 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:48132->10.0.0.2:2379: read: connection timed out" May 13 23:51:32.231186 systemd[1]: cri-containerd-152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55.scope: Deactivated successfully. May 13 23:51:32.232346 systemd[1]: cri-containerd-152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55.scope: Consumed 4.667s CPU time, 27.4M memory peak, 4M read from disk. May 13 23:51:32.235781 containerd[1504]: time="2025-05-13T23:51:32.235744966Z" level=info msg="received exit event container_id:\"152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55\" id:\"152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55\" pid:2627 exit_status:1 exited_at:{seconds:1747180292 nanos:233598371}" May 13 23:51:32.236293 containerd[1504]: time="2025-05-13T23:51:32.236057565Z" level=info msg="TaskExit event in podsandbox handler container_id:\"152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55\" id:\"152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55\" pid:2627 exit_status:1 exited_at:{seconds:1747180292 nanos:233598371}" May 13 23:51:32.263705 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55-rootfs.mount: Deactivated successfully. May 13 23:51:32.393515 systemd[1]: cri-containerd-c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878.scope: Deactivated successfully. May 13 23:51:32.393973 systemd[1]: cri-containerd-c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878.scope: Consumed 8.627s CPU time, 43.8M memory peak. May 13 23:51:32.398303 containerd[1504]: time="2025-05-13T23:51:32.397953122Z" level=info msg="received exit event container_id:\"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\" id:\"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\" pid:3130 exit_status:1 exited_at:{seconds:1747180292 nanos:397354043}" May 13 23:51:32.398303 containerd[1504]: time="2025-05-13T23:51:32.398250001Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\" id:\"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\" pid:3130 exit_status:1 exited_at:{seconds:1747180292 nanos:397354043}" May 13 23:51:32.430330 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878-rootfs.mount: Deactivated successfully. May 13 23:51:32.714785 systemd[1]: cri-containerd-33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9.scope: Deactivated successfully. May 13 23:51:32.715175 systemd[1]: cri-containerd-33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9.scope: Consumed 6.217s CPU time, 65.9M memory peak, 3.9M read from disk. 
May 13 23:51:32.722642 containerd[1504]: time="2025-05-13T23:51:32.721995874Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9\" id:\"33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9\" pid:2644 exit_status:1 exited_at:{seconds:1747180292 nanos:720801357}" May 13 23:51:32.727847 containerd[1504]: time="2025-05-13T23:51:32.727588981Z" level=info msg="received exit event container_id:\"33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9\" id:\"33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9\" pid:2644 exit_status:1 exited_at:{seconds:1747180292 nanos:720801357}" May 13 23:51:32.755438 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9-rootfs.mount: Deactivated successfully. May 13 23:51:32.767661 kubelet[2780]: I0513 23:51:32.767404 2780 scope.go:117] "RemoveContainer" containerID="c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878" May 13 23:51:32.769942 kubelet[2780]: I0513 23:51:32.769478 2780 scope.go:117] "RemoveContainer" containerID="152fdcc8426b4927e0486f67eed501329ff0b77674168c602dd02bb15f2d8f55" May 13 23:51:32.772830 containerd[1504]: time="2025-05-13T23:51:32.772153030Z" level=info msg="CreateContainer within sandbox \"5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" May 13 23:51:32.776142 containerd[1504]: time="2025-05-13T23:51:32.776017100Z" level=info msg="CreateContainer within sandbox \"db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" May 13 23:51:32.787529 containerd[1504]: time="2025-05-13T23:51:32.784849958Z" level=info msg="Container cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:32.790685 containerd[1504]: time="2025-05-13T23:51:32.790643024Z" level=info msg="Container b568fac85887f8ff41549675602fecf670fef340abe17b70d8c0aab48ef84c59: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:32.797763 containerd[1504]: time="2025-05-13T23:51:32.797687086Z" level=info msg="CreateContainer within sandbox \"5d0ffeaab52566e7db58f605bce7024eff088e7f2b3e7eddedec93db5aacd7a4\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5\"" May 13 23:51:32.799188 containerd[1504]: time="2025-05-13T23:51:32.798483404Z" level=info msg="StartContainer for \"cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5\"" May 13 23:51:32.799546 containerd[1504]: time="2025-05-13T23:51:32.799520761Z" level=info msg="connecting to shim cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5" address="unix:///run/containerd/s/cd343983996a2457f24e26f8ba7bb8abc41075cd91cae3510f15ae5fd8fc4ea4" protocol=ttrpc version=3 May 13 23:51:32.800664 containerd[1504]: time="2025-05-13T23:51:32.800615039Z" level=info msg="CreateContainer within sandbox \"db3b737e89b7f17d83d874b3ee74520d1f351a648c95a706901194847310610c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"b568fac85887f8ff41549675602fecf670fef340abe17b70d8c0aab48ef84c59\"" May 13 23:51:32.801313 containerd[1504]: time="2025-05-13T23:51:32.801231117Z" level=info msg="StartContainer for \"b568fac85887f8ff41549675602fecf670fef340abe17b70d8c0aab48ef84c59\"" May 
13 23:51:32.803652 containerd[1504]: time="2025-05-13T23:51:32.803610511Z" level=info msg="connecting to shim b568fac85887f8ff41549675602fecf670fef340abe17b70d8c0aab48ef84c59" address="unix:///run/containerd/s/93612302044af4e350d606a86221cc76236bdbf5e8fbc314ae7dac098529ca52" protocol=ttrpc version=3 May 13 23:51:32.829416 systemd[1]: Started cri-containerd-b568fac85887f8ff41549675602fecf670fef340abe17b70d8c0aab48ef84c59.scope - libcontainer container b568fac85887f8ff41549675602fecf670fef340abe17b70d8c0aab48ef84c59. May 13 23:51:32.833015 systemd[1]: Started cri-containerd-cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5.scope - libcontainer container cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5. May 13 23:51:32.892475 containerd[1504]: time="2025-05-13T23:51:32.891644812Z" level=info msg="StartContainer for \"cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5\" returns successfully" May 13 23:51:32.912111 containerd[1504]: time="2025-05-13T23:51:32.912049401Z" level=info msg="StartContainer for \"b568fac85887f8ff41549675602fecf670fef340abe17b70d8c0aab48ef84c59\" returns successfully" May 13 23:51:33.778402 kubelet[2780]: I0513 23:51:33.778148 2780 scope.go:117] "RemoveContainer" containerID="33d475f3b90eee7c414c427d9beb39f3b541908f8381fc27b25110ccb16c84e9" May 13 23:51:33.783708 containerd[1504]: time="2025-05-13T23:51:33.782886907Z" level=info msg="CreateContainer within sandbox \"d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" May 13 23:51:33.798970 containerd[1504]: time="2025-05-13T23:51:33.798886667Z" level=info msg="Container 2ac998ae530e892861c48eb2475c6065da90a326faa311027b40410cee278295: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:33.811682 containerd[1504]: time="2025-05-13T23:51:33.811445956Z" level=info msg="CreateContainer within sandbox \"d05c4f82ed799321aaf1608ef36f8f707e728b40d6b65935e37253dacdc1bf51\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"2ac998ae530e892861c48eb2475c6065da90a326faa311027b40410cee278295\"" May 13 23:51:33.814102 containerd[1504]: time="2025-05-13T23:51:33.812754832Z" level=info msg="StartContainer for \"2ac998ae530e892861c48eb2475c6065da90a326faa311027b40410cee278295\"" May 13 23:51:33.814894 containerd[1504]: time="2025-05-13T23:51:33.814854147Z" level=info msg="connecting to shim 2ac998ae530e892861c48eb2475c6065da90a326faa311027b40410cee278295" address="unix:///run/containerd/s/2c03b64b8fd65efaa8d716073da9eb54b5a1000a6652c713832344d38e8704c1" protocol=ttrpc version=3 May 13 23:51:33.849803 systemd[1]: Started cri-containerd-2ac998ae530e892861c48eb2475c6065da90a326faa311027b40410cee278295.scope - libcontainer container 2ac998ae530e892861c48eb2475c6065da90a326faa311027b40410cee278295. May 13 23:51:33.913404 containerd[1504]: time="2025-05-13T23:51:33.913274581Z" level=info msg="StartContainer for \"2ac998ae530e892861c48eb2475c6065da90a326faa311027b40410cee278295\" returns successfully" May 13 23:51:36.258328 systemd[1]: cri-containerd-cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5.scope: Deactivated successfully. 
May 13 23:51:36.260467 containerd[1504]: time="2025-05-13T23:51:36.258485700Z" level=info msg="received exit event container_id:\"cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5\" id:\"cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5\" pid:7393 exit_status:1 exited_at:{seconds:1747180296 nanos:258019902}" May 13 23:51:36.260467 containerd[1504]: time="2025-05-13T23:51:36.258576140Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5\" id:\"cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5\" pid:7393 exit_status:1 exited_at:{seconds:1747180296 nanos:258019902}" May 13 23:51:36.291932 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5-rootfs.mount: Deactivated successfully. May 13 23:51:36.573747 kubelet[2780]: E0513 23:51:36.572866 2780 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:47910->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4284-0-0-n-732e99817a.183f3b3696cf1a09 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4284-0-0-n-732e99817a,UID:9ce869bf71179ce9ccb9f6662ffc6652,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-732e99817a,},FirstTimestamp:2025-05-13 23:51:26.087469577 +0000 UTC m=+344.905735297,LastTimestamp:2025-05-13 23:51:26.087469577 +0000 UTC m=+344.905735297,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-732e99817a,}" May 13 23:51:36.800284 kubelet[2780]: I0513 23:51:36.799575 2780 scope.go:117] "RemoveContainer" containerID="c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878" May 13 23:51:36.800284 kubelet[2780]: I0513 23:51:36.800030 2780 scope.go:117] "RemoveContainer" containerID="cadae976ba9a30e0f645578e89d3b4327d698a7c696a60ee737b16a241f4e6c5" May 13 23:51:36.800763 kubelet[2780]: E0513 23:51:36.800736 2780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-789496d6f5-w8hcx_tigera-operator(458eda0b-c132-4a02-9169-ebb10765d15d)\"" pod="tigera-operator/tigera-operator-789496d6f5-w8hcx" podUID="458eda0b-c132-4a02-9169-ebb10765d15d" May 13 23:51:36.802758 containerd[1504]: time="2025-05-13T23:51:36.802721451Z" level=info msg="RemoveContainer for \"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\"" May 13 23:51:36.808510 containerd[1504]: time="2025-05-13T23:51:36.808214477Z" level=info msg="RemoveContainer for \"c03f9142e10ccef79636508622dc26eddb534f6f56645542681b8fb057211878\" returns successfully"